How to structure TensorFlow model code?

It’s hard for me to figure out how to structure TensorFlow model code. I would like to structure it as a class for easy reuse in the future. In addition, my current structure is messy: the TensorBoard graph has several “models” inside it.

I currently have the following:

import tensorflow as tf
import os

from utils import Utils as utils

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'

class Neural_Network:
    # Neural Network Setup
    num_of_epoch = 50

    n_nodes_hl1 = 500
    n_nodes_hl2 = 500
    n_nodes_hl3 = 500

    def __init__(self):
        self.num_of_classes = utils.get_num_of_classes()
        self.num_of_words = utils.get_num_of_words()

        # placeholders
        self.x = tf.placeholder(tf.float32, [None, self.num_of_words])
        self.y = tf.placeholder(tf.int32, [None, self.num_of_classes])

        with tf.name_scope("model"):
            self.h1_layer = tf.layers.dense(self.x, self.n_nodes_hl1, activation=tf.nn.relu, name="h1")
            self.h2_layer = tf.layers.dense(self.h1_layer, self.n_nodes_hl2, activation=tf.nn.relu, name="h2")
            self.h3_layer = tf.layers.dense(self.h2_layer, self.n_nodes_hl3, activation=tf.nn.relu, name="h3")

            self.logits = tf.layers.dense(self.h3_layer, self.num_of_classes, name="output")

    def predict(self):
        return self.logits

    def make_prediction(self, query):
        result = None

        with tf.Session() as sess:
            saver = tf.train.import_meta_graph('saved_models/testing.meta')
            saver.restore(sess, 'saved_models/testing')

            sess.run(tf.global_variables_initializer())

            prediction = self.predict()
            prediction = sess.run(prediction, feed_dict={self.x : query})
            prediction = prediction.tolist()
            prediction = tf.nn.softmax(prediction)
            prediction = sess.run(prediction)
            print(prediction)

            return utils.get_label_from_encoding(prediction[0])

    def train(self, data):

        print(len(data['values']))
        print(len(data['labels']))

        prediction = self.predict()

        cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=prediction, labels=self.y))
        optimizer = tf.train.AdamOptimizer().minimize(cost)

        with tf.Session() as sess:
            sess.run(tf.global_variables_initializer())

            writer = tf.summary.FileWriter("mygraph/logs", tf.get_default_graph())

            for epoch in range(self.num_of_epoch):
                optimised, loss = sess.run([optimizer, cost],
                                           feed_dict={self.x: data['values'], self.y: data['labels']})

                if epoch % 1 == 0:
                    print("Completed Training Cycle: " + str(epoch) + " out of " + str(self.num_of_epoch))
                    print("Current Loss: " + str(loss))

                    saver = tf.train.Saver()
                    saver.save(sess, 'saved_models/testing')
                    print("Model saved")

What I found on the Internet is that many people use lower-level code, such as tf.Variable and tf.constant, which makes their code much easier to reuse and share. However, since I'm relatively new to TensorFlow, I would like to stick with the higher-level API first.
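
For context, the lower-level style I keep running into looks roughly like this (just a sketch of a single dense layer built from explicit variables; the sizes and names here are made up by me):

import tensorflow as tf

# Hand-rolled equivalent of tf.layers.dense(x, 500, activation=tf.nn.relu)
x = tf.placeholder(tf.float32, [None, 300], name="x")   # 300 = input width, made up
W1 = tf.Variable(tf.truncated_normal([300, 500], stddev=0.1), name="W1")
b1 = tf.Variable(tf.zeros([500]), name="b1")
h1 = tf.nn.relu(tf.matmul(x, W1) + b1, name="h1")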

Can someone advise me how to structure my code?


Your question is quite broad, but I'll try to answer it. The first thing to understand is that there are two different things at play here (note that this concerns TensorFlow 1, which your code uses; TensorFlow 2 changes the picture considerably):

  • On the one hand there is the Graph, the TensorFlow computation graph, which is made up of
    • tensors (e.g. tf.placeholder, tf.constant, tf.Variable, ...)
    • operations (e.g. tf.add, tf.matmul, ...). The Graph is the thing that actually gets executed: it is what you train, evaluate, save and restore.

Everything that lives in the Graph can be looked up by name (e.g. with tf.get_variable or tf.Graph.get_tensor_by_name).

  • On the other hand there is the code that builds the TensorFlow Graph through its API from a host language such as Python (or C++, Java, ...). How you organize that code is entirely up to you: plain functions, a class, a factory, whatever you prefer; one possible layout is sketched right after this list.
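
For illustration, one way to lay that out (only a sketch with my own naming, not the one true structure: TextClassifier, its arguments and its method names are all mine) is to build the graph exactly once, inside its own tf.Graph, and give every tensor you will need later an explicit name:

    import tensorflow as tf

    class TextClassifier(object):
        def __init__(self, num_of_words, num_of_classes, hidden=500):
            # Each instance owns its own graph, so constructing a second
            # instance never piles extra copies of the model into the
            # default graph.
            self.graph = tf.Graph()
            with self.graph.as_default():
                self.x = tf.placeholder(tf.float32, [None, num_of_words], name="x")
                self.y = tf.placeholder(tf.float32, [None, num_of_classes], name="y")

                with tf.name_scope("model"):
                    h1 = tf.layers.dense(self.x, hidden, activation=tf.nn.relu, name="h1")
                    h2 = tf.layers.dense(h1, hidden, activation=tf.nn.relu, name="h2")
                    self.logits = tf.layers.dense(h2, num_of_classes, name="output")
                self.probs = tf.nn.softmax(self.logits, name="probs")

                with tf.name_scope("train"):
                    self.cost = tf.reduce_mean(
                        tf.nn.softmax_cross_entropy_with_logits(logits=self.logits, labels=self.y))
                    self.optimizer = tf.train.AdamOptimizer().minimize(self.cost)

                self.saver = tf.train.Saver()

        def train(self, values, labels, epochs=50, path='saved_models/testing'):
            with tf.Session(graph=self.graph) as sess:
                sess.run(tf.global_variables_initializer())
                for epoch in range(epochs):
                    _, loss = sess.run([self.optimizer, self.cost],
                                       feed_dict={self.x: values, self.y: labels})
                    print("epoch %d, loss %f" % (epoch, loss))
                self.saver.save(sess, path)

        def predict(self, query, path='saved_models/testing'):
            with tf.Session(graph=self.graph) as sess:
                # restore() brings back the trained weights; do not run the
                # initializer afterwards, or it will overwrite them.
                self.saver.restore(sess, path)
                return sess.run(self.probs, feed_dict={self.x: query})

Because construction happens exactly once per instance, the graph you write out with tf.summary.FileWriter will show a single copy of the model in TensorBoard.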

What does this mean in practice?

Roughly speaking, it means the following.

  • What gets saved to disk and restored later is the TensorFlow Graph together with the values of its variables, not your Python objects. The "model" you care about is the Graph itself; your class is only the code that constructs it.
  • The Python code, in turn, is just a convenient way to build that Graph and push data through it.
    Since the Python API (like the C++ and Java ones) is only a front end to the Graph, your Python objects are merely handles: they hold references to tensors and operations that live in the Graph.

    In particular, when you load a saved TensorFlow Graph (for example with tf.train.import_meta_graph), you get the Graph back, but there is no built-in Python-instance-to-a-TensorFlow-Graph mapping: nothing reconnects the attributes of your class to the restored tensors.

    That is why the example on restoring variables in the documentation does not rely on any Python-instance-connected-to-a-TensorFlow-Graph object; it simply uses plain Python variables (here v1 and v2) that refer to graph tensors and restores them by name:

    # Create some variables.
    v1 = tf.Variable(..., name="v1")
    v2 = tf.Variable(..., name="v2")
    ...
    
    # Add ops to save and restore all the variables.
    saver = tf.train.Saver()
    
    # Later, launch the model, use the saver to restore variables from disk, and
    # do some work with the model.
    with tf.Session() as sess:
      # Restore variables from disk.
      saver.restore(sess, "/tmp/model.ckpt")
      print("Model restored.")
      # Do some work with the model
      ...
    

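The same idea applies to your make_prediction: after restoring you should not call tf.global_variables_initializer() (it overwrites the weights you just restored), and you can fetch the tensors you need by name instead of through Python attributes. A rough sketch, assuming the graph was exported with explicitly named tensors "x:0" and "probs:0" as in the class sketch above:

    import tensorflow as tf

    def predict_from_checkpoint(query, path='saved_models/testing'):
        graph = tf.Graph()
        with graph.as_default():
            # Rebuild the graph structure from the .meta file rather than from
            # Python code, then look the tensors up by name.
            saver = tf.train.import_meta_graph(path + '.meta')
            x = graph.get_tensor_by_name('x:0')
            probs = graph.get_tensor_by_name('probs:0')
        with tf.Session(graph=graph) as sess:
            saver.restore(sess, path)   # brings back the trained weights; no initializer
            return sess.run(probs, feed_dict={x: query})
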
To sum up (and to answer your question directly :)): organize your Python code however is convenient for you; what really matters is how you build and save/restore the TensorFlow Graph.


Hope this helps.


Source: https://habr.com/ru/post/1680470/

