How to build a multiple-input graph with TensorFlow?

Is it possible to define a TensorFlow graph with multiple inputs? For example, I want to feed the graph two images and one piece of text, each of which is processed by a stack of layers with a fully connected layer at the end. Then there is a node that computes a loss function that takes all three views into account. The goal is to backpropagate through all three networks from that joint loss. Is this possible? Any example / tutorial about this? Thanks in advance!

1 answer

This is completely straightforward. For a single input you have something like:

    import tensorflow as tf

    def build_column(x, input_size):
        # first layer: input_size -> 20
        w = tf.Variable(tf.random_normal([input_size, 20]))
        b = tf.Variable(tf.random_normal([20]))
        processing1 = tf.nn.sigmoid(tf.matmul(x, w) + b)
        # second layer: 20 -> 3
        w = tf.Variable(tf.random_normal([20, 3]))
        b = tf.Variable(tf.random_normal([3]))
        return tf.nn.sigmoid(tf.matmul(processing1, w) + b)

    input1 = tf.placeholder(tf.float32, [None, 2])
    output1 = build_column(input1, 2)  # 2-20-3 network

and you can just add more such columns and combine them however you like:

    input1 = tf.placeholder(tf.float32, [None, 2])
    output1 = build_column(input1, 2)

    input2 = tf.placeholder(tf.float32, [None, 10])
    output2 = build_column(input2, 10)

    input3 = tf.placeholder(tf.float32, [None, 5])
    output3 = build_column(input3, 5)

    whole_model = output1 + output2 + output3  # works since all three outputs have the same size

and you get a network that looks like this:

     2-20-3 \
    10-20-3 --- SUM (dimension-wise)
     5-20-3 /
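
For illustration, here is a minimal sketch (assuming the TF 1.x placeholders defined above and NumPy for dummy data) of feeding all three inputs to the fused graph in a single session.run call:

    import numpy as np

    # dummy batch of 4 examples for each input (shapes match the placeholders above)
    feed = {
        input1: np.random.rand(4, 2).astype(np.float32),
        input2: np.random.rand(4, 10).astype(np.float32),
        input3: np.random.rand(4, 5).astype(np.float32),
    }

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(whole_model, feed_dict=feed))  # one (4, 3) array: the dimension-wise sum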

or produce a single output:

    w1 = tf.Variable(tf.random_normal([3, 1]))
    w2 = tf.Variable(tf.random_normal([3, 1]))
    w3 = tf.Variable(tf.random_normal([3, 1]))
    whole_model = tf.matmul(output1, w1) + tf.matmul(output2, w2) + tf.matmul(output3, w3)

to get:

     2-20-3 \
    10-20-3 --- 1
     5-20-3 /
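
To get the joint loss and backpropagation asked about in the question, it is enough to build any loss on top of whole_model and hand it to an optimizer. A sketch, assuming a label placeholder and plain squared error (any differentiable loss works the same way):

    labels = tf.placeholder(tf.float32, [None, 1])           # assumed target placeholder
    loss = tf.reduce_mean(tf.square(whole_model - labels))   # joint loss over the fused output
    train_step = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

Running train_step in a session (feeding input1, input2, input3, and labels) updates the weights of all three columns in one step, because minimize differentiates the joint loss through every branch of the graph.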

Source: https://habr.com/ru/post/1011841/

