TensorFlow multi-task deep learning

Has anyone tried multi-task deep learning with TensorFlow? That is, sharing the lower layers while keeping the upper layers separate. A simple illustrative example would help a lot.

1 answer

There is a similar question here; that answer used Keras.

The approach is the same in TensorFlow. The idea is this: we define several network outputs and hence several loss functions (objectives), then tell the optimizer to minimize a combined loss function, usually a linear combination of the individual losses.

[Concept diagram: multi-task deep learning graph]

The diagram is drawn following this reference.
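Before the concrete example, a minimal sketch of this pattern in TF 1.x may help (all names here, such as shared and task_a_logits, are illustrative and not from the original answer): a shared trunk, one head per task, and a weighted sum of the per-task losses as the combined objective.

    import tensorflow as tf

    # Inputs and per-task labels (shapes are placeholders for this sketch)
    x = tf.placeholder(tf.float32, [None, 784])
    y_a = tf.placeholder(tf.float32, [None, 10])  # task A: 10-class labels
    y_b = tf.placeholder(tf.float32, [None, 5])   # task B: 5-class labels

    # Shared lower layers
    shared = tf.layers.dense(x, 256, activation=tf.nn.relu)

    # Separate upper layers: one softmax head per task
    task_a_logits = tf.layers.dense(shared, 10)
    task_b_logits = tf.layers.dense(shared, 5)

    loss_a = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y_a, logits=task_a_logits))
    loss_b = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y_b, logits=task_b_logits))

    # Linear combination of the task losses; the weights are hyperparameters
    total_loss = 1.0 * loss_a + 0.5 * loss_b
    train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(total_loss)

Minimizing total_loss sends gradients from both heads back through the shared layers, which is what makes the lower layers learn features useful for both tasks.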

Say we are training a classifier that predicts the digits in an image, with at most 5 digits per image. Here we define 6 output layers: digit1, digit2, digit3, digit4, digit5, and length. A digit layer should output 0 ~ 9 if the corresponding digit is present, or X (replace it with a real value in practice) if there is no digit at its position. Likewise, length should output 0 ~ 5 if the image contains 0 ~ 5 digits, or X if it contains more than 5.
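One concrete way to encode this (my assumption; the answer does not spell it out) is to give each digit head 11 classes, digits 0 ~ 9 plus one "no digit" class playing the role of X, and the length head 7 classes, lengths 0 ~ 5 plus one "more than 5" class. The logits used in the loss code below could then sit on top of the shared features like this, where features is a hypothetical name for the output of the shared lower layers:

    num_digit_classes = 11   # digits 0-9 plus a "no digit" class (the X above)
    num_length_classes = 7   # lengths 0-5 plus a "more than 5" class (the X above)

    # One softmax head per output, all branching off the shared features
    digit1_logits = tf.layers.dense(features, num_digit_classes)
    digit2_logits = tf.layers.dense(features, num_digit_classes)
    digit3_logits = tf.layers.dense(features, num_digit_classes)
    digit4_logits = tf.layers.dense(features, num_digit_classes)
    digit5_logits = tf.layers.dense(features, num_digit_classes)
    length_logits = tf.layers.dense(features, num_length_classes)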

Now, to train it, we simply add up the cross-entropy losses of all the softmax outputs:

    # Define loss and optimizer: one cross-entropy term per output head.
    # clip_by_value keeps each term inside tf.log's finite range.
    lossLength = tf.log(tf.clip_by_value(tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=true_length, logits=length_logits)),
        1e-37, 1e+37))
    lossDigit1 = tf.log(tf.clip_by_value(tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=true_digit1, logits=digit1_logits)),
        1e-37, 1e+37))
    lossDigit2 = tf.log(tf.clip_by_value(tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=true_digit2, logits=digit2_logits)),
        1e-37, 1e+37))
    lossDigit3 = tf.log(tf.clip_by_value(tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=true_digit3, logits=digit3_logits)),
        1e-37, 1e+37))
    lossDigit4 = tf.log(tf.clip_by_value(tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=true_digit4, logits=digit4_logits)),
        1e-37, 1e+37))
    lossDigit5 = tf.log(tf.clip_by_value(tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=true_digit5, logits=digit5_logits)),
        1e-37, 1e+37))

    # Combined objective: the sum of all six task losses
    cost = (lossLength + lossDigit1 + lossDigit2 +
            lossDigit3 + lossDigit4 + lossDigit5)
    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
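A training step then feeds one batch and runs the optimizer as usual. A sketch, assuming an images placeholder for the input and the true_* label placeholders from the code above (the batch_* variables stand for hypothetical batch data):

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        _, c = sess.run(
            [optimizer, cost],
            feed_dict={images: batch_images,
                       true_length: batch_length,
                       true_digit1: batch_digit1,
                       true_digit2: batch_digit2,
                       true_digit3: batch_digit3,
                       true_digit4: batch_digit4,
                       true_digit5: batch_digit5})
        print("combined loss:", c)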

Source: https://habr.com/ru/post/1239783/

