How to convert the output tensor into a one-hot tensor?

I need to calculate the loss of the softmax output against the target. My target is [0, 0, 1] and the output is [0.3, 0.3, 0.4], so for my purposes the prediction is correct. But a cost function like the one below does not account for this kind of accuracy:

self._output = output = tf.nn.softmax(y)
self._cost = cost = tf.reduce_mean(tf.square(output - tf.reshape(self._targets, [-1])))

How can I easily convert an output like [0.3, 0.3, 0.4] to [0, 0, 1] in TensorFlow?

1 answer

A typical loss function for comparing two probability distributions is cross-entropy. TensorFlow provides tf.nn.softmax_cross_entropy_with_logits, which implements this loss. Note that it expects the unnormalized logits y (not the softmax output) and returns one loss value per example, so you should take the mean over the batch:

self._cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(
        logits=y, labels=self._targets))
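To see what that loss computes, here is a minimal NumPy sketch of the same math (softmax followed by cross-entropy against a one-hot target); the function names are mine, not TensorFlow's:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def softmax_cross_entropy(logits, targets):
    # Mean over the batch of -sum(targets * log(softmax(logits)))
    p = softmax(logits)
    return float(np.mean(-np.sum(targets * np.log(p), axis=-1)))

logits = np.array([[1.0, 1.0, 2.0]])   # unnormalized scores
targets = np.array([[0.0, 0.0, 1.0]])  # one-hot target
loss = softmax_cross_entropy(logits, targets)
```

The loss is small when the logit for the target class dominates and grows as probability mass shifts to the wrong classes.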

If you still want to convert a prediction like [0.3, 0.3, 0.4] into a one-hot vector, you can combine tf.nn.top_k with tf.one_hot:

import tensorflow as tf

sess = tf.InteractiveSession()
a = tf.constant([0.3, 0.3, 0.4])
# top_k(a).indices is the index of the largest entry;
# one_hot turns it into a one-hot row of length tf.shape(a)[0]
one_hot_a = tf.one_hot(tf.nn.top_k(a).indices, tf.shape(a)[0])
print(one_hot_a.eval())
# prints [[ 0.  0.  1.]]
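The same argmax-then-one-hot idea, sketched in plain NumPy for clarity (the helper name is mine):

```python
import numpy as np

def to_one_hot(probs):
    # Put a 1 at the position of the largest probability, 0 elsewhere
    probs = np.asarray(probs, dtype=float)
    one_hot = np.zeros_like(probs)
    one_hot[np.argmax(probs)] = 1.0
    return one_hot

print(to_one_hot([0.3, 0.3, 0.4]))  # [0. 0. 1.]
```

Note that np.argmax (like tf.nn.top_k) breaks ties by taking the first maximal index.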

Source: https://habr.com/ru/post/1648535/
