TensorFlow: Are my logits in the correct format for the cross-entropy function?

OK, so I'm ready to run the tf.nn.softmax_cross_entropy_with_logits() function in TensorFlow.

I understand that the "logits" should be a tensor of probabilities, each one corresponding to a certain pixel's probability that it is part of an image that will ultimately be a "dog" or a "truck" or whatever... a finite number of things.

These logits would get plugged into this cross-entropy equation (Wikipedia's cross-entropy formula):

H(p, q) = -Σ_x p(x) log q(x)

As I understand it, the logits are plugged into the right-hand side of the equation. That is, they are the q of each x (image). If they were probabilities from 0 to 1... that would make sense to me. But when I run my code and end up with a tensor of logits, I'm not getting probabilities. Instead, I'm getting floats that are both positive and negative:

 -0.07264724 -0.15262917 0.06612295 ..., -0.03235611 0.08587133 0.01897052 0.04655019 -0.20552202 0.08725972 ..., -0.02107313 -0.00567073 0.03241089 0.06872301 -0.20756687 0.01094618 ..., etc 

So my question is... is that right? Do I have to somehow squash all my logits into probabilities between 0 and 1 first?

1 answer

It is important to note that tf.nn.softmax_cross_entropy_with_logits(logits, labels) performs an internal softmax on each row of logits, so that they are interpreted as probabilities before they are fed into the cross-entropy equation.

Therefore, the "logits" do not need to be probabilities (or even true log-probabilities, as the name suggests), because of the internal normalization that happens inside this op.
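For illustration, here is a minimal sketch (assuming the TF 1.x API used in this post) of what that internal normalization does to raw scores like the ones in the question:

 import tensorflow as tf

 # Raw scores can be any real numbers, positive or negative.
 logits = tf.constant([[-0.07264724, -0.15262917, 0.06612295]])

 # tf.nn.softmax maps each row to values in (0, 1) that sum to 1 --
 # the same normalization that the fused op applies internally.
 probs = tf.nn.softmax(logits)

 with tf.Session() as sess:
     print(sess.run(probs))  # one row of probabilities summing to 1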

An alternative way of writing:

 xent = tf.nn.softmax_cross_entropy_with_logits(logits, labels) 

... would be:

 softmax = tf.nn.softmax(logits)                     # normalize the logits to probabilities
 xent = -tf.reduce_sum(labels * tf.log(softmax), 1)  # cross-entropy per row

However, this alternative would be (i) less numerically stable (since the softmax may compute much larger values) and (ii) less efficient (since some redundant computation would happen in the backprop). For real uses, we recommend that you use tf.nn.softmax_cross_entropy_with_logits().
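As a quick check, the following sketch (again assuming the TF 1.x API used here, with made-up logits and one-hot labels) compares the fused op with the manual softmax-plus-cross-entropy version; both should produce the same per-example losses up to floating-point error:

 import tensorflow as tf

 # Made-up raw scores and one-hot labels for two examples, three classes.
 logits = tf.constant([[-0.07, 0.15, 0.06],
                       [ 0.08, -0.20, 0.01]])
 labels = tf.constant([[1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0]])

 # Fused, numerically stable version (recommended).
 xent_fused = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

 # Manual version: explicit softmax followed by the cross-entropy formula.
 softmax = tf.nn.softmax(logits)
 xent_manual = -tf.reduce_sum(labels * tf.log(softmax), 1)

 with tf.Session() as sess:
     fused, manual = sess.run([xent_fused, xent_manual])
     print(fused)   # per-example losses from the fused op
     print(manual)  # should match the fused values up to floating-point error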
