TensorFlow: difference between tf.nn.softmax_cross_entropy_with_logits and tf.nn.sparse_softmax_cross_entropy_with_logits

I have read the documentation for both functions, and as far as I can tell, tf.nn.softmax_cross_entropy_with_logits(logits, labels, dim=-1, name=None) returns the cross-entropy loss, where logits and labels have the same shape.

But for tf.nn.sparse_softmax_cross_entropy_with_logits, the shapes of logits and labels do not match?

Could you give a more detailed example of tf.nn.sparse_softmax_cross_entropy_with_logits?

1 answer

The difference is that tf.nn.softmax_cross_entropy_with_logits does not require each example to be assigned to exactly one class: every row of labels just has to be a valid probability distribution over the classes. From its documentation:

While the classes are mutually exclusive, their probabilities need not be. All that is required is that each row of labels is a valid probability distribution.
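A minimal sketch of this dense-label form (written against the TF 2.x eager API; the batch values are made up for illustration):

```python
import tensorflow as tf

# Unnormalized class scores for a batch of 2 examples over 3 classes.
logits = tf.constant([[2.0, 0.5, -1.0],
                      [0.1, 1.5,  0.2]])

# Dense labels: same shape as logits, one probability distribution
# per row. Soft labels (second row) are allowed.
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 0.9, 0.1]])

loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
print(loss.shape)  # (2,) -- one loss value per example
```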

tf.nn.sparse_softmax_cross_entropy_with_logits, on the other hand, is for hard labels, where each example belongs to exactly one class (hence "sparse"). From its documentation:

Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
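The sparse variant takes the same logits but plain integer class indices as labels (again a sketch with made-up values):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 0.5, -1.0],
                      [0.1, 1.5,  0.2]])

# Sparse labels: shape [batch_size], one index in [0, num_classes) per example.
labels = tf.constant([0, 1])

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels,
                                                      logits=logits)
print(loss.shape)  # (2,) -- one loss value per example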

As for the shapes of logits and labels: for the sparse variant, labels is a vector of integer class indices with shape [batch_size], while logits keeps the shape [batch_size, num_classes], one row of class scores per example.
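To see that the two functions agree on hard labels, convert the sparse indices to one-hot rows with tf.one_hot and compare the results (a sketch assuming the 3-class batch above):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 0.5, -1.0],
                      [0.1, 1.5,  0.2]])
sparse_labels = tf.constant([0, 1])                # shape [batch_size]
dense_labels = tf.one_hot(sparse_labels, depth=3)  # shape [batch_size, num_classes]

sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=dense_labels, logits=logits)

# Both calls compute the same per-example cross entropy.
print(sparse_loss.numpy(), dense_loss.numpy())
```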



