I have read the documentation for both functions, but as far as I can tell, `tf.nn.softmax_cross_entropy_with_logits(logits, labels, dim=-1, name=None)` computes a cross-entropy loss in which `logits` and `labels` must have the same shape.

But for `tf.nn.sparse_softmax_cross_entropy_with_logits`, the shapes of `logits` and `labels` do not match?

Could you give a more detailed example of `tf.nn.sparse_softmax_cross_entropy_with_logits`?
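To make the shape question concrete, this is roughly what I mean (a small sketch with made-up values; I am assuming TensorFlow 2.x with eager execution, in 1.x the tensors would be evaluated in a `Session`):

```python
import tensorflow as tf

# Dense version: logits and labels share the shape [batch_size, num_classes].
logits = tf.constant([[2.0, 0.5, -1.0],
                      [0.1, 1.5,  0.3]])          # shape (2, 3)
onehot_labels = tf.constant([[1.0, 0.0, 0.0],
                             [0.0, 1.0, 0.0]])    # shape (2, 3), same as logits

dense_loss = tf.nn.softmax_cross_entropy_with_logits(labels=onehot_labels,
                                                     logits=logits)

# Sparse version: one integer class index per example, so labels has
# shape [batch_size] while logits keeps shape [batch_size, num_classes].
sparse_labels = tf.constant([0, 1])               # shape (2,)
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=sparse_labels,
                                                             logits=logits)

print(dense_loss.shape, sparse_loss.shape)        # both (2,), one loss per example
```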
The difference is that `tf.nn.softmax_cross_entropy_with_logits` does not force each example into exactly one class: each row of `labels` is a full probability distribution over the classes, so "soft" targets whose probability mass is spread over several classes are allowed.
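A minimal sketch of what such soft labels can look like (the numbers are invented for illustration; assumes TensorFlow 2.x eager execution):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1]])           # shape (1, 3)

# A "soft" target: a full probability distribution over the three classes.
soft_labels = tf.constant([[0.7, 0.2, 0.1]])      # shape (1, 3), rows sum to 1

loss = tf.nn.softmax_cross_entropy_with_logits(labels=soft_labels, logits=logits)
print(loss)                                        # one loss value per example
```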
The `sparse_*` variant, by contrast, is for the case where the classes are mutually exclusive (each example belongs to exactly one class), so the label is a single integer class index rather than a distribution. For example, each CIFAR-10 image carries exactly one label: an image can be a dog or a truck, but not both.
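A rough CIFAR-10-style sketch (the logits are random placeholders and the labels are made up; again assuming TensorFlow 2.x):

```python
import tensorflow as tf

# Hypothetical CIFAR-10-style setup: 10 mutually exclusive classes and
# one integer label per image.
batch_logits = tf.random.normal([4, 10])          # shape [batch_size, num_classes]
int_labels = tf.constant([3, 0, 9, 7])            # shape [batch_size], values in [0, 9]

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=int_labels,
                                                      logits=batch_logits)
print(loss.shape)                                  # (4,) -- one loss per image
```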
Accordingly, the shapes of `logits` and `labels` differ: for `tf.nn.sparse_softmax_cross_entropy_with_logits`, `labels` has shape [batch_size] and holds integer class indices, while `logits` has shape [batch_size, num_classes]; for `tf.nn.softmax_cross_entropy_with_logits`, `labels` must have the same [batch_size, num_classes] shape as `logits`, e.g. a one-hot encoding of those indices.
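If it helps, here is a sketch showing that the two functions agree once the integer labels are one-hot encoded with `tf.one_hot` (assumes TensorFlow 2.x eager execution):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 0.5, -1.0],
                      [0.1, 1.5,  0.3]])
sparse_labels = tf.constant([0, 1])

# tf.one_hot turns the [batch_size] integer labels into the
# [batch_size, num_classes] labels that the dense version expects.
onehot_labels = tf.one_hot(sparse_labels, depth=3)

sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=sparse_labels,
                                                             logits=logits)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(labels=onehot_labels,
                                                     logits=logits)

# Both report the same per-example cross entropy.
print(sparse_loss.numpy(), dense_loss.numpy())
```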
Source: https://habr.com/ru/post/1664560/