You are getting NaN because at some point your output becomes 0, and the loss then takes log(0), which is -infinity.
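To see how that turns into NaN, here is a minimal sketch (the values are made up for illustration and use the TF 1.x API from the question):

import tensorflow as tf

probs = tf.constant([0.0, 1.0])   # imagine a softmax output that contains an exact zero
labels = tf.constant([0.0, 1.0])  # one-hot labels
loss = -tf.reduce_sum(labels * tf.log(probs))  # 0 * log(0) = 0 * (-inf) = NaN

with tf.Session() as sess:
    print(sess.run(loss))  # nan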
In your case, it is most likely the output of the softmax that contains zeros. One way to avoid this is to clip the output values:
out = tf.clip_by_value(out, 1e-10, 100.0)  # keep every probability strictly above zero
or to add a small constant to it:
out = out + 1e-10
Better still, replace the manual softmax + cross-entropy with sparse_softmax_cross_entropy_with_logits(), which handles these numerical issues internally.
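A sketch of what that replacement might look like, assuming logits holds the raw (pre-softmax) outputs of the network and labels holds integer class indices (these names are placeholders, not from the question):

cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
loss = tf.reduce_mean(cross_entropy)  # log-softmax is computed inside the op in a numerically stable way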
Or, as a simpler fix, add 1e-10 to the output of the softmax:
loss = -tf.reduce_sum(labels*tf.log(tf.nn.softmax(logits) + 1e-10))
Keep in mind that with sparse_softmax_cross_entropy_with_logits() the labels variable holds the numeric value of each label, whereas if you compute the cross-entropy yourself, labels must be the one-hot encoding of those numeric labels.
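For example, if labels holds integer class indices, the one-hot version for the manual formula could be built like this (num_classes is an assumed name for the number of output classes, not from the question):

one_hot_labels = tf.one_hot(labels, depth=num_classes)  # e.g. 2 -> [0, 0, 1, 0, ...]
loss = -tf.reduce_sum(one_hot_labels * tf.log(tf.nn.softmax(logits) + 1e-10))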
Update: I have fixed the answer thanks to @mdaoust's comment. According to him, zeros only matter after the softmax has been applied to the logits, not before.