I found a similar problem described here: TensorFlow Cross_entropy NaN Task. Thanks to the author, user1111929:
Computing the cross entropy by hand instead of using tf.nn.softmax_cross_entropy_with_logits, i.e. as
cross_entropy = -tf.reduce_sum(y_ * tf.log(y_conv))
is actually a terrible way to calculate it. In some samples, certain classes can be excluded with certainty after a while, resulting in y_conv = 0 for that class of that sample. This is normally not a problem, since you are not interested in those classes, but with cross_entropy written this way it produces 0 * log(0) for that particular sample/class. Hence the NaN.
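A minimal sketch of the failure mode, assuming TensorFlow 2.x with eager execution (tf.math.log plays the role of TF 1.x's tf.log; the tensor values are made up for illustration):

import tensorflow as tf

# One-hot label: the true class is 0. The network has driven
# class 2's predicted probability to exactly 0.
y_ = tf.constant([[1.0, 0.0, 0.0]])
y_conv = tf.constant([[0.7, 0.3, 0.0]])

# The excluded class contributes 0 * log(0) = 0 * (-inf) = NaN,
# which poisons the whole sum.
cross_entropy = -tf.reduce_sum(y_ * tf.math.log(y_conv))
print(cross_entropy.numpy())  # nan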
Replacing it with
cross_entropy = -tf.reduce_sum(y_ * tf.log(y_conv + 1e-10))
or
cross_entropy = -tf.reduce_sum(y_ * tf.log(tf.clip_by_value(y_conv, 1e-10, 1.0)))
solved the NaN problem.
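For illustration, the same made-up tensors with the clipped version (again a TF 2.x sketch):

import tensorflow as tf

y_ = tf.constant([[1.0, 0.0, 0.0]])
y_conv = tf.constant([[0.7, 0.3, 0.0]])

# Clipping keeps log() away from 0, so the excluded class now
# contributes 0 * log(1e-10) = 0 instead of NaN.
cross_entropy = -tf.reduce_sum(y_ * tf.math.log(tf.clip_by_value(y_conv, 1e-10, 1.0)))
print(cross_entropy.numpy())  # ~0.357, i.e. -log(0.7), finite

If the model still has access to the raw logits, tf.nn.softmax_cross_entropy_with_logits avoids the issue entirely, since it fuses the softmax and the log in a numerically stable way.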