I am having trouble computing cross entropy in TensorFlow. In particular, I am using the function:
tf.nn.softmax_cross_entropy_with_logits()
With seemingly simple code, I can get it to return zero:
import tensorflow as tf
import numpy as np

sess = tf.InteractiveSession()
a = tf.placeholder(tf.float32, shape=[None, 1])
b = tf.placeholder(tf.float32, shape=[None, 1])
sess.run(tf.global_variables_initializer())
c = tf.nn.softmax_cross_entropy_with_logits(logits=b, labels=a).eval(
    feed_dict={b: np.array([[0.45]]), a: np.array([[0.2]])})
print(c)
returns
0
My understanding of cross entropy is as follows:
H(p,q) = -Σ_x p(x) * log(q(x))
where p(x) is the true probability of event x and q(x) is the predicted probability of event x.
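For comparison, here is how I would compute this by hand with NumPy for the same single event I fed to TensorFlow (a minimal sketch, assuming natural logarithms):

import numpy as np

# Cross entropy contribution of a single event x, following the formula above:
# -p(x) * log(q(x))
p = 0.2   # true probability of the event x
q = 0.45  # predicted probability of the event x
print(-p * np.log(q))  # roughly 0.16, i.e. non-zero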
Therefore, if any two numbers for p(x) and q(x) are plugged in such that
0 < p(x) < 1 and 0 < q(x) < 1,
the cross entropy must be non-zero, as the hand calculation above also shows. I suspect I am using TensorFlow incorrectly. Thanks in advance for any help.