The first time I used TensorFlow on the MNIST dataset, I made a very simple mistake: I forgot to take the mean of my errors before passing them to the optimizer.
In other words, instead of
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=y, labels=y_))
I accidentally used
loss = tf.nn.softmax_cross_entropy_with_logits(logits=y, labels=y_)
I wasn't taking the mean or the sum of the error values, yet the network trained without raising any errors. This made me wonder: is there really a case where someone would need to pass multiple loss values to the optimizer? What happens when I pass a tensor that isn't of shape [1] to minimize()?
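As a point of comparison (not an answer about TensorFlow's internals), here is a minimal NumPy sketch of why the reduction matters for a squared-error loss: the gradient of the *summed* loss is N times the gradient of the *mean* loss, so skipping the reduction effectively rescales the learning rate by the batch size. All values below are made up for illustration.

```python
import numpy as np

N = 4
y = np.array([0.2, 0.5, 0.9, 0.1])   # predictions (made-up values)
t = np.array([0.0, 1.0, 1.0, 0.0])   # targets (made-up values)

# Gradient w.r.t. y of sum((y - t)**2):
grad_sum = 2 * (y - t)
# Gradient w.r.t. y of mean((y - t)**2):
grad_mean = 2 * (y - t) / N

# The two differ only by the constant factor N:
assert np.allclose(grad_sum, N * grad_mean)
```

So even if the optimizer silently accepts a vector-valued loss, the resulting update can differ from the mean-reduced case by a constant scale factor.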
— ejlu