Rank mismatch: the rank of labels (received 2) should equal the rank of logits minus 1 (received 2)

I am building a DNN to predict whether an object is present in an image or not. My network has two hidden layers, and the last layer is as follows:

  # Output layer
  W_fc2 = weight_variable([2048, 1])
  b_fc2 = bias_variable([1])

  y = tf.matmul(h_fc1, W_fc2) + b_fc2
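
weight_variable and bias_variable are assumed to be the usual tutorial-style initializer helpers, roughly:

  import tensorflow as tf

  # Assumed helper definitions (not shown in the original post), in the style
  # of the TensorFlow MNIST tutorial:
  def weight_variable(shape):
      return tf.Variable(tf.truncated_normal(shape, stddev=0.1))

  def bias_variable(shape):
      return tf.Variable(tf.constant(0.1, shape=shape))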

Then I have a placeholder for labels:

y_ = tf.placeholder(tf.float32, [None, 1], 'Output')

I train in batches (which is why the first dimension of the 'Output' placeholder's shape is None).

I use the following loss function:

cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
    y[:, :1], y_[:, :1], name='xentropy')
loss = tf.reduce_mean(cross_entropy, name='xentropy_mean')
predict_hand = tf.greater(y, 0.5)
correct_prediction = tf.equal(tf.to_float(predict_hand), y_)
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

But at runtime, I got the following error:

Rank mismatch: the rank of labels (received 2) should be equal to the rank of logits minus 1 (received 2).
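
The error can be reproduced from the static shapes alone (a hypothetical minimal snippet, assuming the two tensors are passed as logits and labels):

  import tensorflow as tf

  logits = tf.placeholder(tf.float32, [None, 1])
  labels = tf.placeholder(tf.int64, [None, 1])

  # This raises the "Rank mismatch" ValueError reported above, because the
  # labels have rank 2 while the op expects rank of logits minus 1 (here 1).
  tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels)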

I suppose I should reshape the labels layer, but I'm not sure what the op expects. I looked at the documentation, and it says:

logits: Unscaled log probabilities of rank r and shape [d_0, d_1, ..., d_{r-2}, num_classes] and dtype float32 or float64.
labels: Tensor of shape [d_0, d_1, ..., d_{r-2}] and dtype int32 or int64. Each entry in labels must be an index in [0, num_classes).

So should my labels here just be 0s and 1s?

Answer 1:

From the documentation* of tf.nn.sparse_softmax_cross_entropy_with_logits:

" [batch_size, num_classes] [batch_size]. ".

So it expects your labels tensor to have shape [None]. Moreover, a tensor of shape [None, 1] and a tensor of shape [None] are not the same thing.

Example:

>>> import numpy as np
>>> logits = np.array([[11, 22], [33, 44], [55, 66]])
>>> labels = np.array([1, 0, 1])

Here the batch size is 3; the logits for the first example are 11 and 22, and there are 2 classes: 0 and 1.

* https://www.tensorflow.org/versions/r0.11/api_docs/python/nn.html#sparse_softmax_cross_entropy_with_logits
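
Putting that together, a minimal sketch of how the code in the question could be adapted to the sparse op, assuming two classes and reusing h_fc1, weight_variable and bias_variable from the question (the rest is illustrative, not the only possible fix):

  import tensorflow as tf

  # Two output units, one logit per class.
  W_fc2 = weight_variable([2048, 2])
  b_fc2 = bias_variable([2])
  y = tf.matmul(h_fc1, W_fc2) + b_fc2              # logits, shape [None, 2]

  # Integer class labels of shape [None], not [None, 1].
  y_ = tf.placeholder(tf.int64, [None], 'Output')

  cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
      logits=y, labels=y_, name='xentropy')
  loss = tf.reduce_mean(cross_entropy, name='xentropy_mean')

  # Predicted class is the arg-max over the two logits.
  predict_hand = tf.argmax(y, 1)
  correct_prediction = tf.equal(predict_hand, y_)
  accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))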

Answer 2:

Use tf.nn.softmax_cross_entropy_with_logits instead of the sparse_softmax version. That solved it for me.
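
A minimal sketch of that variant, assuming two classes (the placeholder names here are illustrative):

  import tensorflow as tf

  # With the non-sparse op, labels are one-hot and have the same shape as the
  # logits, e.g. [None, 2] for two classes.
  logits = tf.placeholder(tf.float32, [None, 2])   # stand-in for the network output
  labels = tf.placeholder(tf.float32, [None, 2])   # one-hot rows such as [0, 1] or [1, 0]

  cross_entropy = tf.nn.softmax_cross_entropy_with_logits(
      logits=logits, labels=labels, name='xentropy')
  loss = tf.reduce_mean(cross_entropy, name='xentropy_mean')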

Answer 3:

" [batch_size, num_classes] [batch_size]. ".

In many textbooks, including here and here , labels have a size [None,10], and logits have a size [None,10].
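
Those examples pair one-hot labels with tf.nn.softmax_cross_entropy_with_logits; the sparse op expects integer class indices instead. A small numpy illustration of the two conventions, for an assumed 2-class batch of 3 examples:

  import numpy as np

  logits = np.array([[11., 22.], [33., 44.], [55., 66.]])  # shape [3, 2] in both cases

  # For tf.nn.sparse_softmax_cross_entropy_with_logits: class indices, shape [3]
  sparse_labels = np.array([1, 0, 1])

  # For tf.nn.softmax_cross_entropy_with_logits: one-hot rows, shape [3, 2]
  one_hot_labels = np.array([[0, 1], [1, 0], [0, 1]])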


Source: https://habr.com/ru/post/1659454/

