Multi-label classification of images with sparse labels in TensorFlow?

I want to perform a multi-label image classification task for n classes. I have sparse label vectors for each image, and each value of a label vector is currently encoded as follows:

1.0 → Label true / Image belongs to this class

-1.0 → Label false / Image does not belong to this class

0.0 → Missing value / label

For example: V = {1.0, -1.0, 1.0, 0.0}

In this example, the model should learn that the corresponding image belongs to the first and the third class.
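(For concreteness, such a label vector could be held as a plain float array; the variable names below are only illustrative.)

import numpy as np

# One label vector for a single image with n = 4 classes:
#  1.0 -> class present, -1.0 -> class absent, 0.0 -> label missing
V = np.array([1.0, -1.0, 1.0, 0.0], dtype=np.float32)

present = V == 1.0    # classes the image belongs to (here: 1st and 3rd)
absent = V == -1.0    # classes it does not belong to
missing = V == 0.0    # labels with no information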

Currently, the problem is how to handle the missing values/labels. I looked through the existing issues and found this one: tensorflow/skflow#113

Thus, it is possible to perform multi-label classification of images using: tf.nn.sigmoid_cross_entropy_with_logits(logits, targets, name=None)
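(For illustration, a minimal sketch of that call with dense 0/1 multi-label targets; the values are made up, and the keyword-argument form used here belongs to newer TensorFlow releases than the positional form quoted above.)

import tensorflow as tf

# logits: [batch, n_classes] raw scores; targets: [batch, n_classes] in {0, 1}
logits = tf.constant([[2.0, -1.0, 0.5, 0.0]])
targets = tf.constant([[1.0, 0.0, 1.0, 0.0]])

# One independent binary cross-entropy term per class
per_class_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=targets, logits=logits)
loss = tf.reduce_mean(per_class_loss)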

but TensorFlow has such a loss function only for sparse softmax, which is used for mutually exclusive (single-label) classification: tf.nn.sparse_softmax_cross_entropy_with_logits(logits, labels, name=None)
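(For contrast, a sketch of the sparse softmax variant: it expects exactly one integer class index per example, which is why it only fits mutually exclusive classes. Values are made up, keyword-argument form assumed.)

import tensorflow as tf

logits = tf.constant([[2.0, -1.0, 0.5, 0.0]])  # [batch, n_classes]
labels = tf.constant([0])                      # exactly one class index per example

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)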

So, is there something like a sparse sigmoid cross-entropy? (I could not find anything.) Or does anyone have suggestions on how to handle my multi-label classification problem with sparse labels?

3 answers

First of all, I would like to know what you mean by missing data? What is the difference between missing and false in your case?

One way to avoid the problem is to encode each label not as a single scalar but as a small one-hot vector with three states (true, missing, false). The example vector from the question would then become:

V = [(1,0,0), (0,0,1), (1,0,0), (0,1,0)]
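(If that suggestion is read literally, each scalar label is expanded into a one-hot triple over the states true / missing / false; a sketch of that re-encoding, assuming exactly the mapping implied by the example.)

# Hypothetical re-encoding of the question's labels into 3-state one-hot vectors
ENCODING = {
    1.0: (1, 0, 0),   # true
    0.0: (0, 1, 0),   # missing
    -1.0: (0, 0, 1),  # false
}

V = [1.0, -1.0, 1.0, 0.0]
V_encoded = [ENCODING[v] for v in V]
# -> [(1, 0, 0), (0, 0, 1), (1, 0, 0), (0, 1, 0)]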


Ok! In the end I used tf.nn.sigmoid_cross_entropy_with_logits() as the loss and thresholded the resulting per-class probabilities at 0.5 (below 0.5 means false, above 0.5 means true) to decide which classes an image belongs to.
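(A sketch of that thresholding step, assuming the network emits one logit per class; the values are made up.)

import tensorflow as tf

logits = tf.constant([[2.0, -1.0, 0.5, -3.0]])   # [batch, n_classes]
probs = tf.sigmoid(logits)                       # per-class probabilities in (0, 1)
predictions = tf.cast(probs > 0.5, tf.float32)   # > 0.5 -> true (1), otherwise false (0)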


In addition, I use weighted_cross_entropy_with_logits to put more weight on the 1s. In my labels there are roughly 10 times more 0s than 1s, so to compensate for the rare 1s I set pos_weight (= the ratio of 0s to 1s) to 10. The larger pos_weight is, the more a wrong prediction on a positive target costs relative to one on a negative target.

For a target = 1, weighted_cross_entropy = pos_weight * sigmoid_cross_entropy

Weighted cross-entropy with logits is the same as sigmoid cross-entropy with logits, except that an additional weight is applied to all targets with a positive value, i.e. 1.
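(A sketch of that weighted loss with pos_weight = 10; the values are made up, and the keyword names follow newer TensorFlow releases, where the first argument is called labels rather than targets.)

import tensorflow as tf

logits = tf.constant([[2.0, -1.0, 0.5, 0.0]])
targets = tf.constant([[1.0, 0.0, 1.0, 0.0]])

# Positive targets contribute pos_weight times as much to the loss,
# compensating for the labels containing roughly 10x more 0s than 1s.
loss = tf.nn.weighted_cross_entropy_with_logits(
    labels=targets, logits=logits, pos_weight=10.0)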

Theoretically, this should do the job. I am also tuning other parameters to optimize performance. Performance statistics will be added later.
