TensorFlow gradients: getting unnecessary 0.0 gradients from tf.gradients

Suppose I have the following variable:

embeddings = tf.Variable(tf.random_uniform(dtype=tf.float32, shape=[self.vocab_size, self.embedding_dim], minval=-0.001, maxval=0.001))

sent_1 = construct_sentence(word_ids_1)

sent_2 = construct_sentence(word_ids_2)

where construct_sentence is a method that builds sentence representations from the placeholders word_ids_1 and word_ids_2.

Suppose I have a loss:

loss = construct_loss(sent_1, sent_2, label)

Now when I try to get gradients using:

gradients_wrt_w = tf.gradients(loss, embeddings)

I get a gradient entry for every row of embeddings, including words that are never touched by construct_sentence or construct_loss (those rows come back as 0.0 instead of being left out).

How can I get gradients only for the embeddings that are actually used?
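For concreteness, here is a minimal, self-contained sketch of the situation (TF1-style API; construct_sentence and construct_loss are replaced by toy stand-ins, and all shapes are made up):

import tensorflow as tf

vocab_size, embedding_dim = 5, 3
embeddings = tf.Variable(tf.random_uniform([vocab_size, embedding_dim], minval=-0.001, maxval=0.001))
word_ids_1 = tf.placeholder(tf.int32, [None])

# Toy stand-in for construct_sentence: sum the word vectors.
sent_1 = tf.reduce_sum(tf.nn.embedding_lookup(embeddings, word_ids_1), axis=0)

# Toy stand-in for construct_loss.
loss = tf.reduce_sum(tf.square(sent_1))

# Gradient w.r.t. the whole variable: densifying the IndexedSlices result
# yields one row per vocabulary entry.
grad = tf.gradients(loss, embeddings)[0]
dense_grad = tf.convert_to_tensor(grad)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(dense_grad, feed_dict={word_ids_1: [0, 2]}))
    # Rows 1, 3 and 4 are all 0.0 -- the unwanted zero gradients.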

Instead of differentiating with respect to the whole embedding variable, differentiate with respect to the vectors you actually use, i.e. the 2D tensor returned by tf.nn.embedding_lookup:

tf.gradients(loss, tf.nn.embedding_lookup(embeddings, word_ids))

This way you only get gradients for the word vectors that participate in the computation.
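Continuing the toy example above, a sketch of that idea; the key point is that the tensor passed to tf.gradients must be the same lookup tensor that was used to build the loss (a fresh embedding_lookup call would create a new op that is not on the gradient path, so its gradient would be None):

# Keep a handle on the lookup result and differentiate w.r.t. that tensor.
looked_up = tf.nn.embedding_lookup(embeddings, word_ids_1)  # shape [num_words, embedding_dim]
sent_1 = tf.reduce_sum(looked_up, axis=0)
loss = tf.reduce_sum(tf.square(sent_1))

# One gradient row per looked-up word, and no zero rows for the rest
# of the vocabulary.
grad_wrt_used = tf.gradients(loss, looked_up)[0]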

If a tensor feeds into the loss along several paths (for instance, when the same word occurs more than once), tf.gradients sums the contributions; the aggregation_method argument (tf.AggregationMethod) controls how that summation is performed.
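For example (EXPERIMENTAL_ACCUMULATE_N is just one illustrative choice; DEFAULT, ADD_N and EXPERIMENTAL_TREE are also available in TF1):

grads = tf.gradients(loss, looked_up, aggregation_method=tf.AggregationMethod.EXPERIMENTAL_ACCUMULATE_N)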


Source: https://habr.com/ru/post/1693197/

