What is the difference between optimizer.compute_gradients() and tf.gradients() in TensorFlow?

The following code I wrote fails at the line self.optimizer.compute_gradients(self.output, all_variables)

import tensorflow as tf
import tensorlayer as tl
from tensorflow.python.framework import ops
import numpy as np

class Network1():

    def __init__(self):
        ops.reset_default_graph()
        tl.layers.clear_layers_name()

        self.sess = tf.Session()
        self.optimizer = tf.train.AdamOptimizer(learning_rate=0.1)

        self.input_x = tf.placeholder(tf.float32, shape=[None, 784], name="input")

        input_layer = tl.layers.InputLayer(self.input_x)

        relu1 = tl.layers.DenseLayer(input_layer, n_units=800, act=tf.nn.relu, name="relu1")
        relu2 = tl.layers.DenseLayer(relu1, n_units=500, act=tf.nn.relu, name="relu2")

        self.output = relu2.all_layers[-1]
        all_variables = relu2.all_layers

        self.gradient = self.optimizer.compute_gradients(self.output, all_variables)

        init_op = tf.initialize_all_variables()
        self.sess.run(init_op)

with the error

TypeError: Argument is not a tf.Variable: Tensor("relu1/Relu:0", shape=(?, 800), dtype=float32)

However, when I change this line to tf.gradients(self.output, all_variables), the code works, or at least no error is reported. What am I missing? I thought these two methods do the same thing, namely return a list of (gradient, variable) pairs.

2 answers

optimizer.compute_gradients() wraps tf.gradients(). It performs additional assertions, which explains your error: every entry of var_list must be a tf.Variable, but relu2.all_layers contains the layers' output tensors, not variables.
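
A minimal sketch of the fix for the question's code, assuming the TensorLayer 1.x convention that a layer's trainable tf.Variable objects are collected in all_params (tf.trainable_variables() works without that assumption):

    # Pass actual variables, not activation tensors:
    all_variables = relu2.all_params            # TensorLayer's list of weights/biases
    # all_variables = tf.trainable_variables()  # framework-level alternative
    self.gradient = self.optimizer.compute_gradients(self.output, var_list=all_variables)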


I would like to add a small point to the answer above. optimizer.compute_gradients returns a list of (gradient, variable) pairs. The variables are always present, but a gradient can be None: the gradient of the loss with respect to a variable in var_list is None when there is no dependency between them. (If var_list is None, the gradients are taken with respect to all trainable variables.)

On the other hand, tf.gradients returns only a list of sum(dy/dx) gradients, one for each x you pass in. You must pair it with the variable list yourself when applying the gradient update.
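
A quick demo of the two return formats, using a hypothetical two-variable graph in which b deliberately does not feed into the loss:

    import tensorflow as tf

    w = tf.Variable(1.0, name="w")
    b = tf.Variable(2.0, name="b")   # unused in the loss
    loss = 3.0 * w

    opt = tf.train.GradientDescentOptimizer(0.1)

    # List of (gradient, variable) pairs; the unused variable keeps its slot
    # with a None gradient.
    pairs = opt.compute_gradients(loss, var_list=[w, b])
    # -> [(<Tensor>, w), (None, b)]

    # Only the gradients, positionally aligned with the inputs.
    grads = tf.gradients(loss, [w, b])
    # -> [<Tensor>, None]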

Henceforth, the two approaches can be used interchangeably:

    ### Approach 1 ###
    variable_list = desired_list_of_variables
    gradients = optimizer.compute_gradients(loss, var_list=variable_list)
    optimizer.apply_gradients(gradients)

    ### Approach 2 ###
    variable_list = desired_list_of_variables
    gradients = tf.gradients(loss, variable_list)
    optimizer.apply_gradients(zip(gradients, variable_list))
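
One practical reason to prefer Approach 1 is that the (gradient, variable) pairs can be edited before they are applied, for example for gradient clipping. A sketch (the clip norm of 5.0 is an arbitrary choice; None gradients are skipped):

    grads_and_vars = optimizer.compute_gradients(loss, var_list=variable_list)
    clipped = [(tf.clip_by_norm(g, 5.0), v)
               for g, v in grads_and_vars if g is not None]
    train_op = optimizer.apply_gradients(clipped)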
