You are taking the derivative of f with respect to var_f, but f is not a function of var_f; it is a function of var. That is why you get [None]. Now, if you change the code to:
var = tf.Variable(np.ones((5,5)), dtype = tf.float32)
var_f = tf.reshape(var, [-1])
f = tf.reduce_sum(tf.reduce_sum(tf.square(var_f)))
grad = tf.gradients(f, var_f)
print(grad)
the gradient will be defined:
[<tf.Tensor 'gradients_28/Square_32_grad/mul_1:0' shape=(25,) dtype=float32>]
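As a quick sanity check (a plain numpy sketch, not TensorFlow): for f = sum(var_f ** 2) with var_f the flattened array of ones, the analytic gradient is df/dvar_f = 2 * var_f, a length-25 vector of 2s, matching the (25,) shape reported above.

```python
import numpy as np

# Analytic check of the gradient above, assuming f(v) = sum(v_i ** 2),
# so df/dv_i = 2 * v_i for each element.
var = np.ones((5, 5), dtype=np.float32)
var_f = var.reshape(-1)           # shape (25,), like tf.reshape(var, [-1])
grad_wrt_var_f = 2.0 * var_f      # what evaluating the TF gradient yields

print(grad_wrt_var_f.shape)       # (25,)
```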
Compare with the following, where f is computed from var rather than from var_f:
var = tf.Variable(np.ones((5,5)), dtype = tf.float32, name='var')
f = tf.reduce_sum(tf.reduce_sum(tf.square(var)), name='f')
var_f = tf.reshape(var, [-1], name='var_f')
grad_1 = tf.gradients(f, var_f, name='grad_1')
grad_2 = tf.gradients(f, var, name='grad_2')

Here grad_1 will be [None], because f does not depend on var_f, while grad_2 will hold the actual gradient (a (5, 5) tensor of 2s).
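To see why grad_1 comes back as None while grad_2 is defined, here is a toy sketch in plain Python/numpy (the names f_inputs and gradient are illustrative, not TensorFlow API): tf.gradients walks the graph backwards from f, and var_f is simply not on any path feeding f.

```python
import numpy as np

# Minimal dependency illustration, assuming the same setup as above:
# f is built directly from `var`; `var_f` is a separate downstream
# reshape that the computation of f never consumes.
var = np.ones((5, 5), dtype=np.float32)
f_inputs = {'var'}                  # tensors consumed when building f
var_f = var.reshape(-1)             # derived from var, but unused by f

def gradient(target_inputs, wrt_name, value):
    # Mirrors tf.gradients' behavior: None when `wrt` is not on a
    # path into the target; otherwise the analytic d/dv of sum(v**2).
    if wrt_name not in target_inputs:
        return None
    return 2.0 * value

print(gradient(f_inputs, 'var_f', var_f))   # None, like grad_1
print(gradient(f_inputs, 'var', var))       # 2s of shape (5, 5), like grad_2
```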