Theano (python): elementwise gradient

I am trying to compute an elementwise gradient, e.g.

output f(x): a 5-by-1 vector,

with respect to input X: a 5-by-1 vector.

I can do it, for example, like this:

 import numpy as np
 import theano
 import theano.tensor as T

 X = T.vector('X')

 f = X*3

 rfrx, updates = theano.scan(lambda j, f, X: T.grad(f[j], X), sequences=T.arange(X.shape[0]), non_sequences=[f, X])

 fcn_rfrx = theano.function([X], rfrx)

 fcn_rfrx(np.ones(5,).astype('float32'))

and the result is:

array([[ 3.,  0.,  0.,  0.,  0.],
       [ 0.,  3.,  0.,  0.,  0.],
       [ 0.,  0.,  3.,  0.,  0.],
       [ 0.,  0.,  0.,  3.,  0.],
       [ 0.,  0.,  0.,  0.,  3.]], dtype=float32)

but since this is inefficient, I want to get just a 5-by-1 vector as the result,

by doing something like:

 rfrx, updates = theano.scan(lambda j, f, X: T.grad(f[j], X[j]), sequences=T.arange(X.shape[0]), non_sequences=[f, X])

which does not work.

Is there any way to do this? (Sorry for the poor formatting; I'm new here and still learning.)


I have added a clearer example:

given the input vector: x[1], x[2], ..., x[n]

and the output vector: y[1], y[2], ..., y[n],

where y[i] = f(x[i]).

I want to compute only

df(x[i])/dx[i],

but not

df(x[i])/dx[j] for i ≠ j,

for computational efficiency (n, the amount of data, is > 10000).
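Since f is applied elementwise, the Jacobian is diagonal, so the gradient of the *sum* of the outputs recovers exactly the per-element derivatives df(x[i])/dx[i] in a single pass. A minimal NumPy sketch (not Theano; it just checks this identity numerically with the f(x) = 3x example from above):

```python
import numpy as np

def f(x):
    return 3.0 * x  # elementwise function from the question

x = np.ones(5)
eps = 1e-6

# Full n-by-n Jacobian by central finite differences (the slow way)
J = np.empty((5, 5))
for j in range(5):
    d = np.zeros(5)
    d[j] = eps
    J[:, j] = (f(x + d) - f(x - d)) / (2 * eps)

# Gradient of sum(f(x)) by finite differences (one vector, the fast way)
g = np.empty(5)
for j in range(5):
    d = np.zeros(5)
    d[j] = eps
    g[j] = (f(x + d).sum() - f(x - d).sum()) / (2 * eps)

# Because f is elementwise, off-diagonal Jacobian entries are zero,
# so the gradient of the sum equals the diagonal df(x[i])/dx[i]
print(np.allclose(np.diag(J), g))  # True
```

In Theano terms this corresponds to `T.grad(f.sum(), X)`, which returns the 5-element vector of derivatives directly, with no `scan` over rows.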

You can use theano.tensor.jacobian:

import theano
import theano.tensor as T

x = T.fvector()
p = T.as_tensor_variable([(x ** i).sum() for i in range(5)])

j = T.jacobian(p, x)

f = theano.function([x], [p, j])

In [31]: f([1., 2., 3.])
Out[31]: 
[array([  3.,   6.,  14.,  36.,  98.], dtype=float32),
 array([[   0.,    0.,    0.],
        [   1.,    1.,    1.],
        [   2.,    4.,    6.],
        [   3.,   12.,   27.],
        [   4.,   32.,  108.]], dtype=float32)]
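The printed values can be verified by hand: the derivative of p[i] = sum_k x[k]**i with respect to x[j] is i * x[j]**(i - 1). A quick NumPy check of the output above:

```python
import numpy as np

x = np.array([1., 2., 3.])

# p[i] = sum over k of x[k]**i, as defined in the answer
p = np.array([(x ** i).sum() for i in range(5)])

# Row i of the Jacobian: d p[i] / d x[j] = i * x[j]**(i - 1)
J = np.array([i * x ** (i - 1) for i in range(5)])

print(p)  # [ 3.  6. 14. 36. 98.]
print(J)  # matches the Jacobian printed by the Theano function
```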

If, on the other hand, you only need the gradient with respect to one particular element, you can pull that element out into its own scalar variable, concatenate it with the rest, and differentiate with respect to the scalar. (Note that `T.grad` requires a scalar cost, which is why `e` is a sum here.)

x = T.fscalar()
y = T.fvector()
z = T.concatenate([x.reshape((1,)), y.reshape((-1,))])

e = (z ** 2).sum()
g = T.grad(e, wrt=x)

ff = theano.function([x, y], [e, g])
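For this split-variable example, `g` is simply de/dx = 2x, since e = x² + Σ y[i]². A NumPy finite-difference check of that claim (illustrative only, not part of the Theano answer):

```python
import numpy as np

def e(x, y):
    # same cost as in the Theano snippet: sum of squares of concat([x], y)
    z = np.concatenate([[x], y])
    return (z ** 2).sum()

x, y = 2.0, np.array([1.0, 3.0])
eps = 1e-6

g_analytic = 2 * x  # de/dx, holding y fixed
g_fd = (e(x + eps, y) - e(x - eps, y)) / (2 * eps)

print(abs(g_analytic - g_fd) < 1e-5)  # True
```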

Source: https://habr.com/ru/post/1618134/

