How to get node gradients using MXNet.jl in Julia?

I am trying to replicate the following example from the MXNet core documentation using MXNet.jl in Julia:

A = Variable('A')
B = Variable('B')
C = B * A
D = C + Constant(1)
# get gradient node.
gA, gB = D.grad(wrt=[A, B])
# compiles the gradient function.
f = compile([gA, gB])
grad_a, grad_b = f(A=np.ones(10), B=np.ones(10)*2)

This example shows how to automatically differentiate a symbolic expression and obtain its gradients.
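For reference, the gradients this example should produce can be worked out by hand, independent of any MXNet API: since D = B*A + 1 elementwise, the gradient of D with respect to A is B, and with respect to B is A. A minimal NumPy sketch (just the math, not MXNet) that checks this with a finite difference:

```python
import numpy as np

# The toy expression from the question: C = B * A, D = C + 1 (elementwise).
A = np.ones(10)
B = np.ones(10) * 2

def D(a, b):
    return b * a + 1.0

# By the product rule, d(sum(D))/dA = B and d(sum(D))/dB = A.
grad_a = B.copy()
grad_b = A.copy()

# Sanity-check one coordinate against a finite difference.
eps = 1e-6
a_pert = A.copy()
a_pert[0] += eps
fd = (D(a_pert, B).sum() - D(A, B).sum()) / eps
print(np.allclose(fd, grad_a[0], atol=1e-4))  # → True
```

With A all ones and B all twos, grad_a is all 2s and grad_b is all 1s, which is what the compiled gradient function in the example is expected to return.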

What is the equivalent in MXNet.jl (latest version as of 2016-03-07)?

1 answer

The code in MXNet.jl/src/symbolic-node.jl can help you find the answer.

I am not familiar with this package, but here is my guess:

A = mx.Variable("A")
B = mx.Variable("B")
C = B .* A
D = C + 1

mx.normalized_gradient may be the solution for the rest, if it exists.
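At that point in its history, MXNet.jl did not expose a standalone gradient-node API like the pseudocode in the question; gradients were typically obtained by binding the symbol to an executor and running a backward pass. The following is a sketch under that assumption, based on the executor API (simple_bind, forward, backward, grad_arrays) visible in the package source around that time; exact names and keyword arguments may differ between versions, so treat it as a starting point rather than verified code:

```julia
using MXNet

A = mx.Variable(:A)
B = mx.Variable(:B)
C = B .* A
D = C + 1

# Bind the symbol to concrete input shapes; by default the executor
# allocates gradient buffers for each argument.
exec = mx.simple_bind(D, mx.cpu(), A=(10,), B=(10,))

# Fill the inputs.
copy!(exec.arg_dict[:A], ones(Float32, 10))
copy!(exec.arg_dict[:B], 2 * ones(Float32, 10))

mx.forward(exec)
# Seed the output gradient with ones and run the backward pass.
mx.backward(exec, mx.ones(size(exec.outputs[1])))

# grad_arrays holds one gradient buffer per argument, in argument order.
grad_A = copy(exec.grad_arrays[1])  # mathematically this should equal B
grad_B = copy(exec.grad_arrays[2])  # mathematically this should equal A
```

This mirrors the Python example's f(A=..., B=...) call: the executor plays the role of the compiled function, and the gradient buffers take the place of the returned gA, gB values.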


Source: https://habr.com/ru/post/1244561/

