Theano builds graphs for the expressions it computes before evaluating them. By passing a Theano variable, such as "x" in the example, to initialize the logistic regression object, you create several expressions inside that object, such as p_y_given_x, which are Theano expressions depending on x. These are later used to compute the symbolic gradient.
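For context, here is a minimal sketch of what such a class might look like (the names, sizes, and zero initialization are assumptions modelled on the DL tutorials, not the exact tutorial code):

    import numpy
    import theano
    import theano.tensor as T

    class LogisticRegression(object):
        def __init__(self, input, n_in, n_out):
            # W and b live in shared variables so Theano can update them
            # (and keep them on the GPU)
            self.W = theano.shared(numpy.zeros((n_in, n_out), dtype=theano.config.floatX), name='W')
            self.b = theano.shared(numpy.zeros((n_out,), dtype=theano.config.floatX), name='b')
            # p_y_given_x is not a number yet: it is a symbolic expression in `input`
            self.p_y_given_x = T.nnet.softmax(T.dot(input, self.W) + self.b)

    x = T.matrix('x')
    lr = LogisticRegression(input=x, n_in=784, n_out=10)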
To get a better understanding, you can do the following:

    from theano import pp
    print pp(lr.p_y_given_x)
This should give you a result like
softmax( W \dot x + b)
And if you go ahead and try

    print pp(T.grad(lr.p_y_given_x.sum(), x))  # T.grad needs a scalar, hence the .sum(); you may need to tweak this

you will see how Theano represents the expression internally. You can then use these expressions to build Theano functions, such as
    # put the data in a shared variable, then substitute it for x via `givens`
    values = theano.shared(value=mydata, name='values')
    f = theano.function([], lr.p_y_given_x, givens={x: values}, on_unused_input='ignore')
    print f()
Calling f then gives you the predicted class probabilities for the data in mydata. The way to do this in Theano (and the way it is done in the DL tutorials) is to pass the Theano variable x as a "dummy" input and then use the "givens" keyword to substitute a shared variable containing your data. This matters because storing your data in a shared variable allows Theano to use your GPU for the matrix operations.
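Putting the pieces together, here is a sketch of how you might also turn the symbolic gradient into a callable function in the same style (the negative-log-likelihood cost and the mylabels/labels names are assumptions, not part of the example above):

    y = T.ivector('y')  # hypothetical label variable
    # T.grad needs a scalar cost, so build the usual negative log likelihood
    cost = -T.mean(T.log(lr.p_y_given_x)[T.arange(y.shape[0]), y])
    g_W = T.grad(cost, lr.W)

    labels = theano.shared(numpy.asarray(mylabels, dtype='int32'), name='labels')
    grad_f = theano.function([], g_W, givens={x: values, y: labels})
    print grad_f()  # gradient of the cost w.r.t. W, evaluated on mydata

Here both x and y are substituted through "givens", so the compiled function takes no arguments at all and reads everything from the shared variables.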