I am trying to implement a custom RNN layer in Keras, following the approach explained in this link, which essentially shows how to inherit from the existing RNN classes. However, the hidden-state update in my formulation is slightly different: h(t) = tanh(W·x + U·h(t-1) + V·r(t) + b), and I am a little confused. Here r(t) = f(x, p(t)) is a function of x, a fixed input distributed over time, and p(t) = O(t-1)·alpha + p(t-1), where O(t) is the softmax output of each RNN cell.
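To make the recurrence concrete, here is a single timestep written out in plain NumPy. The exact form of f, and taking O(t) to be the softmax of the cell output, are placeholders on my part:

import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def step(x, h_prev, p_prev, O_prev, W, U, V, b, alpha, f):
    p = O_prev * alpha + p_prev        # p(t) = O(t-1)·alpha + p(t-1)
    r = f(x, p)                        # r(t) = f(x, p(t)); f must return a units-sized vector
    h = np.tanh(W @ x + U @ h_prev + V @ r + b)
    O = softmax(h)                     # O(t): softmax output of the cell (assumed form)
    return h, p, O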
I think that after calling super(customRNN, self).step inside the inherited step function, the standard h(t) should be overridden by my definition of h(t). However, I am not sure how to change the states, or get_constants, and whether I need to modify any other parts of the Recurrent and SimpleRNN classes in Keras. My understanding is that get_constants returns the dropout matrices as additional states passed to the step function, so I assume at least one extra state is needed to carry p(t) for the V·r(t) term in my equations.
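To show what I am after, here is a minimal sketch of the cell, written against the cell API of more recent Keras versions (keras.layers.RNN wrapping a custom cell that carries p(t) as an extra state) rather than the step/get_constants internals. The name CustomRNNCell, taking O(t-1) as softmax(h(t-1)), and the stub for f are all my placeholders:

import tensorflow as tf
from tensorflow import keras

class CustomRNNCell(keras.layers.Layer):
    """One timestep of h(t) = tanh(W·x + U·h(t-1) + V·r(t) + b).

    Assumed placeholders (not settled parts of the question):
    - O(t-1) is taken as softmax(h(t-1))
    - f(x, p) is stubbed as tanh(x·W_r + p)
    """

    def __init__(self, units, alpha=0.1, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.alpha = alpha
        self.state_size = [units, units]  # carries [h, p] between steps

    def build(self, input_shape):
        input_dim = input_shape[-1]
        self.W = self.add_weight(shape=(input_dim, self.units), name="W")
        self.U = self.add_weight(shape=(self.units, self.units), name="U")
        self.V = self.add_weight(shape=(self.units, self.units), name="V")
        self.W_r = self.add_weight(shape=(input_dim, self.units), name="W_r")
        self.b = self.add_weight(shape=(self.units,), initializer="zeros", name="b")

    def call(self, inputs, states):
        h_prev, p_prev = states
        o_prev = tf.nn.softmax(h_prev)                # O(t-1), assumed form
        p = o_prev * self.alpha + p_prev              # p(t) = O(t-1)·alpha + p(t-1)
        r = tf.tanh(tf.matmul(inputs, self.W_r) + p)  # stand-in for r(t) = f(x, p(t))
        h = tf.tanh(tf.matmul(inputs, self.W)
                    + tf.matmul(h_prev, self.U)
                    + tf.matmul(r, self.V)
                    + self.b)
        return h, [h, p]

# The RNN wrapper runs the time loop and zero-fills the initial [h, p] states:
layer = keras.layers.RNN(CustomRNNCell(32))
outputs = layer(tf.random.normal((8, 10, 16)))  # (batch, time, features)

Declaring state_size as a list is what lets the cell thread p(t) through the time loop alongside h(t), which is the role I assumed the extra state from get_constants would play in the old API.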
I recently started using Keras, and I could not find many references on defining custom Keras layers. Sorry if my question is a bit overloaded with sub-questions; I just wanted to make sure I did not miss anything. Thanks!