- How exactly does TensorFlow apply dropout when calling tf.nn.rnn_cell.DropoutWrapper()?
Everything I have read about applying dropout to RNNs refers to the paper by Zaremba et al., which states that dropout should not be applied between recurrent connections. Units should be randomly dropped before or after the LSTM layers, but not on the recurrent (timestep-to-timestep) connections. Fine.
- The question I have is: how are neurons dropped with respect to time?
In the paper everyone cites, it seems that a random "drop mask" is sampled at each timestep, rather than sampling one random "drop mask" and reusing it across all the timesteps of the layer being dropped out, then sampling a new "drop mask" for the next batch.
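To make the two alternatives concrete, here is a small NumPy sketch of the difference (illustrative only, not TensorFlow's implementation): sampling a fresh mask at every timestep versus sampling one mask and reusing it at every timestep of the sequence. The shapes and the inverted-dropout scaling by `1/keep_prob` are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
keep_prob = 0.8
T, H = 5, 4  # timesteps, hidden units (illustrative sizes)

# Alternative A: sample a fresh dropout mask at every timestep.
# Different units are zeroed at different steps.
per_step_masks = (rng.random((T, H)) < keep_prob) / keep_prob

# Alternative B ("per-sequence" mask): sample ONE mask and reuse it
# at every timestep of this layer; resample only for the next batch.
shared_mask = (rng.random((1, H)) < keep_prob) / keep_prob
reused_masks = np.repeat(shared_mask, T, axis=0)

# With the reused mask, the same units are dropped at every step:
assert all((reused_masks[t] == reused_masks[0]).all() for t in range(T))
```

Applying either mask array elementwise to the layer's per-timestep activations gives the two dropout schemes the question distinguishes.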
Also, and perhaps more importantly, how does TensorFlow do this? I have checked the TensorFlow API documentation and tried to find a detailed explanation, but haven't found one yet.
- Is there a way to dig into the actual TensorFlow source code to check?