In Keras, what exactly am I configuring when I create a stateful LSTM layer with N `units`?

The first argument of an ordinary Dense layer is also units, and it is the number of neurons/nodes in that layer. A standard LSTM unit, however, looks like the following:

[Image: diagram of a standard LSTM unit]

(This is a reworked version of "Understanding LSTM Networks".)

In Keras, when I create an LSTM object like LSTM(units=N, ...), am I actually creating N of these LSTM units? Or is it the size of the "neural network" layers inside the LSTM unit, i.e. the W's in the formulas? Or something else?

For context, I'm working from this example code.

Here is the documentation: https://keras.io/layers/recurrent/

It says:

units: Positive integer, dimensionality of the output space.

That makes me think it is the number of outputs from the Keras LSTM "layer" object, meaning the next layer will have N inputs. Does that mean there actually exist N of these LSTM units in the LSTM layer, or that exactly one LSTM unit is run for N iterations, outputting N of these h[t] values from, say, h[t-N] up to h[t]?

If it only defines the number of outputs, can the input still be, say, just a single value, or do we have to manually create lagged input variables x[t-N] to x[t], one for each LSTM unit defined by the units=N argument?

As I write this, it occurs to me what the argument return_sequences does. If set to True, all N outputs are passed forward to the next layer, while if set to False only the last h[t] output is passed to the next layer. Am I right?


You can check this question for more details, although it is based on the Keras-1.x API.

Basically, unit is the dimension of the inner cells of the LSTM. Because in an LSTM the dimension of the inner cell (C_t and C_{t-1} in the diagram), of the output mask (o_t in the diagram) and of the hidden/output state (h_t in the diagram) must all be the SAME, your output's dimension has to be unit-length as well.
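To make that shared unit-length dimension concrete, here is a minimal sketch (the sizes N=8, timespan=5, input_dim=3 are arbitrary picks, not from this answer) that prints the weight shapes of a Keras LSTM layer; each weight carries a factor of 4*N because the input, forget, cell and output gates are stacked together:

from keras.models import Sequential
from keras.layers import LSTM

N, timespan, input_dim = 8, 5, 3

model = Sequential()
model.add(LSTM(N, input_shape=(timespan, input_dim)))

# kernel:           (input_dim, 4*N) -- input weights for the four gates
# recurrent_kernel: (N, 4*N)         -- weights applied to h[t-1]
# bias:             (4*N,)
for w in model.layers[0].get_weights():
    print(w.shape)   # (3, 32), (8, 32), (32,)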

An LSTM in Keras defines exactly one LSTM block, whose cells are of unit-length. If you set return_sequence=True, it returns something of shape (batch_size, timespan, unit). If False, it returns only the last output, of shape (batch_size, unit).

As for the input, you should provide a value for every time step. Basically, the shape is (batch_size, timespan, input_dim), where input_dim can differ from unit. If you only want to provide input at the first time step, you can simply pad your data with zeros at the other steps.
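A minimal sketch checking these shapes (the sizes batch_size=2, timespan=5, input_dim=3, unit=4 are arbitrary choices, not from the answer above):

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM

batch_size, timespan, input_dim, unit = 2, 5, 3, 4

x = np.random.random((batch_size, timespan, input_dim))

# return_sequences=False (the default): only the last h[t] is returned
m = Sequential([LSTM(unit, input_shape=(timespan, input_dim))])
print(m.predict(x).shape)   # (2, 4) == (batch_size, unit)

# return_sequences=True: h[t] is returned for every time step
m = Sequential([LSTM(unit, input_shape=(timespan, input_dim), return_sequences=True)])
print(m.predict(x).shape)   # (2, 5, 4) == (batch_size, timespan, unit)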


"Does that mean there actually exist N of these LSTM units in the LSTM layer, or that exactly one LSTM unit is run for N iterations, outputting N of these h[t] values from, say, h[t-N] up to h[t]?"

The first is true. In that Keras LSTM layer there are N LSTM units, or cells.

keras.layers.LSTM(units, activation='tanh', recurrent_activation='hard_sigmoid', use_bias=True, kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal', bias_initializer='zeros', unit_forget_bias=True, kernel_regularizer=None, recurrent_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, recurrent_constraint=None, bias_constraint=None, dropout=0.0, recurrent_dropout=0.0, implementation=1, return_sequences=False, return_state=False, go_backwards=False, stateful=False, unroll=False) 

If you create an LSTM layer with a single cell, you end up with this: [Image: diagram of a single LSTM cell]. And this would be your model:

from keras.models import Sequential
from keras.layers import LSTM

N = 1
model = Sequential()
model.add(LSTM(N))  # one LSTM cell; add input_shape=(timesteps, input_dim) to build it immediately

For N > 1 it looks like this: [Image: diagram of an LSTM layer with N cells side by side]
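One way to see the effect of N is the trainable-parameter count, which for a Keras LSTM layer is 4*(N*input_dim + N*N + N). A small sketch assuming input_dim=1 (an arbitrary choice): N=1 gives 12 parameters, N=3 gives 60.

from keras.models import Sequential
from keras.layers import LSTM

for N in (1, 3):
    model = Sequential()
    model.add(LSTM(N, input_shape=(None, 1)))   # input_dim=1, any number of time steps
    print(N, model.count_params())              # 1 -> 12, 3 -> 60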

