Does Keras LSTM support dynamic sentence length?

Since I have seen people arguing on some platforms that Keras LSTM does not support dynamic sentence lengths, I wrote the following code.

from keras.models import Sequential
from keras.layers import LSTM

embedding_size = 100
model = Sequential()
model.add(LSTM(32, return_sequences=True, input_shape=(None, embedding_size)))  # None = variable timesteps

And it works great on two inputs val1 and val2 of different lengths (each input has shape batch_size * sentence_length * embedding_size):

import numpy as np
import tensorflow as tf

val1 = np.random.random((5, 20, embedding_size))
val2 = np.random.random((5, 10, embedding_size))
input = model.input
output = model.output

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Feed batches with different sentence lengths through the same graph
    k1 = sess.run(output, feed_dict={input: val1})
    k2 = sess.run(output, feed_dict={input: val2})
    print(k1.shape)
    print(k2.shape)

This prints the following output, which matches my expectation that Keras LSTM handles a dynamic input length when input_shape is set to (None, embedding_size). Do I understand this correctly?

(5, 20, 32)

(5, 10, 32)
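The same setup should also accept a different sentence length per training batch. Here is a minimal sketch reusing the model defined above (the adam optimizer, mse loss, and random targets are arbitrary illustrative choices, not part of the original test):

model.compile(optimizer='adam', loss='mse')

# Only lengths *within* a batch must match; each batch may use its own length.
model.train_on_batch(np.random.random((5, 20, embedding_size)),
                     np.random.random((5, 20, 32)))
model.train_on_batch(np.random.random((5, 10, embedding_size)),
                     np.random.random((5, 10, 32)))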

1 answer

Yes, you understand correctly: dynamic sentence length is supported in Keras. The restriction is that all sequences within a single batch must have the same length; the length may vary freely from batch to batch.

That is why, in practice, variable-length sentences are usually grouped into batches by length, or padded/truncated to a common length.
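For the padding approach, a minimal sketch (the manual zero-padding and the Masking layer are standard Keras idioms, not code from the question) that packs two sentences of different lengths into one batch and masks the padded steps:

import numpy as np
from keras.models import Sequential
from keras.layers import Masking, LSTM

embedding_size = 100

# Two sentences of different lengths, each (timesteps, embedding_size).
sentences = [np.random.random((20, embedding_size)),
             np.random.random((10, embedding_size))]

# Zero-pad every sentence to the length of the longest one.
max_len = max(s.shape[0] for s in sentences)
batch = np.zeros((len(sentences), max_len, embedding_size))
for i, s in enumerate(sentences):
    batch[i, :s.shape[0], :] = s

# Masking makes the LSTM skip timesteps whose features all equal mask_value.
masked_model = Sequential()
masked_model.add(Masking(mask_value=0.0, input_shape=(max_len, embedding_size)))
masked_model.add(LSTM(32, return_sequences=True))

print(masked_model.predict(batch).shape)  # (2, max_len, 32)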


Source: https://habr.com/ru/post/1674235/

