Bidirectional LSTM for variable length sequences in Tensorflow

I want to train a bidirectional LSTM in TensorFlow to perform sequence classification (sentiment classification).

Since the sequences have variable lengths, batches are padded with zero vectors. For a unidirectional RNN I normally pass the sequence_length parameter so that the network does not train on the padding vectors, as in the sketch below.
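A minimal sketch of what I mean for the unidirectional case (TF 1.x style; the placeholder names, embedding size, and cell size are just placeholders for illustration):

```python
import tensorflow as tf

# Hypothetical shapes: padded sequences [batch_size, max_time, embed_dim]
# and a vector with the true length of each sequence.
inputs = tf.placeholder(tf.float32, [None, None, 128], name="inputs")
seq_len = tf.placeholder(tf.int32, [None], name="seq_len")

cell = tf.nn.rnn_cell.LSTMCell(num_units=64)

# sequence_length stops stepping the cell at each sequence's true length;
# outputs past that point are zeros and the returned state is the state
# at the last real time step, so the padding is never trained on.
outputs, state = tf.nn.dynamic_rnn(
    cell, inputs, sequence_length=seq_len, dtype=tf.float32)
```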

How can this be handled with a bidirectional LSTM? Does the sequence_length parameter automatically make the backward direction start from the end of the actual sequence, i.e. skip the padding?

thanks

1 answer

bidirectional_dynamic_rnn also has a sequence_length parameter that takes care of variable-length sequences: the backward pass reverses each example only up to its true length, so neither direction is run over the zero padding.

From the documentation (https://www.tensorflow.org/api_docs/python/tf/nn/bidirectional_dynamic_rnn):

sequence_length: an int32/int64 vector, size [batch_size], containing the actual lengths for each of the sequences.
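A minimal sketch of how this fits together (TF 1.x style; the placeholder names, dimensions, and the two-class output layer are assumptions for illustration, not part of the question):

```python
import tensorflow as tf

# Hypothetical placeholders: padded inputs [batch_size, max_time, embed_dim]
# and the true length of each sequence.
inputs = tf.placeholder(tf.float32, [None, None, 128], name="inputs")
seq_len = tf.placeholder(tf.int32, [None], name="seq_len")

fw_cell = tf.nn.rnn_cell.LSTMCell(num_units=64)
bw_cell = tf.nn.rnn_cell.LSTMCell(num_units=64)

# Passing sequence_length makes the backward pass reverse only the first
# seq_len[i] steps of each example, so the reverse direction starts at the
# true end of the sequence rather than at the padding.
(out_fw, out_bw), (state_fw, state_bw) = tf.nn.bidirectional_dynamic_rnn(
    fw_cell, bw_cell, inputs, sequence_length=seq_len, dtype=tf.float32)

# For sentiment classification, one option is to concatenate the final
# hidden states of both directions and feed them to a small output layer.
final = tf.concat([state_fw.h, state_bw.h], axis=-1)
logits = tf.layers.dense(final, units=2)  # e.g. positive / negative
```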

Here you can see an example: https://github.com/Franck-Dernoncourt/NeuroNER/blob/master/src/entity_lstm.py

