How to encode a sequence-to-sequence RNN in Keras?

I am trying to write a sequence-to-sequence RNN in Keras, based on what I understood from the Internet. First I tokenized the text, converted it to sequences, and padded them to form the feature variable X. The target variable Y was obtained by shifting x one position to the left and then padding it. Finally, I passed the feature and target variables to my LSTM model.
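As a minimal sketch of the shifting idea (plain Python, independent of Keras), the target at each position is simply the next element of the input sequence; note that a rotation wraps the first element around to the end:

```python
def shift(seq, n):
    """Rotate a list n positions to the left."""
    n = n % len(seq)
    return seq[n:] + seq[:n]

x = [1, 2, 3, 4, 5]   # encoded input sequence
y = shift(x, 1)       # target: each element is the next one in x
print(y)              # [2, 3, 4, 5, 1]
```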

This is the code I wrote in Keras for this purpose:

    from keras.preprocessing.text import Tokenizer, base_filter
    from keras.preprocessing.sequence import pad_sequences
    from keras.models import Sequential
    from keras.layers import Dense, Activation, Dropout, Embedding
    from keras.layers import LSTM

    def shift(seq, n):
        n = n % len(seq)
        return seq[n:] + seq[:n]

    txt = "abcdefghijklmn" * 100

    tk = Tokenizer(nb_words=2000, filters=base_filter(), lower=True, split=" ")
    tk.fit_on_texts(txt)
    x = tk.texts_to_sequences(txt)

    # shifting to left
    y = shift(x, 1)

    # padding sequence
    max_len = 100
    max_features = len(tk.word_counts)
    X = pad_sequences(x, maxlen=max_len)
    Y = pad_sequences(y, maxlen=max_len)

    # lstm model
    model = Sequential()
    model.add(Embedding(max_features, 128, input_length=max_len, dropout=0.2))
    model.add(LSTM(128, dropout_W=0.2, dropout_U=0.2))
    model.add(Dense(max_len))
    model.add(Activation('softmax'))
    model.compile(loss='binary_crossentropy', optimizer='rmsprop')
    model.fit(X, Y, batch_size=200, nb_epoch=10)

The problem is that it throws an error:

    Epoch 1/10
    IndexError: index 14 is out of bounds for size 14
    Apply node that caused the error: AdvancedSubtensor1(if{inplace}.0, Reshape{1}.0)
    Toposort index: 80
1 answer

The problem is this:

 model.add(Embedding(max_features, 128, input_length=max_len, dropout=0.2)) 

In the documentation, you can see that the first argument passed to `Embedding` should be set to the vocabulary size + 1. This is because index 0 is always reserved for the null (padding) word. Because of this, you need to change this line to:

 model.add(Embedding(max_features + 1, 128, input_length=max_len, dropout=0.2)) 
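The off-by-one can be illustrated without Keras. `Tokenizer` assigns word indices starting from 1, so with 14 distinct characters the largest index is 14, and a lookup table with only 14 rows (indices 0..13) cannot hold it, which is exactly the `index 14 is out of bounds for size 14` in the traceback. A minimal NumPy sketch, with the sizes taken from the question:

```python
import numpy as np

vocab_size = 14                          # len(tk.word_counts) in the question
indices = np.arange(1, vocab_size + 1)   # Tokenizer indices start at 1

too_small = np.zeros((vocab_size, 128))       # rows 0..13 only
big_enough = np.zeros((vocab_size + 1, 128))  # rows 0..14, row 0 reserved for padding

try:
    _ = too_small[indices]               # fails: index 14 is out of bounds for size 14
except IndexError as e:
    print(e)

_ = big_enough[indices]                  # works: every index 1..14 has a row
```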

Source: https://habr.com/ru/post/1014548/

