A layer called with an input that is not a Keras symbolic tensor

I am trying to feed the output of one layer into two different layers and then combine them. However, I am stopped by this error, which tells me that my input is not a symbolic tensor:

Received type: <class 'keras.layers.recurrent.LSTM'>. All inputs to the layers should be tensors. 

However, I believe that I am closely following the documentation: https://keras.io/getting-started/functional-api-guide/#multi-input-and-multi-output-models

and I am not sure what I am doing wrong.

    net_input = Input(shape=(maxlen, len(chars)), name='net_input')
    lstm_out = LSTM(128, input_shape=(maxlen, len(chars)))
    book_out = Dense(len(books), activation='softmax', name='book_output')(lstm_out)
    char_out = Dense(len(chars-4), activation='softmax', name='char_output')(lstm_out)
    x = keras.layers.concatenate([book_out, char_out])
    net_output = Dense(len(chars)+len(books), activation='sigmoid', name='net_output')
    model = Model(inputs=[net_input], outputs=[net_output])

Thanks.

1 answer

It looks like you are not actually passing any input to your LSTM layer. You specify the number of recurrent units and the input shape, but you never call the layer on an input tensor, so lstm_out is the layer object itself rather than a symbolic output tensor. Try:

 lstm_out = LSTM(128, input_shape=(maxlen, len(chars)))(net_input) 
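For completeness, below is a minimal end-to-end sketch of the corrected model, assuming maxlen, chars, and books are defined as in the question (placeholder values are used here so the snippet runs on its own, and the layer sizes and the "-4" on the character head simply mirror the original code). Note that the final Dense layer also has to be called on the concatenated tensor x, not just constructed, and that input_shape is unnecessary once the layer is called on an Input tensor, since the shape is inferred.

    # Minimal sketch of the corrected model; maxlen, chars and books are
    # placeholder assumptions standing in for the question's real data.
    import keras
    from keras.layers import Input, LSTM, Dense
    from keras.models import Model

    maxlen = 40
    chars = list('abcdefgh')           # placeholder character vocabulary
    books = ['book1', 'book2']         # placeholder book labels

    net_input = Input(shape=(maxlen, len(chars)), name='net_input')

    # Call the LSTM layer on the input tensor; this is what yields a symbolic tensor.
    lstm_out = LSTM(128)(net_input)

    # Two output heads branching off the shared LSTM output.
    book_out = Dense(len(books), activation='softmax', name='book_output')(lstm_out)
    char_out = Dense(len(chars) - 4, activation='softmax', name='char_output')(lstm_out)

    # Merge the heads and apply the final layer to the merged tensor.
    x = keras.layers.concatenate([book_out, char_out])
    net_output = Dense(len(chars) + len(books), activation='sigmoid', name='net_output')(x)

    model = Model(inputs=[net_input], outputs=[net_output])
    model.summary()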

Source: https://habr.com/ru/post/1269380/

