How to tie embedding weights and softmax weights in Keras?

It is common for various neural network architectures in NLP and vision-language problems to tie the weights of the input embedding layer to those of the output softmax layer. This usually improves the quality of sentence generation. (see example here )

In Keras, word embedding layers are typically implemented with the Embedding class, but there seems to be no easy way to tie this layer's weights to the output softmax. Does anyone know how this can be done?

1 answer

As you can read here, you should just set the trainable flag to False. For instance:

aux_output = Embedding(..., trainable=False)(input)
...
output = Dense(nb_of_classes, ..., activation='softmax', trainable=False)
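Note that trainable=False only freezes a layer's weights; it does not make two layers share the same matrix. One way to actually tie the output projection to the embedding matrix is a small custom layer that multiplies the hidden states by the transpose of the embedding weights. This is a sketch, not the answer's method; the sizes, layer names, and the TiedSoftmax class are illustrative:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Illustrative sizes -- not from the original post
vocab_size, embed_dim, seq_len = 50, 16, 10

class TiedSoftmax(keras.layers.Layer):
    """Projects hidden states onto the vocabulary by reusing the
    embedding matrix (transposed), so no second weight matrix exists."""
    def __init__(self, tied_embedding, **kwargs):
        super().__init__(**kwargs)
        self.tied_embedding = tied_embedding  # the Embedding layer to share with

    def call(self, inputs):
        # inputs: (batch, seq, embed_dim); embeddings: (vocab, embed_dim)
        logits = tf.matmul(inputs, self.tied_embedding.embeddings,
                           transpose_b=True)
        return tf.nn.softmax(logits)

tokens = keras.Input(shape=(seq_len,), dtype="int32")
embedding = keras.layers.Embedding(vocab_size, embed_dim)
hidden = keras.layers.LSTM(embed_dim, return_sequences=True)(embedding(tokens))
probs = TiedSoftmax(embedding)(hidden)
model = keras.Model(tokens, probs)

# Sanity check: per-token probability distributions over the vocabulary
demo = np.random.randint(0, vocab_size, size=(2, seq_len))
probs_np = np.asarray(model(demo))
```

Because the projection reads the Embedding layer's variable directly, gradients from the softmax flow into the shared matrix and there is only one copy of the vocabulary weights in the model.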

Source: https://habr.com/ru/post/1688680/

