Keras: use the same layer in different models (shared weights)

Quick response:

It is actually very simple. Here is the code, for those who do not want to read the whole question:

from keras.layers import Input, Dense
from keras.models import Model

inputs = Input((784,))
encode = Dense(10, input_shape=[784])(inputs)
decode = Dense(784, input_shape=[10])

model = Model(input=inputs, output=decode(encode))

inputs_2 = Input((10,))
decode_model = Model(input=inputs_2, output=decode(inputs_2))

With this setup, decode_model uses the very same decode layer (and therefore the same weights) as model. If you train model, you also train decode_model.
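Why sharing the layer object is enough can be shown without Keras at all. Below is a minimal numpy sketch (the `TinyDense` class is hypothetical, just a stand-in for a Dense layer): both "models" call the same object, so an update to its weights is visible to both.

```python
import numpy as np

# A minimal stand-in for a Dense layer: y = x @ W + b.
# (Hypothetical toy class, only to illustrate weight sharing.)
class TinyDense:
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_in, n_out))
        self.b = np.zeros(n_out)

    def __call__(self, x):
        return x @ self.W + self.b

decode = TinyDense(10, 784)

# "model" and "decode_model" both call the SAME layer object,
# so they read the same W and b.
x = np.ones((1, 10))
y1 = decode(x)   # as used inside "model"
y2 = decode(x)   # as used inside "decode_model"
assert np.allclose(y1, y2)

# Updating the shared weights (as training "model" would do)
# changes the output of both models.
decode.W += 1.0
y3 = decode(x)
assert not np.allclose(y1, y3)
```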

Actual question:

I am trying to create a simple autoencoder for MNIST in Keras.

This is the code so far:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
encode = Dense(10, input_shape=[784])
decode = Dense(784, input_shape=[10])

model.add(encode)
model.add(decode)


model.compile(loss="mse",
             optimizer="adadelta",
             metrics=["accuracy"])

decode_model=Sequential()
decode_model.add(decode)

I train it to learn the identity function:

model.fit(X_train,X_train,batch_size=50, nb_epoch=10, verbose=1, 
          validation_data=[X_test, X_test])
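X_train and X_test are not defined in the snippet. The usual preparation flattens each 28×28 image to a 784-vector and scales pixels to [0, 1]; with real data you would get the arrays from `mnist.load_data()` in `keras.datasets`. The sketch below uses a small random stand-in array (hypothetical) so it runs without downloading MNIST:

```python
import numpy as np

# Stand-in for mnist.load_data(): random uint8 "images" with the same
# per-image shape and dtype (real MNIST has 60000 training images).
rng = np.random.default_rng(0)
raw_train = rng.integers(0, 256, size=(256, 28, 28), dtype=np.uint8)

# Flatten 28x28 -> 784 and scale pixel values to [0, 1],
# matching the Dense(..., input_shape=[784]) layers above.
X_train = raw_train.reshape(-1, 784).astype("float32") / 255.0
```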

The reconstruction is quite interesting:

[image: reconstructed MNIST digits]

But I would also like to look at the cluster representatives. What comes out if I pass [1, 0, ..., 0] into the decode layer? That should be the "class average" of one class of MNIST.
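Why a one-hot input is interesting: a Dense layer computes y = xW + b, so the input [1, 0, ..., 0] returns the first row of the weight matrix plus the bias, i.e. the decoder's learned "template" for that code unit. A plain-numpy check of that identity (the weights here are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(42)
W = rng.normal(size=(10, 784))   # decoder weights (10 -> 784)
b = rng.normal(size=784)         # decoder bias

one_hot = np.zeros(10)
one_hot[0] = 1.0                 # [1, 0, ..., 0]

# Dense layer forward pass: y = x @ W + b.
y = one_hot @ W + b

# A one-hot input simply selects one row of the weight matrix.
assert np.allclose(y, W[0] + b)
```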

However, decode_model refuses to work. When I call it, it complains:

Exception: Error when checking : expected dense_input_5 to have shape (None, 784) but got array with shape (10, 10)

That seems strange: the decode layer should simply be a matrix mapping 10 inputs to 784 outputs. Its summary looks fine:

____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
dense_14 (Dense)                 (None, 784)           8624        dense_13[0][0]                   
====================================================================================================
Total params: 8624

Note the "Connected to: dense_13" column: dense_13 is the encode layer from model. For comparison, here is the summary of the full model:

____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
dense_13 (Dense)                 (None, 10)            7850        dense_input_6[0][0]              
____________________________________________________________________________________________________
dense_14 (Dense)                 (None, 784)           8624        dense_13[0][0]                   
====================================================================================================
Total params: 16474
____________________

So the first model is apparently interfering: decode_model picks up the input connection that the decode layer already has inside model, instead of accepting a 10-dimensional input of its own.

How can I avoid this in Keras? Perhaps there is a way with the functional API.


Yes, this can be done. What you need is called shared layers, and it is part of the functional API: https://keras.io/getting-started/functional-api-guide/#shared-layers


The idea is that a layer instance is an object: you can call the same object on several different inputs, and every call reuses the same weights.

Applied to your autoencoder, it looks like this:

from keras.layers import Input, Dense
from keras.models import Model

inputs = Input((784,))
encode = Dense(10, input_shape=[784])(inputs)
decode = Dense(784, input_shape=[10])

model=Model(input=inputs, output=decode(encode))


model.compile(loss="mse",
             optimizer="adadelta",
             metrics=["accuracy"])

inputs_2=Input((10,))
decode_model=Model(input=inputs_2, output=decode(inputs_2))

That is all. Because both models share the decode layer, training model also trains decode_model.
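This shared-weights behaviour can be sanity-checked without Keras. In the numpy sketch below (all names hypothetical), both "models" read the same weight array, so an in-place update made while training one is immediately visible in the other:

```python
import numpy as np

rng = np.random.default_rng(1)
W_enc = rng.normal(size=(784, 10)) * 0.1   # encoder weights (only in `model`)
W_dec = rng.normal(size=(10, 784)) * 0.1   # decoder weights (SHARED)

def model(x):
    # full autoencoder: 784 -> 10 -> 784
    return (x @ W_enc) @ W_dec

def decode_model(codes):
    # decoder alone: 10 -> 784, reading the same W_dec array
    return codes @ W_dec

before = decode_model(np.eye(10))

# Stand-in for one training step on `model` that updates the decoder
# weights in place (a real step would use the MSE gradient).
W_dec -= 0.01 * np.sign(W_dec)

after = decode_model(np.eye(10))
assert not np.allclose(before, after)   # decode_model changed as well
```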


Source: https://habr.com/ru/post/1659028/
