Autoencoder does not learn the identity function

I am somewhat new to machine learning in general, and I wanted to do a simple experiment to get to know neural-network autoencoders better: build an extremely simple autoencoder that learns the identity function.

I use Keras to make life easier, and as a first step I hard-coded the identity weights to make sure the setup works:

from keras.models import Sequential
from keras.layers import Dense
import numpy as np

# Weights are given as [weights, biases], so we pass
# the identity matrix for the weights and a vector of zeros for the biases
weights = [np.diag(np.ones(84)), np.zeros(84)]
model = Sequential([Dense(84, input_dim=84, weights=weights)])
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(X, X, epochs=10, batch_size=8, validation_split=0.3)

As expected, the loss is zero, on both the training and the validation data:

Epoch 1/10
97535/97535 [==============================] - 27s - loss: 0.0000e+00 - val_loss: 0.0000e+00
Epoch 2/10
97535/97535 [==============================] - 28s - loss: 0.0000e+00 - val_loss: 0.0000e+00
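That zero loss is exactly what the hard-coded weights guarantee: a linear Dense layer computes `X @ W + b`, and with `W` the identity and `b` zeros the output equals the input. Here is a pure-NumPy sketch of the same computation (no Keras involved), just to make the arithmetic explicit:

```python
import numpy as np

# What the Dense layer computes with the hard-coded weights:
# output = X @ W + b, with W the identity matrix and b all zeros.
X = np.random.rand(100, 84)
W = np.diag(np.ones(84))   # same construction as in the Keras code
b = np.zeros(84)

out = X @ W + b

# Multiplying by the identity reproduces X exactly, so the MSE is 0.0
print(np.mean((out - X) ** 2))   # prints 0.0
```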

Then I tried to do the same thing without pre-setting the weights, expecting the network to learn the identity on its own. It did not: I let it train for up to 200 epochs, played with different optimizers and batch sizes, and added L1 and L2 regularization, but the loss never dropped to zero and stopped improving after a while. One thing that may matter is that all of my data values are very close to 1.1. Could that be the problem?

Isn't the "identity function" supposed to be trivial for a network to learn — a single dense layer with as many units as inputs seems like exactly the right shape for it? Or am I misunderstanding something?

In case it helps, here is how I generate data that reproduces the problem:

X = np.random.normal(1.1090579, 0.0012380764, (139336, 84))

With data generated like this, the loss plateaus very quickly (at around 1e-6) and then never improves, no matter how long I train. So: is there something wrong with my data or with my setup/approach? Any pointers appreciated!
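For comparison, I also computed what the loss would be if the network simply predicted the per-feature mean of the data, i.e. the best constant output. That MSE is just the variance of the data, about 1.5e-6 — suspiciously close to the plateau:

```python
import numpy as np

np.random.seed(0)

# Same distribution as my data generator above
sigma = 0.0012380764
X = np.random.normal(1.1090579, sigma, (10000, 84))

# MSE of the best constant prediction (the per-feature mean):
# this equals the variance of the data, sigma**2 ~= 1.5e-6
mse_of_mean = np.mean((X - X.mean(axis=0)) ** 2)
print(mse_of_mean)
```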

UPDATE

To answer the questions in the comments: my real data consists of samples with 84 features each (hence the 84 in the code), and all of its values lie very close together, which is why the synthetic sample above is generated from such a narrow distribution. I also tried normalizing the data before training, e.g. subtracting the mean and/or dividing by the standard deviation, but the loss behaved exactly the same way. So simply rescaling does not seem to fix it, and I am still stuck.


I ran the same experiment on random data X. After 100 epochs, argmax() over the columns of the weight matrix lands exactly on the diagonal, i.e. the layer is converging to the identity.

For example:

from keras.models import Sequential
from keras.layers import Dense
import numpy as np
import random

X = np.array([[random.random() for r in range(84)] for i in range(1, 100000)])
model = Sequential([Dense(84, input_dim=84)], name="layer1")
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(X, X, epochs=100, batch_size=80, validation_split=0.3)

l_weights = np.round(model.layers[0].get_weights()[0], 3)

print(l_weights.argmax(axis=0))
print(l_weights.max(axis=0))

Output:

Train on 69999 samples, validate on 30000 samples
Epoch 1/100
69999/69999 [==============================] - 1s - loss: 0.2092 - val_loss: 0.1564
Epoch 2/100
69999/69999 [==============================] - 1s - loss: 0.1536 - val_loss: 0.1510
Epoch 3/100
69999/69999 [==============================] - 1s - loss: 0.1484 - val_loss: 0.1459
.
.
.
Epoch 98/100
69999/69999 [==============================] - 1s - loss: 0.0055 - val_loss: 0.0054
Epoch 99/100
69999/69999 [==============================] - 1s - loss: 0.0053 - val_loss: 0.0053
Epoch 100/100
69999/69999 [==============================] - 1s - loss: 0.0051 - val_loss: 0.0051
[ 0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83]
[ 0.85000002  0.85100001  0.79799998  0.80500001  0.82700002  0.81900001
  0.792       0.829       0.81099999  0.80800003  0.84899998  0.829       0.852
  0.79500002  0.84100002  0.81099999  0.792       0.80800003  0.85399997
  0.82999998  0.85100001  0.84500003  0.847       0.79699999  0.81400001
  0.84100002  0.81        0.85100001  0.80599999  0.84500003  0.824
  0.81999999  0.82999998  0.79100001  0.81199998  0.829       0.85600001
  0.84100002  0.792       0.847       0.82499999  0.84500003  0.796
  0.82099998  0.81900001  0.84200001  0.83999997  0.815       0.79500002
  0.85100001  0.83700001  0.85000002  0.79900002  0.84100002  0.79699999
  0.838       0.847       0.84899998  0.83700001  0.80299997  0.85399997
  0.84500003  0.83399999  0.83200002  0.80900002  0.85500002  0.83899999
  0.79900002  0.83399999  0.81        0.79100001  0.81800002  0.82200003
  0.79100001  0.83700001  0.83600003  0.824       0.829       0.82800001
  0.83700001  0.85799998  0.81999999  0.84299999  0.83999997]

And rounding the top-left 5×5 block of the weight matrix shows it is the identity:

array([[ 1.,  0., -0.,  0.,  0.],
       [ 0.,  1.,  0., -0., -0.],
       [-0.,  0.,  1.,  0.,  0.],
       [ 0., -0.,  0.,  1., -0.],
       [ 0., -0.,  0., -0.,  1.]], dtype=float32)
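Nothing here is Keras-specific: the same convergence to the diagonal can be reproduced with a bare NumPy SGD loop on a single linear layer. The sizes below (8 features, 2000 samples) are my own choice, picked only to keep the sketch fast:

```python
import numpy as np

rng = np.random.default_rng(0)

# One linear layer (W, b) trained with plain minibatch SGD on MSE
# to reproduce its own input -- the same setup, minus Keras.
n_features, n_samples, batch, lr = 8, 2000, 80, 0.1
X = rng.random((n_samples, n_features))

W = rng.normal(scale=0.1, size=(n_features, n_features))
b = np.zeros(n_features)

for epoch in range(200):
    for start in range(0, n_samples, batch):
        xb = X[start:start + batch]
        err = xb @ W + b - xb            # prediction minus target
        W -= lr * xb.T @ err / len(xb)   # (scaled) MSE gradient w.r.t. W
        b -= lr * err.mean(axis=0)       # (scaled) MSE gradient w.r.t. b

# The largest entry of every weight column sits on the diagonal:
# the layer has converged to (approximately) the identity matrix.
print(np.round(W, 3).argmax(axis=0))
```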

Source: https://habr.com/ru/post/1666326/

