As is well known, the main problem with deep neural networks (DNNs) is their long training time.
But there are several ways to speed up training:
(1) Batch normalization: x̂ = (x − μ) / √(σ² + ε), that is, each activation minus the mini-batch mean, divided by the mini-batch standard deviation (the square root of the variance, plus a small ε for stability). The original paper reports that batch normalization reaches the same accuracy with 14 times fewer training steps.
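A minimal NumPy sketch of the normalization above for a single feature; the ε term and the learnable scale and shift (gamma, beta) come from the batch normalization paper, while the function and variable names here are my own.

    import numpy as np

    def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
        """Normalize one feature's activations over a mini-batch, then scale and shift.

        x     : 1-D array of activations for a single unit across the mini-batch
        gamma : learnable scale parameter
        beta  : learnable shift parameter
        eps   : small constant for numerical stability
        """
        mu = x.mean()                           # mini-batch mean (the AVG in the formula)
        var = x.var()                           # mini-batch variance
        x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
        return gamma * x_hat + beta             # restore representational power

    # Example: a mini-batch of 8 activations for one unit
    batch = np.array([0.5, 1.2, -0.3, 2.0, 0.1, -1.5, 0.7, 0.9])
    print(batch_norm(batch))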
(2) A non-saturating activation function such as ReLU: f(x) = max(x, 0). Its advantage is twofold: it mitigates the so-called exploding / vanishing gradient problem, and it accelerates convergence.
Other activations can also be used (maxout, the ReLU family, tanh); see the sketch below.
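For illustration, a small NumPy sketch of the activations mentioned above; maxout is shown in its simplest two-piece form, and all names here are my own.

    import numpy as np

    def relu(x):
        """ReLU: max(x, 0); the gradient is 1 for x > 0, so it does not saturate."""
        return np.maximum(x, 0.0)

    def leaky_relu(x, alpha=0.01):
        """A ReLU-family variant that keeps a small slope for negative inputs."""
        return np.where(x > 0, x, alpha * x)

    def maxout(x1, x2):
        """Simplest maxout unit: the maximum over two linear pieces."""
        return np.maximum(x1, x2)

    x = np.linspace(-2, 2, 5)
    print(relu(x))        # [0. 0. 0. 1. 2.]
    print(leaky_relu(x))  # negative inputs scaled by 0.01
    print(np.tanh(x))     # tanh saturates for large |x|, unlike ReLU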
(3) Fast, data-dependent weight initialization. The authors of one such method write: "Our initialization matches the current state-of-the-art unsupervised or self-supervised pre-training methods on standard computer vision tasks, such as image classification and object detection, while being roughly three orders of magnitude faster."
Or LSUV initialization (Layer-sequential unit-variance): https://arxiv.org/abs/1511.06422
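A rough PyTorch sketch of the LSUV idea from the linked paper: orthonormal pre-initialization, then scaling each layer's weights in sequence until the variance of that layer's output on a sample batch is close to 1. The helper name lsuv_init, the tolerance, and the small example model are my own assumptions; the authors' reference implementation differs in detail.

    import torch
    import torch.nn as nn

    def lsuv_init(model, data_batch, tol=0.01, max_iters=10):
        """Layer-sequential unit-variance initialization (sketch).

        For every Linear/Conv layer, in order: start from an orthogonal weight
        matrix, then repeatedly rescale the weights so that the variance of the
        layer's output on `data_batch` approaches 1.
        """
        layers = [m for m in model.modules() if isinstance(m, (nn.Linear, nn.Conv2d))]
        for layer in layers:
            nn.init.orthogonal_(layer.weight)        # orthonormal pre-initialization
            out = None

            def hook(_module, _inp, output):
                nonlocal out
                out = output

            handle = layer.register_forward_hook(hook)
            for _ in range(max_iters):
                with torch.no_grad():
                    model(data_batch)                # forward pass captures this layer's output
                    var = out.var().item()
                    if abs(var - 1.0) < tol:
                        break
                    layer.weight.data /= var ** 0.5  # rescale toward unit output variance
            handle.remove()

    # Usage sketch (hypothetical small MLP and random batch):
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    lsuv_init(model, torch.randn(128, 32))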
But if we already use all of these steps: (1) batch normalization, (2) ReLU, (3) fast weight initialization or LSUV, does it still make sense to use an autoencoder / autoassociator at any stage of training a deep neural network?
[The answer text on the original page is garbled and largely unrecoverable; only fragments survive. It referred to RBM-based pre-training (G. Hinton et al.) and autoencoders (Y. Bengio et al.) as the classic unsupervised pre-training methods being discussed.]
Source: https://habr.com/ru/post/1663339/