How does the Flatten layer work in Keras?

I am using the TensorFlow backend.

I apply convolution, max pooling, flatten and dense layers sequentially. The convolution requires a 3D input (height, width, color_channels_depth).

After the convolution, it becomes (height, width, Number_of_filters).

After applying max pooling, the height and width change. But after applying the Flatten layer, what exactly happens? For example, if the input before flattening is (24, 24, 32), how does it flatten it?

Is it sequential, like (24 * 24) for the height and width of each filter in turn, or in some other way? An example with actual values would be appreciated.

3 answers

The Flatten() operator unrolls the values beginning at the last dimension (at least for Theano, which is "channels first", not "channels last" like TF; I cannot run TensorFlow in my environment). This is equivalent to numpy.reshape with order 'C':

'C' means to read / write the elements using C-like index order, with the last axis index changing fastest, back to the first axis index changing slowest.
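
As a quick illustration of what C order means in practice, here is a minimal NumPy-only sketch (my own addition, just the standard numpy flatten behaviour):

    import numpy as np

    a = np.arange(6).reshape(2, 3)   # [[0 1 2]
                                     #  [3 4 5]]

    # C order: the index of the last axis changes fastest
    print(a.flatten(order='C'))      # [0 1 2 3 4 5]

    # Fortran order, for comparison: the first axis index changes fastest
    print(a.flatten(order='F'))      # [0 3 1 4 2 5]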

Here is a standalone example illustrating the Flatten operator with the Keras Functional API. You should be able to adapt it easily to your environment.

    import numpy as np
    from keras.layers import Input, Flatten
    from keras.models import Model

    inputs = Input(shape=(3,2,4))

    # Define a model consisting only of the Flatten operation
    prediction = Flatten()(inputs)
    model = Model(inputs=inputs, outputs=prediction)

    X = np.arange(0,24).reshape(1,3,2,4)
    print(X)
    #[[[[ 0  1  2  3]
    #   [ 4  5  6  7]]
    #
    #  [[ 8  9 10 11]
    #   [12 13 14 15]]
    #
    #  [[16 17 18 19]
    #   [20 21 22 23]]]]

    model.predict(X)
    #array([[  0.,   1.,   2.,   3.,   4.,   5.,   6.,   7.,   8.,   9.,  10.,
    #         11.,  12.,  13.,  14.,  15.,  16.,  17.,  18.,  19.,  20.,  21.,
    #         22.,  23.]], dtype=float32)
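
For the TensorFlow / channels-last case that the answer above could not test, a minimal sketch with tf.keras (my own check, not part of the original answer) gives the same C-order result:

    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers

    inputs = tf.keras.Input(shape=(3, 2, 4))
    outputs = layers.Flatten()(inputs)
    model = tf.keras.Model(inputs, outputs)

    X = np.arange(24).reshape(1, 3, 2, 4).astype("float32")
    print(model.predict(X)[0])
    # [ 0.  1.  2. ... 23.] -- flattened in C (row-major) order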

It is sequential, as 24 * 24 * 32, and the conversion is implemented as shown in the following code.

    def batch_flatten(x):
        """Turn a nD tensor into a 2D tensor with same 0th dimension.

        In other words, it flattens each data samples of a batch.

        # Arguments
            x: A tensor or variable.

        # Returns
            A tensor.
        """
        x = tf.reshape(x, tf.stack([-1, prod(shape(x)[1:])]))
        return x
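
A minimal sketch of the same reshape in plain TensorFlow (my own illustration, using the (24, 24, 32) shape from the question):

    import numpy as np
    import tensorflow as tf

    # A batch of one sample with shape (24, 24, 32), as in the question
    x = tf.constant(np.arange(24 * 24 * 32).reshape(1, 24, 24, 32), dtype=tf.float32)

    # Same logic as batch_flatten: keep the batch dimension, flatten the rest
    flat = tf.reshape(x, [tf.shape(x)[0], -1])
    print(flat.shape)  # (1, 18432), i.e. 24 * 24 * 32 = 18432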

Flattening a tensor means removing all of its dimensions except one.

The Flatten layer in Keras reshapes the tensor to have a shape equal to the number of elements contained in the tensor (excluding the batch dimension).

This is the same as creating a 1d array of elements.

For example, looking at the VGG16 model may make it easier to understand:

    >>> model.summary()
    Layer (type)                     Output Shape          Param #
    ================================================================
    vgg16 (Model)                    (None, 4, 4, 512)     14714688
    ________________________________________________________________
    flatten_1 (Flatten)              (None, 8192)          0
    ________________________________________________________________
    dense_1 (Dense)                  (None, 256)           2097408
    ________________________________________________________________
    dense_2 (Dense)                  (None, 1)             257
    ================================================================

Notice that the shape of the flatten_1 layer is (None, 8192), where 8192 is actually 4 * 4 * 512.
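
To reproduce that arithmetic with a standalone Flatten layer, here is a minimal sketch (my own illustration with tf.keras; the (4, 4, 512) input shape is taken from the summary above):

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    model = keras.Sequential([
        keras.Input(shape=(4, 4, 512)),
        layers.Flatten(),
    ])

    x = np.zeros((1, 4, 4, 512), dtype="float32")
    print(model(x).shape)  # (1, 8192), since 4 * 4 * 512 = 8192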


P.S. None stands for any dimension (a dynamic dimension), but you can typically read it as 1. More information can be found here.

