conv2d_transpose depends on batch_size when making predictions

I have a neural network, currently implemented in TensorFlow, but I have a problem making predictions after training: the network contains conv2d_transpose ops whose output shapes depend on the batch size. I have a layer that requires output_shape as an argument:

def deconvLayer(input, filter_shape, output_shape, strides):
    W1_1 = weight_variable(filter_shape)

    output = tf.nn.conv2d_transpose(input, W1_1, output_shape, strides, padding="SAME")

    return output

This is actually used in a larger model that I built as follows:

 conv3 = layers.convLayer(conv2['layer_output'], [3, 3, 64, 128], use_pool=False)

 conv4 = layers.deconvLayer(conv3['layer_output'],
                                    filter_shape=[2, 2, 64, 128],
                                    output_shape=[batch_size, 32, 40, 64],
                                    strides=[1, 2, 2, 1])

The problem is that if I make predictions with the trained model, my test data must have the same batch size as the training data, otherwise I get the following error:

tensorflow.python.framework.errors.InvalidArgumentError: Conv2DBackpropInput: input and out_backprop must have the same batch size
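The mismatch happens because the batch size is baked into output_shape when the graph is built. For SAME padding, conv2d_transpose simply multiplies each spatial dimension by its stride; the following sketch (deconv_output_shape is a hypothetical helper for illustration, not part of TensorFlow) shows how the shapes in the model above relate, and that the batch dimension is fixed in advance:

```python
def deconv_output_shape(input_shape, filter_shape, strides):
    """Expected output shape of conv2d_transpose with SAME padding.

    input_shape:  [batch, height, width, in_channels]
    filter_shape: [fh, fw, out_channels, in_channels]
    strides:      [1, sh, sw, 1]
    """
    batch, h, w, _ = input_shape
    # SAME padding: each spatial dimension is multiplied by its stride.
    return [batch, h * strides[1], w * strides[2], filter_shape[2]]

# Shapes from the model above: conv3 produces [batch, 16, 20, 128].
print(deconv_output_shape([10, 16, 20, 128], [2, 2, 64, 128], [1, 2, 2, 1]))
# -> [10, 32, 40, 64]
```

Feeding a batch of a different size at prediction time then clashes with the batch dimension hard-coded in output_shape, which triggers the error above.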

Is there a way to get predictions from the trained model for an input with an arbitrary batch size?


This is a known issue; see the discussion at https://github.com/tensorflow/tensorflow/issues/833. Your model currently calls the layer with a hard-coded batch size:

 conv4 = layers.deconvLayer(conv3['layer_output'],
                                    filter_shape=[2, 2, 64, 128],
                                    output_shape=[batch_size, 32, 40, 64],
                                    strides=[1, 2, 2, 1])

Instead, change deconvLayer so that it reads the batch size from the input tensor at run time and builds output_shape dynamically:

def deconvLayer(input, filter_shape, output_shape, strides):
    W1_1 = weight_variable(filter_shape)

    # tf.shape returns the shape as a tensor evaluated at run time,
    # so batch_size is no longer fixed at graph-construction time.
    dyn_input_shape = tf.shape(input)
    batch_size = dyn_input_shape[0]

    # Note: tf.pack was renamed tf.stack in TensorFlow 1.0.
    output_shape = tf.stack([batch_size, output_shape[1], output_shape[2], output_shape[3]])

    output = tf.nn.conv2d_transpose(input, W1_1, output_shape, strides, padding="SAME")

    return output

This way the output shape is resolved at run time instead of being baked into the graph.

I am not 100% sure of the internals, but it appears that some ops require a fully static shape at graph-construction time, while others, such as conv2d_transpose, can accept a shape tensor evaluated at run time. If you also define the input placeholder with None as the batch_size, the same trained graph accepts any batch size at prediction time.
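The effect of the tf.shape/tf.stack step can be mimicked in plain Python (dynamic_output_shape is a hypothetical name used only for illustration):

```python
def dynamic_output_shape(runtime_batch, static_output_shape):
    # Keep the spatial and channel dimensions from the graph-time
    # shape, but substitute the batch size observed at run time --
    # this is what tf.stack([tf.shape(input)[0], ...]) achieves.
    return [runtime_batch] + list(static_output_shape[1:])

# Trained with batch size 10, now predicting on a single example:
print(dynamic_output_shape(1, [10, 32, 40, 64]))  # -> [1, 32, 40, 64]
```

The spatial and channel dimensions stay fixed, and only the batch dimension follows the input.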


Source: https://habr.com/ru/post/1650504/
