How to use a bidirectional RNN and Conv1D in Keras when the shapes do not match?

I am new to deep learning, so I read Deep Learning with Keras by Antonio Gulli and learned a lot. Now I want to start applying some of the concepts. I want to implement a neural network with a 1-dimensional convolutional layer that feeds into a bidirectional recurrent layer (like the one below). All the training materials and code fragments I have come across either implement nothing remotely similar (for example, image recognition) or use an older version of Keras with different functions and usage.


What I'm trying to do is a variant of the approach in this paper:

(1) convert DNA sequences to one-hot encoded vectors (see the sketch after this list);

(2) run them through a 1-dimensional convolutional neural network;

(3) apply max pooling;

(4) feed the output into a bidirectional RNN;

(5) classify the input.
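
For step (1), the encoding I have in mind looks like this (a minimal sketch; the base ordering and the helper name are my own choices):

import numpy as np

BASE_TO_INDEX = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot_encode(sequence):
    """Turn a DNA string into a (len(sequence), 4) one-hot matrix."""
    encoded = np.zeros((len(sequence), 4), dtype=np.float32)
    for position, base in enumerate(sequence):
        encoded[position, BASE_TO_INDEX[base]] = 1.0
    return encoded

print(one_hot_encode("ACGT"))  # prints a 4x4 identity-like matrix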


I can't figure out how to get the shapes to line up with the bidirectional RNN. I can't even get a regular RNN to work at this stage. How can I restructure the incoming layers so they work with a bidirectional RNN?

The paper's own code is at https://github.com/uci-cbcl/DanQ/blob/master/DanQ_train.py, but it uses an older version of Keras. I also found https://github.com/fchollet/keras/issues/3322, but it refers to a different Keras version (the 2.x API), and the suggestions there did not carry over to the Keras bundled with TensorFlow that I am using.

# Imports
import tensorflow as tf
import numpy as np
from tensorflow.python.keras._impl.keras.layers.core import *
from tensorflow.python.keras._impl.keras.layers import Conv1D, MaxPooling1D, SimpleRNN, Bidirectional, Input
from tensorflow.python.keras._impl.keras.models import Model, Sequential

# Set up TensorFlow backend
K = tf.keras.backend
K.set_session(tf.Session())
np.random.seed(0) # For keras?

# Constants
NUMBER_OF_POSITIONS = 40
NUMBER_OF_CLASSES = 2
NUMBER_OF_SAMPLES_IN_EACH_CLASS = 25

# Generate sequences (data-generation code: https://pastebin.com/GvfLQte2)

# Build model
# ===========
# Input Layer
input_layer = Input(shape=(NUMBER_OF_POSITIONS,4))
# Hidden Layers
y = Conv1D(100, 10, strides=1, activation="relu")(input_layer)
y = MaxPooling1D(pool_size=5, strides=5)(y)
y = Flatten()(y)
y = Bidirectional(SimpleRNN(100, return_sequences=True, activation="tanh"))(y)
y = Flatten()(y)
y = Dense(100, activation='relu')(y)
# Output layer
output_layer = Dense(NUMBER_OF_CLASSES, activation="softmax")(y)

model = Model(input_layer, output_layer)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()


# ~/anaconda/lib/python3.6/site-packages/tensorflow/python/keras/_impl/keras/layers/recurrent.py in build(self, input_shape)
#    1049     input_shape = tensor_shape.TensorShape(input_shape).as_list()
#    1050     batch_size = input_shape[0] if self.stateful else None
# -> 1051     self.input_dim = input_shape[2]
#    1052     self.input_spec[0] = InputSpec(shape=(batch_size, None, self.input_dim))
#    1053 

# IndexError: list index out of range

You don't need to restructure anything at all to get the output of a Conv1D layer into an LSTM.

The problem is simply the presence of the Flatten layer, which destroys the shape.

These are the shapes used by Conv1D and LSTM:

  • Conv1D: (batch, length, channels)
  • LSTM: (batch, timeSteps, features)

Length is the same as timeSteps, and channels is the same as features, so the output of your convolutional block can be fed straight into the recurrent layer.
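
You can check this by inspecting the shapes directly (a minimal sketch using the public tf.keras API; the layer sizes here are arbitrary):

from tensorflow.keras.layers import Input, Conv1D, MaxPooling1D, LSTM
from tensorflow.keras.models import Model

inputs = Input(shape=(40, 4))                   # (batch, length=40, channels=4)
x = Conv1D(100, 10, activation="relu")(inputs)  # -> (batch, 31, 100)
x = MaxPooling1D(pool_size=5)(x)                # -> (batch, 6, 100)
# No Flatten: length acts as timeSteps, channels as features
x = LSTM(50)(x)                                 # -> (batch, 50)
Model(inputs, x).summary()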

Using the Bidirectional wrapper won't change a thing either; it will only double your output features.


From there it depends on what you want to do:

To classify the whole sequence, your last recurrent layer should use return_sequences=False. (Alternatively, you can keep the sequence output and use Flatten + Dense after it.) Applied to your model, it looks like the sketch below.
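
This sketch reuses the constants from your question; the unit counts are illustrative, not prescribed:

from tensorflow.keras.layers import (Input, Conv1D, MaxPooling1D,
                                     SimpleRNN, Bidirectional, Dense)
from tensorflow.keras.models import Model

NUMBER_OF_POSITIONS = 40
NUMBER_OF_CLASSES = 2

input_layer = Input(shape=(NUMBER_OF_POSITIONS, 4))
y = Conv1D(100, 10, strides=1, activation="relu")(input_layer)
y = MaxPooling1D(pool_size=5, strides=5)(y)
# No Flatten here: the 3D output goes straight into the recurrent layer.
# return_sequences=False -> one vector per sample, ready for Dense layers.
y = Bidirectional(SimpleRNN(100, return_sequences=False, activation="tanh"))(y)
y = Dense(100, activation="relu")(y)
output_layer = Dense(NUMBER_OF_CLASSES, activation="softmax")(y)

model = Model(input_layer, output_layer)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()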

To classify each time step, all your recurrent layers should use return_sequences=True, and you should not flatten the data after them.
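
With the same imports and constants as the sketch above, only the tail changes (note that per-step classification also needs per-step labels, unlike the data in your question):

y = Bidirectional(SimpleRNN(100, return_sequences=True, activation="tanh"))(y)
# y is now (batch, timeSteps, 200); Dense acts on the last axis,
# producing one class distribution per time step.
output_layer = Dense(NUMBER_OF_CLASSES, activation="softmax")(y)
model = Model(input_layer, output_layer)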


Source: https://habr.com/ru/post/1649099/

