Tensorflow dynamic RNN (LSTM): how to format input?

I have been given some data in this format and the following information:

person1, day1, feature1, feature2, ..., featureN, label
person1, day2, feature1, feature2, ..., featureN, label
...
person1, dayN, feature1, feature2, ..., featureN, label
person2, day1, feature1, feature2, ..., featureN, label
person2, day2, feature1, feature2, ..., featureN, label
...
person2, dayN, feature1, feature2, ..., featureN, label
...
  • there is always the same number of features, but a feature's value can be 0, representing "nothing"
  • the number of available days differs per person; for example, person1 has 20 days of data while person2 has 50
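The per-person grouping implied by this format can be sketched in plain Python. The `rows` sample below is made up to mirror the layout shown; the field order (person, day, features, label) is taken from the data description:

```python
# Group flat (person, day, features..., label) rows into one sequence
# of feature vectors per person. Rows are assumed pre-parsed into
# tuples; the sample values here are invented for illustration.
from collections import defaultdict

rows = [
    ("person1", 1, [0.1, 0.0, 0.3], 0),
    ("person1", 2, [0.2, 0.5, 0.0], 1),
    ("person2", 1, [0.9, 0.1, 0.4], 0),
]

sequences = defaultdict(list)   # person -> ordered list of feature vectors
labels = defaultdict(list)      # person -> ordered list of labels
for person, day, features, label in sorted(rows, key=lambda r: (r[0], r[1])):
    sequences[person].append(features)
    labels[person].append(label)

print(len(sequences["person1"]))  # 2 days for person1
```

Sorting by (person, day) first keeps each sequence in chronological order even if the rows arrive shuffled.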

The goal is to predict a person's label for the next day, i.e. the label for day N + 1, either per individual or for everyone at once (per individual makes more sense to me). I can freely reformat the data (it is small). Based on the above, after some reading, I thought a dynamic RNN (LSTM) might work best:

  • recurrent neural network: because the next day depends on the previous days
  • LSTM: because the influence of past days accumulates, and it is unclear how far back the days matter
  • dynamic: because the number of available days differs per person

If this reasoning is wrong, please correct me. My question: how should I format the input, and should I use plain tensorflow or tflearn?

I have looked at tflearn, but its examples seem to use "fixed-size" sequences, which does not fit the varying number of days per person.

+6

1 answer

dynamic:

"dynamic" means that you feed in sequences of variable length. The RNN is unrolled as far as the longest sequence in the batch, and shorter sequences are padded (with 0, for example).

As I understand it, you have one timestep per day (day1 ... dayN?), each carrying a feature vector (feature1 ... featureN). First, create an LSTM cell:

cell = tf.contrib.rnn.LSTMCell(size)

Then unroll it with tf.nn.dynamic_rnn. From the documentation:

inputs: The RNN inputs.

If time_major == False (default), this must be a Tensor of shape: [batch_size, max_time, ...]

Here, max_time is the sequence length. Since you are using dynamic_rnn, it does not have to be fixed; leave that dimension as None:

x = tf.placeholder(tf.float32, shape=(batch_size, None, N))

and run the rnn with

outputs, state = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)

outputs then has shape (batch_size, seq_length, output_size). If your sequences have different lengths, pad the shorter ones with 0 and pass the true lengths via the sequence_length argument of dynamic_rnn.
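Building that zero-padded [batch_size, max_time, N] input, together with the per-sequence lengths for the sequence_length argument, might look like this (plain Python, with toy data invented for illustration):

```python
# Pad variable-length sequences of N-dimensional feature vectors with
# zero vectors so they share one max_time, and record the true lengths
# to pass as sequence_length to tf.nn.dynamic_rnn.
N = 3
seqs = [
    [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]],   # person with 2 days of data
    [[7.0, 8.0, 9.0]],                    # person with 1 day of data
]

max_time = max(len(s) for s in seqs)
seq_lengths = [len(s) for s in seqs]
batch = [s + [[0.0] * N] * (max_time - len(s)) for s in seqs]

# batch now has shape (batch_size=2, max_time=2, N=3)
print(seq_lengths)  # [2, 1]
```

With sequence_length supplied, dynamic_rnn stops updating the state past each sequence's true length, so the zero padding does not corrupt the final states.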

Finally, depending on what you need, you either use the per-timestep outputs of the RNN or its final state to make the prediction.
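Because the padded timesteps also produce outputs, per-person prediction usually takes each sequence's output at its true last step, not at max_time. A minimal sketch with invented output values:

```python
# outputs is assumed to have shape (batch_size, max_time, output_size),
# as produced by dynamic_rnn; the values here are made up.
outputs = [
    [[0.1, 0.2], [0.3, 0.4]],   # sequence of true length 2
    [[0.5, 0.6], [0.0, 0.0]],   # true length 1; second step is padding
]
seq_lengths = [2, 1]

# Pick the last *valid* output per sequence, not the last padded one.
last_outputs = [out[length - 1] for out, length in zip(outputs, seq_lengths)]
print(last_outputs)  # [[0.3, 0.4], [0.5, 0.6]]
```

In TensorFlow this indexing is typically done with a gather over the time axis, but the idea is the same: the prediction layer sees one vector per person.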

+13

Source: https://habr.com/ru/post/1016460/

