LSTM for regression (in TensorFlow)

I want to implement an LSTM model in TensorFlow. I think I understood the textbook examples fairly well, but in those the inputs were words embedded in a continuous vector space (which has a number of advantages). Now I want to use an LSTM to predict a series of continuous numbers, and I don't know what the best approach is. Should I discretize my input range, effectively turning this into a multi-class classification problem, and use the embeddings described earlier, or should I stick with continuous numbers and do regression? In the latter case, do I just feed the model a single continuous number at each timestep?
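The two options being weighed can be sketched side by side. This is a minimal, hypothetical sketch using `tf.keras` (TensorFlow 2.x); the layer sizes, bin count, and function names are all illustrative, not from the original post:

```python
import tensorflow as tf

NUM_BINS = 100   # classification option: discretize the value range into bins
SEQ_LEN, FEATS = 20, 1

def build(head):
    """Build an LSTM with either a classification or a regression head."""
    inp = tf.keras.Input(shape=(SEQ_LEN, FEATS))
    h = tf.keras.layers.LSTM(32)(inp)
    if head == "classification":
        # One class per bin; train with (sparse) categorical cross-entropy.
        out = tf.keras.layers.Dense(NUM_BINS, activation="softmax")(h)
    else:
        # A single linear unit; train with mean squared error.
        out = tf.keras.layers.Dense(1)(h)
    return tf.keras.Model(inp, out)

print(build("classification").output_shape)  # (None, 100)
print(build("regression").output_shape)      # (None, 1)
```

Only the output head and the loss differ; the recurrent part of the network is the same in both cases.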

+5
1 answer

You can find two examples here.

https://github.com/MorvanZhou/tutorials/blob/master/tensorflowTUT/tf20_RNN2.2/full_code.py

http://mourafiq.com/2016/05/15/predicting-sequences-using-rnn-in-tensorflow.html

You can just use regression. However, if your input sequences can be arbitrarily long, you need to pad (or bucket) them to a fixed length.
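The regression-with-padding approach can be sketched as follows. This is a hypothetical minimal example in `tf.keras` (TensorFlow 2.x), not code from the linked tutorials; the toy data, layer sizes, and the choice of the sequence mean as the target are illustrative:

```python
import numpy as np
import tensorflow as tf

# Toy data: sequences of different lengths, each labeled with a
# continuous target (here, the mean of the sequence).
seqs = [np.random.rand(n, 1).astype("float32") for n in (5, 8, 3, 10)]
targets = np.array([s.mean() for s in seqs], dtype="float32")

# Pad all sequences to the same length so they fit in one tensor.
max_len = max(len(s) for s in seqs)
padded = tf.keras.preprocessing.sequence.pad_sequences(
    seqs, maxlen=max_len, dtype="float32", padding="post")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(max_len, 1)),
    # Masking makes the LSTM skip padded timesteps (all-zero rows);
    # note this assumes 0.0 never occurs as a real input value.
    tf.keras.layers.Masking(mask_value=0.0),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1),  # single continuous output -> regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(padded, targets, epochs=2, verbose=0)

preds = model.predict(padded, verbose=0)
print(preds.shape)  # (4, 1): one continuous prediction per sequence
```

An alternative to padding everything to the global maximum is bucketing: group sequences of similar length into batches and pad only within each batch.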

0

Source: https://habr.com/ru/post/1261257/

