Tensorflow Serving - Stateful LSTM

Is there a canonical way to keep state LSTM stateful, etc. when servicing Tensorflow?

Using the Tensorflow API directly is straightforward - but I'm not sure how best to perform a constant LSTM state between calls after exporting the model for service.

Are there any examples that accomplish the above? The patterns within the repo are very simple.

+6
source share
1 answer

From Martin Wick on the TF mailing list:

" . , , . , , , . , , ( - , ), , TensorFlow ( ), , / ."

+3

Source: https://habr.com/ru/post/1016904/


All Articles