I am building a Python 3 API with gunicorn that uses Keras to compute vectors for an image; it is pretty simple.
It seems to be accumulating data in memory across requests: the time needed to respond slowly grows, and the memory usage of each worker process also grows over time. I ran a profiler, and the slowdown is in this line in TensorFlow:
This takes longer as more data accumulates in the graph. Here is the code I am executing:
import tensorflow as tf
from keras.backend.tensorflow_backend import set_session

# We have 11439MiB of GPU memory, let's only use 2GB of it:
config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.22
sess = tf.Session(config=config)
set_session(sess)
sess.graph.as_default()
I thought as_default() would help, but it does not. I also tried closing the session after computing the list of vectors, but that fails.
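One common cause of this symptom is rebuilding the model (and thus adding ops to the TensorFlow graph) on every request instead of once per worker process. A minimal, library-agnostic sketch of the build-once pattern follows; `get_model` and `loader` are hypothetical names, not part of the code above, and the same idea applies to loading a Keras model at module import time so each gunicorn worker reuses one graph:

```python
# Cache the expensive object (e.g. a Keras model) once per worker process,
# so per-request handlers reuse it instead of rebuilding the graph each time.
_model_cache = {}

def get_model(loader):
    """Return the cached model, building it with loader() only on first use."""
    if "model" not in _model_cache:
        _model_cache["model"] = loader()
    return _model_cache["model"]
```

With this pattern, every request handler calls `get_model(...)` and gets the same object back, so the graph stops growing between requests.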
Geesu