Limiting TensorFlow RAM usage

I have seen ways to limit TensorFlow's GPU memory usage (through a fraction coefficient). We run TF on mobile devices, where RAM is extremely limited. Is there any way to influence TF's memory allocation? I need to make sure that TF never allocates more than roughly 400 MB at any time, otherwise the application will crash.
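For reference, this is the kind of GPU-side limit I mean (a minimal TF 1.x sketch; as far as I understand, these options only cap device memory, not the host RAM that matters on mobile):

```python
import tensorflow as tf

# The GPU-side knobs I am aware of (TF 1.x). They cap GPU memory only;
# they do not bound host RAM used by the allocator on a mobile build.
config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.4  # use at most ~40% of GPU memory
config.gpu_options.allow_growth = True                    # grow on demand instead of pre-allocating

sess = tf.Session(config=config)
```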

Memory usage spikes during a run and drops back to a reasonable level afterwards (594 MB at the peak, 355 MB when idle).
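To figure out which ops are responsible for the peak, I have been looking at per-step allocator stats, roughly like this (a sketch; `sess`, `fetches` and `feed_dict` stand in for my actual session and graph):

```python
import tensorflow as tf

# Sketch: collect per-node memory statistics for a single step to see
# which ops drive the peak. `sess`, `fetches` and `feed_dict` are
# placeholders for the real session, outputs and inputs.
run_options = tf.RunOptions(trace_level=tf.RunOptions.FULL_TRACE)
run_metadata = tf.RunMetadata()

sess.run(fetches, feed_dict=feed_dict,
         options=run_options, run_metadata=run_metadata)

for dev_stats in run_metadata.step_stats.dev_stats:
    for node_stats in dev_stats.node_stats:
        for mem in node_stats.memory:
            if mem.peak_bytes:
                print(dev_stats.device, node_stats.node_name, mem.peak_bytes)
```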

Also, are there any ways to optimize the graph itself for lower memory consumption?
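For example, the kind of thing I mean is shrinking the frozen graph with the graph transform tool (a sketch; I am not sure this wrapper was already available in TF 1.0, the input/output node names are placeholders, and I do not know whether this helps with the runtime peaks at all):

```python
from tensorflow.core.framework import graph_pb2
from tensorflow.tools.graph_transforms import TransformGraph

# Sketch of a graph-level optimization pass. 'input' and 'output' are
# placeholder node names; quantize_weights mainly shrinks constant weights,
# which may or may not reduce the runtime allocation peaks.
graph_def = graph_pb2.GraphDef()
with open('frozen_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

transformed = TransformGraph(
    graph_def,
    inputs=['input'],
    outputs=['output'],
    transforms=[
        'strip_unused_nodes',
        'fold_constants(ignore_errors=true)',
        'quantize_weights',
    ])

with open('frozen_graph_opt.pb', 'wb') as f:
    f.write(transformed.SerializeToString())
```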

PS: It seems that memory allocation has increased dramatically with TF 1.0; I don't remember it being this high with the 0.9 beta versions, especially since the graph itself takes only 45 MB on disk.


Source: https://habr.com/ru/post/1672364/
