Freeing memory after errors in an IPython Notebook

I sometimes work with very large datasets in IPython Notebooks. A single pandas DataFrame can take up 1+ GB of memory, so I cannot afford to keep many copies around.

I have found that if I try to perform an operation on such a DataFrame and an error occurs, I do not get the memory back: some intermediate variable is apparently still referenced somewhere. The problem is that I don't know where it is, so I can't free it!

For example, the image below shows memory consumption after repeated attempts to execute a failing cell (each step in the graph corresponds to one attempt). Each attempt consumes a new block of memory that is never released.

Memory usage when executing a cell that raises an error

Does anyone know where this memory goes and how to free it? Alternatively, if this is a bug (a memory leak or similar), how should I report it? I don't want to file it as a bug if it is actually a side effect of how the code was designed (for example, IPython caches things and I am simply abusing the caching system).
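One possible place this memory goes (a sketch of a workaround, not confirmed as the cause in this question): IPython keeps the last exception's state in `sys.last_type` / `sys.last_value` / `sys.last_traceback` so that `%debug` can inspect it, and a traceback's frames can hold references to large intermediate objects created before the error. Clearing that cached state and forcing a garbage-collection pass can release the memory:

```python
import gc
import sys


def free_after_error():
    """Drop the interpreter's cached exception state and force a GC pass.

    sys.last_type / sys.last_value / sys.last_traceback are kept by IPython
    so that %debug can inspect the most recent error; their frames can pin
    large intermediate objects (e.g. a partially built DataFrame) in memory.
    """
    for attr in ("last_type", "last_value", "last_traceback"):
        if hasattr(sys, attr):
            delattr(sys, attr)
    # Returns the number of unreachable objects the collector found.
    return gc.collect()


free_after_error()
```

In a notebook it may also help to clear the output cache (`%reset out`), since the `Out` dictionary keeps a reference to every cell's result, or to drop a single large name with `%xdel varname`.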

Thanks!

1 answer

According to a GitHub discussion on issue 642, there is a known memory leak in jsonschema 2.4. After upgrading to jsonschema 2.5.1, I no longer had this problem.

So if you are running an outdated stack and see this problem, you should update at least jsonschema.
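To check whether you are on the leaky release, a minimal version check (only the package name `jsonschema` and the 2.4 / 2.5.1 version numbers come from the answer above; everything else is a generic sketch):

```python
# Report the installed jsonschema version, if any, so you can tell whether
# you are on the leaky 2.4 release or a fixed 2.5.1+ release.
from importlib.metadata import PackageNotFoundError, version


def jsonschema_version():
    """Return the installed jsonschema version string, or None if absent."""
    try:
        return version("jsonschema")
    except PackageNotFoundError:
        return None


print(jsonschema_version() or "jsonschema is not installed")
```

If the reported version is below 2.5.1, `pip install --upgrade jsonschema` pulls in a fixed release.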


Source: https://habr.com/ru/post/1243352/

