My problem:
I am writing a simple Python tool to help me visualize my data as a function of many parameters. Each change of parameters takes a non-trivial amount of time to compute, so I would like to cache each result (the image and its supporting data) in a dictionary. But I worry that this dictionary may grow too large over time. Most of my data is stored as NumPy arrays.
My question is:
How can I calculate the total number of bytes used by a Python dictionary? The dictionary itself can contain lists and other dictionaries, each of which holds data stored in NumPy arrays.
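Here is a minimal sketch of the kind of recursive traversal I have in mind, assuming `sys.getsizeof` for container overhead and `ndarray.nbytes` for the array buffers. I am not sure it is correct in all cases (for example, array views that share one buffer would be counted twice):

```python
import sys
import numpy as np

def total_size(obj, seen=None):
    """Recursively estimate the memory footprint of a nested structure.

    Handles dicts, lists, tuples, sets, and NumPy arrays; anything else
    is measured with sys.getsizeof alone, which is only a lower bound.
    """
    if seen is None:
        seen = set()      # guard against shared or cyclic references
    if id(obj) in seen:
        return 0          # already counted this exact object
    seen.add(id(obj))

    if isinstance(obj, np.ndarray):
        # nbytes is the size of the array's data buffer; this ignores
        # the small ndarray header and may double-count shared views
        return obj.nbytes

    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set)):
        size += sum(total_size(item, seen) for item in obj)
    return size

cache = {"run1": {"image": np.zeros((512, 512)), "params": [1.0, 2.0]}}
print(total_size(cache))
```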
Ideas?