I need to create very large matrices (Markov chains) for scientific purposes. I run a calculation whose results I append to a list of 20301 elements (= one row of my matrix). I need all of that data in memory to proceed to the next Markov step, but I can store it elsewhere (a file, for example) if necessary, even if that slows down my Markov chain walk. My computer (science lab): dual Xeon, 6 cores / 12 threads each, 12 GB RAM, OS: Win64.
Traceback (most recent call last):
  File "my_file.py", line 247, in <module>
    ListTemp.append(calculus)
MemoryError
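A minimal, runnable sketch of the loop this traceback points at (`some_probability` is a hypothetical stand-in for the real per-state calculation, which is not shown in the post):

```python
def some_probability(state):
    # Hypothetical placeholder: the real computation yields tiny
    # probabilities on the order of 9.23e-102.
    return 1.0 / (state + 1) ** 2

ListTemp = []
for state in range(20301):                    # one row of the matrix
    ListTemp.append(some_probability(state))  # the MemoryError struck here at index 19767
```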
Example of a calculated value: 9.233747520008198e-102 (yes, this is more than 1/9000).
The last element saved before the error, element 19766:

>>> ListTemp[19766]
1.4509421012263216e-103
If I go one step further:

>>> ListTemp[19767]
Traceback (most recent call last):
  File "<pyshell#21>", line 1, in <module>
    ListTemp[19767]
IndexError: list index out of range
So the list hit a MemoryError on iteration 19767 of the loop.
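A rough illustration (not from the original post) of why a plain Python list of floats is memory-hungry: the list stores a pointer per element and each float is a separate boxed object, while a NumPy array stores raw 8-byte doubles in one contiguous buffer.

```python
import sys
import numpy as np

n = 20301
row_list = [0.0] * n                     # Python list: header + n pointers
row_arr = np.zeros(n, dtype=np.float64)  # contiguous buffer of n doubles

print(sys.getsizeof(row_list))  # the pointer array alone, excluding the float objects
print(row_arr.nbytes)           # 20301 * 8 = 162408 bytes
```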
Questions:
Is there a memory limit for a list? Is it a per-list limit or a global per-script limit?
How can I work around these limits? Any techniques to suggest?
Would NumPy or 64-bit Python help? What are their memory limits? What about other languages?
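The file-based workaround mentioned above (trading speed for memory) can be sketched as follows: compute one row at a time, append it to a binary file, and read rows back lazily with `np.memmap`. The names `compute_row` and `markov_rows.bin` are hypothetical, and only 3 rows are written for the demo.

```python
import numpy as np

N = 20301  # row length from the post

def compute_row(i, n=N):
    # Hypothetical placeholder for the real per-row calculation.
    return np.full(n, 1.0 / (i + 1), dtype=np.float64)

# Append each row to disk instead of keeping every row in RAM.
with open("markov_rows.bin", "wb") as f:
    for i in range(3):              # 3 rows for the demo; the real matrix has N rows
        compute_row(i).tofile(f)

# Rows can then be accessed one at a time without loading the whole file.
loaded = np.memmap("markov_rows.bin", dtype=np.float64, mode="r", shape=(3, N))
```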
python list memory limits
Taupi, Apr 04 · 2018-11-11T00:00Z