Memory errors and list limits?

I need to create very large matrices (Markov chains) for scientific purposes. I perform a calculation whose results I append to a list of 20301 elements (= one row of my matrix). I need all this data in memory to continue to the next Markov step, but I can save it somewhere else (for example, a file) if necessary, even if that slows down my Markov chain walk. My computer (science lab): dual Xeon, 6 cores / 12 threads each, 12 GB of RAM, OS: Win64.

 Traceback (most recent call last):
   File "my_file.py", line 247, in <module>
     ListTemp.append(calculus)
 MemoryError

Example of calculation results: 9.233747520008198e-102 (yes, this is more than 1/9000)

The last element successfully saved is index 19766:

 ListTemp[19766]
 1.4509421012263216e-103

If I go one index further:

 Traceback (most recent call last):
   File "<pyshell#21>", line 1, in <module>
     ListTemp[19767]
 IndexError: list index out of range

So the list hit a MemoryError on iteration 19767 of the loop.

Questions:

  • Is there a memory limit for a list? Is it a per-list limit or a global per-script limit?

  • How do I work around these limits? Any techniques in mind?

  • Would it help to use numpy or 64-bit Python? What are the memory limits with them? What about other languages?

+63
python list memory limits
Apr 04 '11
3 answers

First, see the related questions "How big can a Python array get?" and "Numpy, problem with long arrays".

Secondly, the only real limit comes from the amount of memory you have and how your system stores memory references. There is no per-list limit, so Python will go until it runs out of memory. Two possibilities:

  • If you are running on an older OS, or one that forces processes to use a limited amount of memory, you may need to increase the amount of memory the Python process has access to.
  • Break the list up using chunking. For example, compute the first 1000 elements of the list, pickle and save them to disk, then move on to the next 1000. To work with them, load one chunk at a time so that you don't run out of memory. This is essentially the same technique databases use to work with more data than fits in RAM.
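A minimal sketch of that chunking approach (the chunk size, file names, and the `compute` placeholder are illustrative choices, not from the answer):

```python
import pickle

CHUNK_SIZE = 1000  # arbitrary chunk size


def compute(i):
    # Placeholder for the real per-element Markov calculation.
    return 1.0 / (i + 1)


def save_chunk(chunk, index):
    # Write one chunk of results to its own pickle file on disk.
    with open(f"chunk_{index}.pkl", "wb") as f:
        pickle.dump(chunk, f)


def load_chunk(index):
    # Read one chunk back; only this chunk occupies RAM.
    with open(f"chunk_{index}.pkl", "rb") as f:
        return pickle.load(f)


n_elements = 20301  # one row of the matrix, as in the question
chunk, chunk_index = [], 0
for i in range(n_elements):
    chunk.append(compute(i))
    if len(chunk) == CHUNK_SIZE:
        save_chunk(chunk, chunk_index)
        chunk, chunk_index = [], chunk_index + 1
if chunk:  # save the final partial chunk (301 elements here)
    save_chunk(chunk, chunk_index)

# Later: process one chunk at a time instead of the whole row.
first = load_chunk(0)
```

At any moment only one chunk of at most 1000 elements is in memory, at the cost of extra disk I/O between steps.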
+52
Apr 04 '11 at 11:23

The MemoryError exception you are seeing is the direct result of running out of available RAM. This could be caused either by the 2 GB per-program limit imposed by Windows (32-bit programs), or by a lack of available RAM on your computer. (The link is to a previous question.)

You should be able to get past the 2 GB limit with a 64-bit copy of Python, provided you are using a 64-bit copy of Windows.

The IndexError is caused by Python hitting the MemoryError exception before it had computed the entire array. Again, this is a memory issue.

To get around this problem, you could try using a 64-bit copy of Python, or better yet, find a way to write your results to a file. To that end, look at numpy's memory-mapped arrays.

You should be able to run your entire set of calculations in one of these arrays, as the actual data is written to disk and only a small part of it is held in memory.
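A short sketch of how numpy's memory-mapped arrays could be used here (the file name and the placeholder calculation are illustrative, not from the answer):

```python
import numpy as np

n = 20301  # one row of the matrix, as in the question

# Create a memory-mapped array backed by a file on disk;
# only the pages currently being touched are held in RAM.
row = np.memmap("row.dat", dtype=np.float64, mode="w+", shape=(n,))

# Fill it as if it were an ordinary array (placeholder calculation).
for i in range(n):
    row[i] = 1.0 / (i + 1)

row.flush()  # make sure everything is written out to disk

# Re-open the same file read-only later, without loading it all.
row2 = np.memmap("row.dat", dtype=np.float64, mode="r", shape=(n,))
```

The array behaves like a normal numpy array in subsequent calculations, while the OS pages data in and out of the backing file as needed.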

+26
Apr 04 '11

The memory limit is not imposed by Python itself. However, you will get a MemoryError if you run out of RAM. You say you have 20301 items in a list. That seems too few to cause a memory error for simple data types (e.g. int), but if each element is itself an object that takes up a lot of memory, you may well run out of memory.

The IndexError is probably caused by the fact that your ListTemp only has 19767 elements (indexed 0 through 19766), and you are trying to access an element past the last one.

It is hard to say what you can do to avoid hitting the limit without knowing exactly what you are trying to do. Using numpy might help. It looks like you are storing a huge amount of data; you may not need to store all of it at every step. But it is impossible to say without knowing more.
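To illustrate why numpy might help, here is a rough comparison of the memory taken by a plain Python list of floats versus a numpy array of the same size (the exact per-object byte counts depend on the CPython build, so the list figure is approximate):

```python
import sys
import numpy as np

n = 20301

# A plain Python list of floats: each element is a full float object
# (about 24 bytes on 64-bit CPython) plus an 8-byte pointer in the list.
as_list = [float(i) for i in range(n)]
list_bytes = sys.getsizeof(as_list) + sum(sys.getsizeof(x) for x in as_list)

# A numpy float64 array stores the raw 8-byte values contiguously.
as_array = np.arange(n, dtype=np.float64)
array_bytes = as_array.nbytes

print(list_bytes, array_bytes)  # the array is several times smaller
```

For 20301 floats the difference is modest either way, which supports the point above: with simple numeric types, this list alone should not exhaust 12 GB of RAM.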

+8
Apr 04 '11 at 11:15


