On my 64-bit Mac OS X 10.8.5 laptop, the list object produced by range(11464882) requires:
>>> import sys
>>> sys.getsizeof(range(11464882))
91719128
>>> sys.getsizeof(11464881)
24

so 91719128 + 24 * 11464882 = 366876296 bytes, about 350 megabytes of memory.
Here sys.getsizeof() returns the memory footprint of the Python object itself, not counting the values it contains. So for a list, it is just the memory the list structure requires: bookkeeping information plus 11 million 64-bit pointers.
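To see this shallow accounting in action, here is a small illustrative snippet (exact byte counts vary by Python version and platform, but the comparison holds):

```python
import sys

# A list of 100 large strings: getsizeof() counts only the list header
# and the 100 pointers, never the strings themselves.
big_strings = ["x" * 10000 for _ in range(100)]

shallow = sys.getsizeof(big_strings)
deep = shallow + sum(sys.getsizeof(s) for s in big_strings)

print(shallow)  # on the order of a kilobyte: header + 100 pointers
print(deep)     # over a megabyte: the strings dominate
```

The list's own footprint stays tiny no matter how big its elements are; the real cost lives in the objects the pointers lead to.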
In addition, that many empty dictionaries take:
>>> sys.getsizeof({})
280
>>> 91719128 + 280 * 11464882
3301886088
>>> (91719128 + 280 * 11464882) / (1024.0 ** 2)
3148.923957824707
So about 3 gigabytes of memory: 11 million times 280 bytes is a lot of space.
Together with other overheads (most likely the garbage collector's cycle-detection bookkeeping, the Python process itself, and the values you actually store), this means you are pushing past the 4 GB of RAM in your machine.
If you use a 32-bit binary, the sizes will be smaller, since you only need room for 32-bit pointers, but you also only get 2 GB of address space to fit all your objects into.
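If you want to estimate the real footprint of a structure like this, the standard library has no built-in for it. The deep_getsizeof below is a hypothetical helper (not a stdlib function) that walks the object graph via gc.get_referents; it is a rough sketch rather than exact accounting, since shared objects are counted once and interpreter-internal caches are missed:

```python
import gc
import sys

def deep_getsizeof(obj):
    """Rough total footprint of obj plus everything reachable from it.

    Hypothetical helper for spot checks: follows gc-tracked referents,
    using a `seen` set of ids to avoid double-counting shared objects.
    """
    seen = set()
    total = 0
    stack = [obj]
    while stack:
        o = stack.pop()
        if id(o) in seen:
            continue
        seen.add(id(o))
        total += sys.getsizeof(o)
        # Objects this one refers to (list items, dict keys/values, ...).
        stack.extend(gc.get_referents(o))
    return total

data = [{} for _ in range(10)]
print(sys.getsizeof(data))   # just the list
print(deep_getsizeof(data))  # list plus the ten dictionaries
```

Note that for the 11-million-element case above, the walk itself needs a set of 11 million ids, so this is only practical for sanity-checking small samples.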