I load large h5 files into memory as numpy ndarrays. I have read that my system (Win 7 Professional, 6 GB RAM) should allow python.exe to use about 2 GB of physical memory.
However, I get a MemoryError already well short of 1 GB. Even stranger, this lower limit seems to apply only to numpy arrays, but not to lists.
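For context, the loading itself is nothing special; it is essentially the usual h5py pattern, something along these lines (h5py is assumed here, and the file name and dataset key are placeholders, not my actual data):

    import h5py
    import numpy as np

    # open the HDF5 file and pull one dataset fully into memory as an ndarray
    with h5py.File('data.h5', 'r') as f:
        arr = np.array(f['some_dataset'])      # copies the dataset into RAM
    print arr.shape, arr.dtype, arr.nbytes / 2.**20   # size in MB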
I tested my memory consumption with the following function:
    import psutil
    import gc
    import os
    import numpy as np
    from matplotlib.pyplot import pause

    def memory_usage_psutil():
        # return the resident memory of this process in MB
        process = psutil.Process(os.getpid())
        return process.memory_info()[0] / float(2 ** 20)
Test 1: Checking the memory limits for a regular list
    print 'Memory - %d MB' %memory_usage_psutil()
Test 1 prints:
    Memory - 39 MB
    Memory - 1947 MB
    Memory - 1516 MB
    run garbage collector: collected 0 objects.
    Memory - 49 MB
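Test 1 is roughly of the following shape (the list construction and its size are illustrative guesses, not the exact statements I ran): build one big plain Python list, print the memory, delete it, then force a gc.collect() and print again.

    print 'Memory - %d MB' %memory_usage_psutil()
    # one big plain Python list of floats (the size is only an illustrative guess)
    a = [float(i) for i in xrange(9 * 10**7)]
    print 'Memory - %d MB' %memory_usage_psutil()
    del a
    print 'Memory - %d MB' %memory_usage_psutil()
    print 'run garbage collector: collected %d objects.' %gc.collect()
    print 'Memory - %d MB' %memory_usage_psutil()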
Test 2: Creating a lot of np.array
    shape = (5500,5500)
    names = ['b', 'c', 'd', 'g', 'h']
    try:
        for n in names:
            globals()[n] = np.ones(shape, dtype='float64')
            print 'created variable %s with %0.2f MB'\
                %(n,(globals()[n].nbytes/2.**20))
    except MemoryError:
        print 'MemoryError, Memory - %d MB. Deleting files..'\
            %memory_usage_psutil()
        pause(2)
Test 2 prints:
    Memory - 39 MB
    created variable b with 230.79 MB
    created variable c with 230.79 MB
    created variable d with 230.79 MB
    created variable g with 230.79 MB
    MemoryError, Memory - 964 MB. Deleting files..
    Memory - 39 MB
    run garbage collector: collected 0 objects.
    Memory - 39 MB
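Just to check the arithmetic on those numbers: one (5500, 5500) float64 array takes 5500*5500*8 bytes, i.e. about 230.79 MiB, so the four arrays that were created account for roughly 923 MiB, and the error is raised while allocating the fifth, although even five would stay well below 2 GB:

    # size of one (5500, 5500) float64 array, in MiB
    one = 5500 * 5500 * 8 / 2.**20           # -> ~230.79
    print 'one array  : %.2f MB' % one
    print 'four arrays: %.2f MB' % (4 * one)  # -> ~923, close to the reported 964 MB
    print 'five arrays: %.2f MB' % (5 * one)  # -> ~1154, still well below 2 GB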
My question is: why do I get a MemoryError long before I get close to the 2 GB limit, and why is there a difference between the limits for a list and a np.array, or what am I missing? I am using Python 2.7 and numpy 1.7.1.
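In case it matters, this is the generic check I would use to confirm whether python.exe is a 32-bit build (and hence limited in address space) and what psutil reports for physical memory; nothing below is specific to my machine:

    import sys
    import platform
    import psutil

    # a 32-bit build reports sys.maxsize == 2**31 - 1
    print 'interpreter:', platform.architecture()[0], '(maxsize = %d)' % sys.maxsize
    # physical memory seen by the OS vs. what is currently free
    vm = psutil.virtual_memory()
    print 'total RAM  : %d MB' % (vm.total / 2**20)
    print 'available  : %d MB' % (vm.available / 2**20)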