I had the same problems with guppy/heapy. Guppy is somewhat out of date now; there is a patch floating around, but I can no longer find it, and in any case it did not work for me on Python 2.7. You may also run into problems depending on your OS architecture (64-bit?).
There are other questions about memory profiling methods:
I personally believe that the most valuable alternatives are:
It is also useful to calculate the size of your objects and track it yourself. I adapted some code, originally by Noctis Skytower (https://stackoverflow.com/users/216356/noctis-skytower), that I found in another Stack Overflow question ("Approximately how much memory would a list of 80,000 items consume in Python?") for compatibility with Python 2.7 (it should also work in 3):
import sys

# Sum the shallow sizes of an object and everything reachable from it.
totalSizeOf = lambda obj: sum(map(sys.getsizeof, explore(obj, set())))

def explore(obj, memo):
    loc = id(obj)
    if loc not in memo:
        memo.add(loc)
        yield obj
        # Handle instances with slots.
        try:
            slots = obj.__slots__
        except AttributeError:
            pass
        else:
            for name in slots:
                try:
                    attr = getattr(obj, name)
                except AttributeError:
                    pass
                else:
                    # yield from explore(attr, memo)  # Python 3 only
                    for bar in explore(attr, memo):
                        yield bar
        # Handle instances with dict.
        try:
            attrs = obj.__dict__
        except AttributeError:
            pass
        else:
            # yield from explore(attrs, memo)  # Python 3 only
            for bar in explore(attrs, memo):
                yield bar
        # Handle dicts or iterables.
        for name in 'keys', 'values', '__iter__':
            try:
                attr = getattr(obj, name)
            except AttributeError:
                pass
            else:
                for item in attr():
                    # yield from explore(item, memo)  # Python 3 only
                    for bar in explore(item, memo):
                        yield bar
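As a quick sanity check, you can compare the recursive total against the shallow size that sys.getsizeof reports on its own. The snippet below is just an illustration (the data variable is my own example, and exact byte counts vary by platform and Python version):

import sys

data = {'numbers': list(range(1000)), 'name': 'example'}

# Recursive size: the dict plus its keys, values, and every list element.
print(totalSizeOf(data))

# Shallow size: the dict object itself, excluding the objects it refers to.
print(sys.getsizeof(data))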