MemoryError when pickling data in Python

I am trying to dump a dictionary to a pickle file using `pickle.dump`. The dictionary is about 150 MB, but a MemoryError is raised after only about 115 MB of the file has been written. The exception is:

    Traceback (most recent call last):
      File "C:\Python27\generate_traffic_pattern.py", line 32, in <module>
        b.dump_data(way_id_data, 'way_id_data.pickle')
      File "C:\Python27\class_dump_load_data.py", line 8, in dump_data
        pickle.dump(data, saved_file)
      File "C:\Python27\lib\pickle.py", line 1370, in dump
        Pickler(file, protocol).dump(obj)
      File "C:\Python27\lib\pickle.py", line 224, in dump
        self.save(obj)
      File "C:\Python27\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Python27\lib\pickle.py", line 649, in save_dict
        self._batch_setitems(obj.iteritems())
      File "C:\Python27\lib\pickle.py", line 663, in _batch_setitems
        save(v)
      File "C:\Python27\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Python27\lib\pickle.py", line 600, in save_list
        self._batch_appends(iter(obj))
      File "C:\Python27\lib\pickle.py", line 615, in _batch_appends
        save(x)
      File "C:\Python27\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Python27\lib\pickle.py", line 599, in save_list
        self.memoize(obj)
      File "C:\Python27\lib\pickle.py", line 247, in memoize
        self.memo[id(obj)] = memo_len, obj
    MemoryError

I am confused, because this same code worked fine before.

+6
2 answers

Are you dumping just a single object, and that's it?

If you call `dump` many times, calling `Pickler.clear_memo()` between the dumps will clear the internally stored references (which otherwise accumulate and leak memory), and your code should work fine.
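The suggestion above can be sketched as follows. This is a minimal, hypothetical example (the filename and data are made up): one `Pickler` is reused for several dumps, and `clear_memo()` is called between them so the memo table does not keep a reference to every object ever pickled.

```python
import pickle

# Reuse one Pickler for several dumps. Without clear_memo(), the memo
# table holds a reference to every pickled object, so memory grows
# with each dump; clearing it between dumps releases those references.
with open("chunks.pickle", "wb") as f:
    pickler = pickle.Pickler(f)
    for chunk in ({"a": 1}, {"b": 2}, {"c": 3}):
        pickler.dump(chunk)
        pickler.clear_memo()  # drop memo references between dumps
```

Each `dump` call appends one complete pickle to the file, so the objects can later be read back one at a time with repeated `load` calls on the same stream.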

+1

Have you tried this?

    import cPickle as pickle

    p = pickle.Pickler(open("temp.p", "wb"))
    p.fast = True   # disable the memo table, so no back-references are kept
    p.dump(d)       # d is your dictionary
+1
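For reference, a sketch of the same idea on Python 3, where `cPickle` no longer exists as a separate module (the dictionary here is a made-up stand-in). Note that `fast` mode disables the memo entirely, so it raises an error on self-referential data structures:

```python
import pickle

# Stand-in for the large dictionary from the question.
data = {"way_%d" % i: list(range(10)) for i in range(100)}

with open("temp.p", "wb") as f:
    p = pickle.Pickler(f, protocol=pickle.HIGHEST_PROTOCOL)
    p.fast = True  # deprecated, but still works: skips the memo to save memory
    p.dump(data)
```

The trade-off is that shared objects are duplicated in the output instead of being written once and referenced, so the file can grow larger even as memory use shrinks.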

Source: https://habr.com/ru/post/944383/
