Good persistent synchronized queue in Python

I don't immediately care about FIFO vs. LIFO ordering, but it might be nice to have in the future.

What I'm looking for is a quick, easy way to store data on disk (no more than a gigabyte, or tens of millions of records) that several processes can push to and pop from. Entries are just 40-byte strings, not Python objects, so not all of shelve's functionality is needed.

I saw this: http://code.activestate.com/lists/python-list/310105/ . It looks simple, but it needs updating to the new Queue class.

Is there anything better? I'm worried that in the event of a power failure the whole pickled file would be corrupted, not just one record.
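
For reference, here is a minimal sketch of the linked recipe's idea, updated for the Python 3 queue module (the class name and approach to persistence are mine, not copied from the recipe). It also shows why the corruption worry is justified, since the whole pickle is rewritten on every operation:

    import pickle
    from queue import Queue  # the "new version of Queue": Python 3's queue module

    class PersistentQueue(Queue):
        """Queue that pickles its contents to disk after every put/get.
        Loading an existing file on startup is omitted for brevity."""

        def __init__(self, path, maxsize=0):
            self.path = path
            super().__init__(maxsize)

        def _put(self, item):
            super()._put(item)
            self._sync()

        def _get(self):
            item = super()._get()
            self._sync()
            return item

        def _sync(self):
            # Runs while put()/get() hold the queue's mutex. The whole file is
            # rewritten each time, so a power cut mid-write can corrupt every
            # record at once, which is exactly the concern raised above.
            with open(self.path, "wb") as f:
                pickle.dump(self.queue, f)  # self.queue is the underlying deque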

+4
3 answers

Try using Celery. It is not pure Python, since it uses RabbitMQ as a backend, but it is reliable, persistent, and distributed, and in the end much better than using files or a database for this.
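
For illustration, a minimal sketch of what that might look like, assuming a RabbitMQ broker on localhost and a module named tasks.py (both are my assumptions, not from the answer):

    # tasks.py -- broker URL and names are assumptions
    from celery import Celery

    app = Celery("tasks", broker="amqp://guest@localhost//")

    @app.task
    def handle_entry(entry):
        # entry would be one of the 40-byte strings from the question;
        # RabbitMQ keeps it queued on disk until a worker picks it up.
        print("got", entry)

A producer then pushes with handle_entry.delay("some-entry"), and a worker started with "celery -A tasks worker" pops and processes the entries.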

+3

I think PyBSDDB is what you want. You can select a queue as the access type. PyBSDDB is a Python module built on Oracle Berkeley DB. It offers synchronized access, and the database can be shared by several processes, although I don't know whether that is exposed through the Python bindings. Regarding several processes writing to the DB, I found this thread.
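
As a rough sketch of the queue access type, assuming the bsddb3 package (the current name of the PyBSDDB bindings); the paths and flag choices here are illustrative:

    from bsddb3 import db

    # A shared environment lets several processes open the same queue safely.
    env = db.DBEnv()
    env.open("/tmp/bdb-env",
             db.DB_CREATE | db.DB_INIT_MPOOL | db.DB_INIT_LOCK |
             db.DB_INIT_LOG | db.DB_INIT_TXN)

    q = db.DB(env)
    q.set_re_len(40)  # the queue access method stores fixed-length records
    q.open("queue.db", None, db.DB_QUEUE, db.DB_CREATE)

    q.append(b"x" * 40)  # push: adds a record at the tail
    item = q.consume()   # pop: returns (record_number, data), or None if empty
    if item is not None:
        recno, data = item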

+2

Wouldn't using files work? ...

Use a journaling file system to recover from power failures. That is what they are for.
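
For example, a minimal file-per-entry sketch (the directory layout and names are mine, not from the answer). Each entry is written under a temporary name and then atomically renamed into place, so a crash leaves either a complete entry or nothing:

    import os
    import time
    import uuid

    QUEUE_DIR = "queue"  # assumed directory; one file per 40-byte entry

    def push(entry):
        tmp = os.path.join(QUEUE_DIR, ".tmp-" + uuid.uuid4().hex)
        with open(tmp, "wb") as f:
            f.write(entry)
            f.flush()
            os.fsync(f.fileno())  # force the data to disk before the rename
        # rename() is atomic on POSIX; the timestamp prefix gives FIFO order
        final = os.path.join(QUEUE_DIR, "%.6f-%s" % (time.time(), uuid.uuid4().hex))
        os.rename(tmp, final)

    def pop():
        for name in sorted(os.listdir(QUEUE_DIR)):  # oldest first
            if name.startswith(".tmp-") or name.endswith(".claimed"):
                continue
            path = os.path.join(QUEUE_DIR, name)
            try:
                os.rename(path, path + ".claimed")  # claim it for this process
            except OSError:
                continue  # another process claimed it first
            with open(path + ".claimed", "rb") as f:
                data = f.read()
            os.remove(path + ".claimed")
            return data
        return None  # queue empty

A consumer that crashes between the claim and the remove leaves a .claimed file behind, so a real version would also reclaim stale claims.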

-1
