Python / Django Tips and Message Queuing Tips

I have a Django application that needs to send a large number of emails to users in various use cases. I don't want to handle this synchronously in the application, for obvious reasons.

Are there any recommendations for a message queue server that integrates well with Python, or that you have used in a Django project? The rest of my stack is Apache, mod_python, MySQL.

+41
python django message-queue
Jan 18 '09 at 10:42
10 answers

So far I have not found a "nice" solution for this. I have somewhat stricter near-real-time requirements (think printing a label for a cardboard box), so one of the approaches below may well be enough for you. I assume emails can wait a few minutes.

  • A "todo list" in the database, processed by a cron job.
  • A "todo list" in the database, processed regularly by a polling daemon.
  • A custom daemon that the web server notifies via a UDP packet (in production today). Basically my own queueing system using the IP stack.
  • Using ActiveMQ as a message broker: this did not work out due to security issues. Also, Java daemons are usually somewhat heavyweight for my taste.
  • Using update triggers in CouchDB. Nice, but update triggers are not designed to do heavy work, so they did not fit my problem.

So far I have not tried RabbitMQ or XMPP/ejabberd for this, but they are next on my list of things to try. RabbitMQ got decent Python connectivity in 2008, and there are plenty of XMPP libraries.

But perhaps all you need is a properly configured mail server on the local machine. That would let you hand emails to the local mail server synchronously and keep your whole software stack much simpler.
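Handing mail to a local MTA is just the standard library's smtplib; the helper names and addresses below are illustrative, and an MTA listening on localhost:25 and configured to relay is assumed.

```python
import smtplib
from email.message import EmailMessage

def build_message(to_addr, subject, body, from_addr="noreply@example.com"):
    # Assemble a plain-text message; headers are set via item assignment.
    msg = EmailMessage()
    msg["From"] = from_addr
    msg["To"] = to_addr
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def send_via_local_mta(msg):
    # The hand-off to the MTA on localhost is fast; the MTA's own queue
    # then takes care of retries and actual delivery asynchronously.
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)
```

This is what makes the "synchronous" hand-off acceptable: the slow, failure-prone part (remote delivery) is the MTA's problem, not the web request's.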

+13
Jan 18 '09 at 11:54

In your particular case, where it's just an email queue, I would choose the simple path and use django-mailer. As a nice side bonus, there are other pluggable projects that are smart enough to use django-mailer when they see it in the stack.
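As far as I recall, django-mailer works as a near drop-in replacement for Django's own mail sending: outgoing mail goes into a database table, and a cron job drains it. The setting and command names below are from django-mailer's documentation and may differ between versions, so treat this as a sketch:

```python
# settings.py -- route all outgoing mail through django-mailer's DB queue
EMAIL_BACKEND = "mailer.backend.DbBackend"
```

A cron job then periodically runs `./manage.py send_mail` (and `./manage.py retry_deferred` for messages that failed on the first attempt) to actually deliver the queued mail.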

As for more general queueing solutions, I have not been able to try any of them yet, but here is a list of those that look most interesting to me:

+23
Jan 19 '09 at 5:16

Stompserver is a good option. It is lightweight, easy to install, and easy to use from Django/Python.

We have a system that uses stompserver in production to send emails and process other jobs asynchronously.

Django saves the email to the database, a model.post_save handler sends an event to stompserver, and stompserver passes the event to a consumer process that performs the asynchronous task (sending the email).

It scales quite well, because you can add consumer processes at runtime: two consumers can send twice as many emails, and the consumers can run on separate machines. One minor complication is that each consumer needs its own named queue, so Django needs to know how many consumers are available and send events to each queue in round-robin fashion. (Two consumers listening on the same queue would each receive every message, i.e. duplication.) If you only need one consumer process, this is not a problem.
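The round-robin dispatch described above amounts to cycling through the named consumer queues. A minimal sketch, where the queue-naming scheme is invented and `send` stands in for a STOMP client's send call:

```python
from itertools import cycle

class RoundRobinDispatcher:
    """Assign each event to one of N named consumer queues in turn,
    so no two consumers ever receive the same message."""

    def __init__(self, send, n_consumers, prefix="/queue/mail."):
        self._send = send  # e.g. a STOMP connection's send(destination, body)
        self._queues = cycle(prefix + str(i) for i in range(n_consumers))

    def dispatch(self, body):
        destination = next(self._queues)  # advance to the next consumer's queue
        self._send(destination, body)
        return destination
```

The drawback the answer mentions follows directly: the number of consumers is baked into the dispatcher, so adding a consumer means telling the Django side about it.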

Previously we had processes that polled the database continuously for jobs, but we found that this adds a lot of load to the system even when nothing needs to be processed.

+6
Apr 3 '09 at 16:15

Just add the emails to a database table, then write a separate script, launched by some task-scheduler utility (cron comes to mind), to send them.

+1
Jan 19 '09 at 2:45

Maybe you should take a look at pymq. It is written in Python, talks to its clients over HTTP, and offers plenty of monitoring and control options for queues.

+1
Apr 18

Is there anything wrong with solving this with the mail infrastructure itself? For example, each application server runs its own mail daemon that queues any locally submitted mail and forwards it to a centralized mail server, which can do the heavy lifting of delivery.

+1
Sep 12 '10 at 21:46

If you already have MySQL installed, you could create a table to use as a todo list.

Threads add jobs to the table synchronously, and a batch job deletes jobs as they are completed.

That way there is nothing extra to install and learn, and it should work just fine as a persistent job store unless you are sending a lot of email (say, more than 10/sec).

0
Jan 18 '09 at 11:34

Here is a lazy but correct and adequate solution: use a database table as the queue.

drop table if exists mailqueue;

create table mailqueue (
    id bigint auto_increment primary key,
    subject text not null,
    body mediumtext not null,
    sender varchar(255) not null,     -- "from" and "to" are reserved words
    recipient varchar(255) not null   -- in MySQL, so avoid them as names
);

Senders insert new rows at the end of this table (highest id).

Worker processes take mails one at a time from the other end (lowest id) and try to send them.
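A worker along these lines can be sketched with Python's built-in sqlite3 standing in for MySQL, using the sender/recipient column names since "from" and "to" are reserved words. Deleting the row by id acts as the claim, so two workers polling the same table cannot both send the same mail:

```python
import sqlite3

def claim_next(conn):
    """Claim and remove the oldest queued mail, or return None.

    The DELETE is the claim: if another worker got there first,
    rowcount is 0 and we report nothing to do.  (A send that fails
    after the claim would have to be re-queued by the caller.)
    """
    row = conn.execute(
        "SELECT id, subject, body, sender, recipient"
        " FROM mailqueue ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    cur = conn.execute("DELETE FROM mailqueue WHERE id = ?", (row[0],))
    conn.commit()
    return row if cur.rowcount == 1 else None
```

On MySQL the same claim could be done more robustly with `SELECT ... FOR UPDATE` inside a transaction, but the delete-by-id trick keeps the sketch portable.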

0
Mar 25

You could also use Twisted for this, but whether it plays well with Django depends heavily on the deployment scenario. Most importantly, each request must be served by the same Python process, so you would need Apache compiled in threaded mode.

-2
Jan 18 '09 at 22:21


