Parallel programming using Python multiprocessing and process handling

I ran into a problem writing a parallel program with multiprocessing. AFAIK, when I start a new process with this module, I have to call os.wait() or childProcess.join() to collect its exit status. But scattering such waits across the functions of my program means the main process can hang if something goes wrong in a child process (e.g., the child freezes).

The problem is that if I don't do this, the child processes become zombies (listed as "python <defunct>" in the top output).

Is there a way to avoid waiting for child processes to complete, avoid creating zombie processes, and/or not burden the main process with so much bookkeeping about its children?
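For reference, a minimal repro of the behaviour I mean (the worker is a stand-in, and the ps invocation assumes Linux procps):

    import multiprocessing
    import os
    import time

    def worker():
        pass  # child exits immediately

    if __name__ == "__main__":
        # Start a child and deliberately never wait()/join() it.
        multiprocessing.Process(target=worker).start()
        time.sleep(1)
        # The child has exited, but its status was never collected, so
        # ps/top now list it as "python <defunct>" -- a zombie -- until
        # the parent joins it or the parent itself terminates.
        os.system("ps --ppid {} -o pid,stat,cmd".format(os.getpid()))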

+3
2 answers

Although ars's answer should solve your immediate problem, you might also consider Celery: http://ask.github.com/celery/index.html . It is a comparatively developer-friendly way to achieve these goals, and a great deal more.
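Not part of the original answer, but for orientation, here is a minimal sketch of what this looks like with Celery's current API (the broker URL and task body are assumptions; the linked page documents the older API of that era):

    # tasks.py -- minimal Celery sketch; the Redis broker URL is an
    # assumption, any broker Celery supports (e.g. RabbitMQ) works too.
    from celery import Celery

    app = Celery("tasks", broker="redis://localhost:6379/0")

    @app.task
    def crunch(n):
        # Runs inside the Celery worker's process pool; the caller never
        # has to wait() on, join(), or otherwise reap child processes.
        return n * n

Run a worker with `celery -A tasks worker`, then fire and forget from any process with `crunch.delay(42)`; process management, including reaping, is the worker's job rather than your program's.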

+2

, , . , ""? , - :

. , ( JoinableQueue.cancel_join_thread()), , .

, , , , , , . , , , - .

, , , . . .
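A short sketch of that guideline in practice (the producer and item count are invented): drain the queue before joining, and reap any fire-and-forget children without blocking via multiprocessing.active_children(), which joins already-finished processes as a documented side effect:

    import multiprocessing

    def producer(q):
        for i in range(10):
            q.put(i)
        # If the child must be able to exit with items still buffered, it
        # can call q.cancel_join_thread() -- at the cost of losing data.

    if __name__ == "__main__":
        q = multiprocessing.Queue()
        p = multiprocessing.Process(target=producer, args=(q,))
        p.start()

        # Per the guideline: remove everything from the queue *before*
        # joining, otherwise the producer may never terminate.
        results = [q.get() for _ in range(10)]
        p.join()

        # For children you don't want to wait on at all, a periodic
        # multiprocessing.active_children() call reaps the finished ones
        # without ever blocking on the live ones.
        print(results)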

0

Source: https://habr.com/ru/post/1755705/

