Python3 multiprocessing example crashed my pc :(

I'm new to multiprocessing

I ran the sample code from two “recommended” multiprocessing examples given in answers to other Stack Overflow multiprocessing questions. Here is one of them (which I dare not run again!):

test2.py (run from PyDev)

    import multiprocessing

    class MyFancyClass(object):
        def __init__(self, name):
            self.name = name

        def do_something(self):
            proc_name = multiprocessing.current_process().name
            print(proc_name, self.name)

    def worker(q):
        obj = q.get()
        obj.do_something()

    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(queue,))
    p.start()

    queue.put(MyFancyClass('Fancy Dan'))

    # Wait for the worker to finish
    queue.close()
    queue.join_thread()
    p.join()

When I run this, my computer instantly slows down and then gets progressively worse. After some time I managed to open Task Manager and saw MANY MANY python.exe entries on the Processes tab. After trying to end some of them, my mouse stopped moving. This was the second time I was forced to reboot.
I'm too scared to try a third example ...

My machine: Intel(R) Core(TM) i7 CPU 870 @ 2.93 GHz (8 logical processors), running 64-bit Windows 7.

If anyone knows what the problem is and can provide a VERY SIMPLE example of multiprocessing (send a string to a worker process, change it, and send it back for printing), I would be very grateful.

+4
3 answers

From the docs:

Make sure that the main module can be safely imported by a new Python interpreter without causing unintended side effects (such as starting a new process).

So on Windows you have to wrap the code that starts processes inside an

    if __name__ == '__main__':

block.


For example, this sends a string to the worker process, the worker reverses it, and the main process prints the result:

    import multiprocessing as mp

    def worker(inq, outq):
        obj = inq.get()
        obj = obj[::-1]
        outq.put(obj)

    if __name__ == '__main__':
        inq = mp.Queue()
        outq = mp.Queue()
        p = mp.Process(target=worker, args=(inq, outq))
        p.start()
        inq.put('Fancy Dan')
        # Wait for the worker to finish
        p.join()
        result = outq.get()
        print(result)
+9

Due to the way multiprocessing works on Windows (child processes import the __main__ module), the __main__ module cannot actually run anything at import time; any code that should execute only when the script starts must be protected by the if __name__ == '__main__' idiom. Your fixed code:

    import multiprocessing

    class MyFancyClass(object):
        def __init__(self, name):
            self.name = name

        def do_something(self):
            proc_name = multiprocessing.current_process().name
            print(proc_name, self.name)

    def worker(q):
        obj = q.get()
        obj.do_something()

    if __name__ == '__main__':
        queue = multiprocessing.Queue()
        p = multiprocessing.Process(target=worker, args=(queue,))
        p.start()

        queue.put(MyFancyClass('Fancy Dan'))

        # Wait for the worker to finish
        queue.close()
        queue.join_thread()
        p.join()
+4

Can I suggest this link? It uses threads instead of multiprocessing, but many of the principles are the same.
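To illustrate that point, here is the same reverse-a-string job from the accepted answer, sketched with a thread and the standard-library `queue` module instead of multiprocessing. Threads share one interpreter, so there is no child-import issue on Windows and no `if __name__ == '__main__'` guard is strictly required:

```python
import queue
import threading

def worker(inq, outq):
    # Same job as the multiprocessing answer: take a string,
    # reverse it, and hand the result back.
    outq.put(inq.get()[::-1])

inq = queue.Queue()
outq = queue.Queue()
t = threading.Thread(target=worker, args=(inq, outq))
t.start()
inq.put('Fancy Dan')
t.join()
result = outq.get()
print(result)  # naD ycnaF
```

The trade-off: threads are simpler to set up, but because of the GIL they help with I/O-bound work rather than CPU-bound work, which is where multiprocessing shines.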

0

Source: https://habr.com/ru/post/1389661/
