I am trying to use a `multiprocessing.Lock` with `concurrent.futures.ProcessPoolExecutor`, but I get a `RuntimeError`. (I am working on Windows, in case that is relevant.)
Here is my code:
```python
import multiprocessing
from concurrent.futures import ProcessPoolExecutor
import time

def f(i, lock):
    with lock:
        print(i, 'hello')
        time.sleep(1)
        print(i, 'world')

def main():
    lock = multiprocessing.Lock()
    pool = ProcessPoolExecutor()
    futures = [pool.submit(f, num, lock) for num in range(3)]
    for future in futures:
        future.result()

if __name__ == '__main__':
    main()
```
Here is the error I get:
```
Traceback (most recent call last):
  File "F:\WinPython-64bit-3.4.3.2\python-3.4.3.amd64\Lib\multiprocessing\queues.py", line 242, in _feed
    obj = ForkingPickler.dumps(obj)
  File "F:\WinPython-64bit-3.4.3.2\python-3.4.3.amd64\Lib\multiprocessing\reduction.py", line 50, in dumps
    cls(buf, protocol).dump(obj)
  File "F:\WinPython-64bit-3.4.3.2\python-3.4.3.amd64\Lib\multiprocessing\synchronize.py", line 102, in __getstate__
    context.assert_spawning(self)
  File "F:\WinPython-64bit-3.4.3.2\python-3.4.3.amd64\Lib\multiprocessing\context.py", line 347, in assert_spawning
    ' through inheritance' % type(obj).__name__
RuntimeError: Lock objects should only be shared between processes through inheritance
```
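Reading the error message, the lock apparently cannot be pickled when it is passed as a task argument. From the docs, `multiprocessing.Manager().Lock()` returns a proxy object, which I believe *is* picklable, so I would expect a variant like this sketch to run (same `f` as above, untested beyond that expectation):

```python
import time
from concurrent.futures import ProcessPoolExecutor
from multiprocessing import Manager

def f(i, lock):
    with lock:
        print(i, 'hello')
        time.sleep(0.2)
        print(i, 'world')
    return i

def main():
    # Manager().Lock() is a proxy to a lock living in the manager
    # process; unlike a raw multiprocessing.Lock, the proxy can be
    # pickled and submitted to pool workers as an argument.
    with Manager() as manager:
        lock = manager.Lock()
        with ProcessPoolExecutor() as pool:
            futures = [pool.submit(f, num, lock) for num in range(3)]
            return [future.result() for future in futures]

if __name__ == '__main__':
    print(main())
```

But I would still like to understand why the plain lock fails only with the executor.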
Which is strange, because if I write the same code with `multiprocessing.Process`, everything works fine:
```python
import multiprocessing
import time

def f(i, lock):
    with lock:
        print(i, 'hello')
        time.sleep(1)
        print(i, 'world')

def main():
    lock = multiprocessing.Lock()
    processes = [multiprocessing.Process(target=f, args=(i, lock))
                 for i in range(3)]
    for process in processes:
        process.start()
    for process in processes:
        process.join()

if __name__ == '__main__':
    main()
```
This works and I get:
```
1 hello
1 world
0 hello
0 world
2 hello
2 world
```
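From what I can tell, `ProcessPoolExecutor` pickles every submitted argument over an internal queue, while `Process` hands its arguments to the child as it starts, which seems to be the "inheritance" the error message refers to. Newer Pythons (3.7+) apparently let the executor take an `initializer` that receives the lock at worker startup rather than per task, so a sketch like this should sidestep the pickling (I am on 3.4, so I cannot try it myself):

```python
import time
from concurrent.futures import ProcessPoolExecutor
from multiprocessing import Lock

lock = None  # set in each worker process by init_worker

def init_worker(shared_lock):
    # Runs once in every worker; the lock arrives with the worker's
    # startup arguments instead of being pickled per submitted task.
    global lock
    lock = shared_lock

def f(i):
    with lock:
        print(i, 'hello')
        time.sleep(0.2)
        print(i, 'world')
    return i

def main():
    shared_lock = Lock()
    with ProcessPoolExecutor(initializer=init_worker,
                             initargs=(shared_lock,)) as pool:
        futures = [pool.submit(f, num) for num in range(3)]
        return [future.result() for future in futures]

if __name__ == '__main__':
    print(main())
```

Is that the intended way to share a lock with a pool, or am I missing something simpler?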