Python multiprocessing process is killed by http request if ipdb is imported

Merely importing ipdb seems to cause a program that makes an HTTP request inside a multiprocessing Process to exit without any error or message.

The following script behaves very strangely:

    from multiprocessing import Process
    import requests
    import ipdb

    def spawn():
        print("before")
        r = requests.get("http://wtfismyip.com")
        print("after")

    Process(target=spawn).start()

If you run this in a terminal, the only output is before , and you are returned to your prompt. If you comment out import ipdb , everything works fine and the request succeeds.

  • Saving the Process instance in a variable and calling join() after start() makes no difference.
  • This happens in both Python 2.7.10 and 3.5.0.
  • It does not happen with the traditional pdb .
  • Other people here and here also had this problem. In the first case, I'm not sure whether importing ipdb was the cause. In the latter case, it was a problem with the package version, but I checked that my IPython and ipdb are the latest (4.0.0 and 0.8.1).

Can someone explain why this is happening?


Source: https://habr.com/ru/post/988871/
