I'm still relatively new to Python, so if this is an obvious question, I'm sorry.
My question is about the urllib2 library and its urlopen function. I'm currently using it to load a large number of pages from another server (they are all on the same remote host), but the script gets killed by a timeout error from time to time (I assume this happens on the larger requests).
Is there a way to keep the script running after a timeout? I would like to get all the pages, so I want the script to keep retrying until it gets a page, and then move on to the next one.
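For context, here is roughly the kind of retry loop I have in mind (the URLs, timeout, and retry count are just placeholders; the real script builds its URL list elsewhere):

    import socket
    import urllib2

    # Hypothetical list of pages on the remote host.
    urls = ["http://example.com/page%d" % i for i in range(1, 100)]

    def fetch_with_retry(url, timeout=30, max_retries=5):
        """Keep retrying a URL until it loads or we give up."""
        for attempt in range(max_retries):
            try:
                # The timeout argument makes urlopen raise an error
                # instead of hanging forever.
                return urllib2.urlopen(url, timeout=timeout).read()
            except (urllib2.URLError, socket.timeout):
                # Timed out (or another transient error) -- try again.
                continue
        return None  # gave up after max_retries attempts

    pages = [fetch_with_retry(u) for u in urls]

Is something like this the right approach, or is there a better way?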
On a related note, would keeping the connection to the server open help?
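By that I mean something like the following sketch, which reuses a single persistent connection with httplib since all the pages are on the same host (the host name and paths here are placeholders):

    import httplib

    # One persistent connection can serve many requests on the same host.
    conn = httplib.HTTPConnection("example.com", timeout=30)
    for path in ["/page1", "/page2"]:
        conn.request("GET", path)
        resp = conn.getresponse()
        body = resp.read()  # the body must be fully read before reusing the connection
    conn.close()

Would this reduce the timeouts compared to opening a new connection with urlopen for every page?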