Port error in scrapy

I built a scanner consisting of two spiders, both written with Scrapy.
The spiders run independently, retrieving their data from a database.

We launch these spiders from a Twisted reactor, and, as is well known, the reactor cannot be restarted once it has stopped.

We feed around 500 links to the second spider for scanning. At that point we hit a port error, i.e. Scrapy appears to use only one port:

 Error caught on signal handler: <bound method ?.start_listening of <scrapy.telnet.TelnetConsole instance at 0x0467B440>>
 Traceback (most recent call last):
   File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 1070, in _inlineCallbacks
     result = g.send(result)
   File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\core\engine.py", line 75, in start
     yield self.signals.send_catch_log_deferred(signal=signals.engine_started)
   File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\signalmanager.py", line 23, in send_catch_log_deferred
     return signal.send_catch_log_deferred(*a, **kw)
   File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\utils\signal.py", line 53, in send_catch_log_deferred
     *arguments, **named)
 --- <exception caught here> ---
   File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 137, in maybeDeferred
     result = f(*args, **kw)
   File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\xlib\pydispatch\robustapply.py", line 47, in robustApply
     return receiver(*arguments, **named)
   File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\telnet.py", line 47, in start_listening
     self.port = listen_tcp(self.portrange, self.host, self)
   File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\utils\reactor.py", line 14, in listen_tcp
     return reactor.listenTCP(x, factory, interface=host)
   File "C:\Python27\lib\site-packages\twisted\internet\posixbase.py", line 489, in listenTCP
     p.startListening()
   File "C:\Python27\lib\site-packages\twisted\internet\tcp.py", line 980, in startListening
     raise CannotListenError(self.interface, self.port, le)
 twisted.internet.error.CannotListenError: Couldn't listen on 0.0.0.0:6073: [Errno 10048] Only one usage of each socket address (protocol/network address/port) is normally permitted.

So what is the problem here, and what is the best way to handle this scenario? Please help.

PS: I increased the number of ports in the settings, but it always takes 6073 by default.
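For context, the telnet console binds one port per crawler process from the range given by the `TELNETCONSOLE_PORT` setting. A sketch of what widening that range looks like in `settings.py` (the upper bound here is an illustrative value, not from the question):

```python
# settings.py -- illustrative values.
# Each concurrently running crawler binds one telnet port from this range,
# so the range must be at least as wide as the number of crawlers:
TELNETCONSOLE_PORT = [6023, 6123]
```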

+6
2 answers

The easiest way is to disable the telnet console by adding this to your settings.py:

 EXTENSIONS = {
     'scrapy.telnet.TelnetConsole': None,
 }

See also the list of extensions enabled by default: http://doc.scrapy.org/en/latest/topics/settings.html#extensions
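Equivalently, Scrapy exposes a dedicated boolean setting for this extension, which may read more clearly than the `EXTENSIONS` override:

```python
# settings.py -- switch the telnet console off entirely,
# so no crawler process tries to bind a telnet port:
TELNETCONSOLE_ENABLED = False
```

Note that either form only affects the telnet console; the crawlers themselves are unaffected.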

+5

Your problem comes from running many crawlers in parallel. Here is a recipe I wrote for running them sequentially instead. This particular class runs only one crawler at a time, but the modifications needed to run them in batches (for example, 10 at a time) are trivial.

 from scrapy import log, signals
 from scrapy.crawler import Crawler
 from scrapy.utils.project import get_project_settings
 from twisted.internet import reactor


 class SequentialCrawlManager(object):
     """Start spiders sequentially."""

     def __init__(self, spider, websites):
         self.spider = spider
         self.websites = websites
         # set up crawler
         self.settings = get_project_settings()
         self.current_site_idx = 0

     def next_site(self):
         if self.current_site_idx < len(self.websites):
             self.crawler = Crawler(self.settings)
             # wait for one spider to finish before starting the next one
             self.crawler.signals.connect(self.next_site,
                                          signal=signals.spider_closed)
             self.crawler.configure()
             spider = self.spider()  # pass arguments here if desired
             self.crawler.crawl(spider)
             self.crawler.start()
             self.current_site_idx += 1
         else:
             reactor.stop()  # required for the program to terminate

     def start(self):
         log.start()
         self.next_site()
         reactor.run()  # blocking call
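The control flow of that recipe can be modeled without Scrapy at all. This stand-alone sketch (all names here are illustrative, not Scrapy API) shows the core idea: the completion callback, which stands in for the `spider_closed` signal handler, starts the next crawl, so at most one crawl is ever in flight:

```python
# Toy model of the sequential pattern: a completion callback kicks off
# the next site, so the sites are processed strictly one after another.
class SequentialRunner(object):
    def __init__(self, websites, crawl_fn):
        self.websites = websites
        self.crawl_fn = crawl_fn  # stands in for Crawler.crawl/start
        self.idx = 0
        self.finished = []

    def _on_closed(self, site):
        # analogue of the spider_closed signal handler
        self.finished.append(site)
        self.next_site()

    def next_site(self):
        if self.idx < len(self.websites):
            site = self.websites[self.idx]
            self.idx += 1
            self.crawl_fn(site)    # "crawl" one site...
            self._on_closed(site)  # ...then report completion
        # else: nothing left -- a real manager would stop the reactor here


runner = SequentialRunner(["site-a", "site-b", "site-c"], lambda site: None)
runner.next_site()
print(runner.finished)  # ['site-a', 'site-b', 'site-c']
```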
+1

Source: https://habr.com/ru/post/948724/

