You do not need any external library or framework for such a simple task: put the URLs in a queue, start 4 threads, and have each thread take an element from the queue and download it.

Something like this:
import os
import sys
import threading
import urllib
from Queue import Queue

class DownloadThread(threading.Thread):
    def __init__(self, queue, destfolder):
        super(DownloadThread, self).__init__()
        self.queue = queue
        self.destfolder = destfolder
        self.daemon = True

    def run(self):
        while True:
            url = self.queue.get()
            try:
                self.download_url(url)
            except Exception, e:
                print "   Error: %s" % e
            self.queue.task_done()

    def download_url(self, url):
        # derive the destination file name from the last path segment
        name = url.split('/')[-1]
        dest = os.path.join(self.destfolder, name)
        print "[%s] Downloading %s -> %s" % (self.ident, url, dest)
        urllib.urlretrieve(url, dest)

def download(urls, destfolder, numthreads=4):
    queue = Queue()
    for url in urls:
        queue.put(url)

    for i in range(numthreads):
        t = DownloadThread(queue, destfolder)
        t.start()

    # block until every queued URL has been marked task_done()
    queue.join()

if __name__ == "__main__":
    download(sys.argv[1:], "/tmp")
using:
$ python download.py http://en.wikipedia.org/wiki/1 http://en.wikipedia.org/wiki/2 http://en.wikipedia.org/wiki/3 http://en.wikipedia.org/wiki/4
[4456497152] Downloading http://en.wikipedia.org/wiki/1 -> /tmp/1
[4457033728] Downloading http://en.wikipedia.org/wiki/2 -> /tmp/2
[4457701376] Downloading http://en.wikipedia.org/wiki/3 -> /tmp/3
[4458258432] Downloading http://en.wikipedia.org/wiki/4 -> /tmp/4
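Note the design: the threads are daemons, so the process exits as soon as queue.join() returns, and task_done() is what lets join() know when every download has finished.

If you are on Python 3, the same pattern works with the renamed stdlib modules (Queue becomes queue, urllib.urlretrieve becomes urllib.request.urlretrieve). A minimal sketch under those assumptions; the 4-thread count and /tmp destination just mirror the version above:

import os
import sys
import threading
import urllib.request
from queue import Queue

def worker(q, destfolder):
    # each worker loops forever, pulling one URL at a time from the shared queue
    while True:
        url = q.get()
        try:
            dest = os.path.join(destfolder, url.split('/')[-1])
            print("[%s] Downloading %s -> %s" % (threading.get_ident(), url, dest))
            urllib.request.urlretrieve(url, dest)
        except Exception as e:
            print("   Error: %s" % e)
        q.task_done()

if __name__ == "__main__":
    q = Queue()
    for url in sys.argv[1:]:
        q.put(url)
    for _ in range(4):
        threading.Thread(target=worker, args=(q, "/tmp"), daemon=True).start()
    q.join()  # wait until every URL has been processed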