I've been trying to get this to work correctly all day. It's almost complete, but I'm hitting a weird problem: each result found in the search query is recorded as expected, but the first result is recorded once, the second is recorded twice, the third three times, and so on.

Any ideas how to get rid of the duplicates?

Log Example
#!/usr/bin/python
import urllib
import simplejson
import logging
from logging.handlers import SysLogHandler

query = urllib.urlencode({'q': 'test'})
url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' % (query)
search_results = urllib.urlopen(url)
json = simplejson.loads(search_results.read())
results = json['responseData']['results']

for i in results:
    logger = logging.getLogger()
    logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
    logger.addHandler(logging.FileHandler("hits.log"))
    logging.warn(i['url'])
    print i['url']
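
The duplication most likely comes from adding new handlers to the root logger on every pass through the loop, so each iteration logs through one more handler than the last. A minimal sketch of a fix, assuming that is the cause, is to configure the logger and its handlers once before the loop (same syslog address and file name as above):

#!/usr/bin/python
import urllib
import simplejson
import logging
from logging.handlers import SysLogHandler

# Set up the root logger and its handlers exactly once, before the loop,
# so every result is logged a single time.
logger = logging.getLogger()
logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
logger.addHandler(logging.FileHandler("hits.log"))

query = urllib.urlencode({'q': 'test'})
url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' % (query)
search_results = urllib.urlopen(url)
json = simplejson.loads(search_results.read())
results = json['responseData']['results']

for i in results:
    logging.warn(i['url'])
    print i['url']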