Problem with duplicate log lines in a log file

I've been trying to get this to work correctly all day. It's almost complete; there is just one weird problem left. Each result found by the search query is logged as expected, but the first result is logged once, the second twice, the third three times, and so on.

Any ideas how to get rid of the duplicates? Here is the script:

    #!/usr/bin/python
    import urllib
    import simplejson
    import logging
    from logging.handlers import SysLogHandler

    query = urllib.urlencode({'q': 'test'})
    url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' % query
    search_results = urllib.urlopen(url)
    json = simplejson.loads(search_results.read())
    results = json['responseData']['results']

    for i in results:
        logger = logging.getLogger()
        logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
        logger.addHandler(logging.FileHandler("hits.log"))
        logging.warn(i['url'])
        print i['url']
3 answers

Because you add a new handler to the root logger on every pass through the for loop. By the Nth iteration the logger has N FileHandlers attached, so the Nth message is written N times. Set up the handlers once, outside the loop, and do the actual logging.warn inside the loop.
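For example, a minimal rearrangement of the script from the question (an untested sketch; the syslog address and the ajax.googleapis.com endpoint are taken verbatim from the question):

    #!/usr/bin/python
    import urllib
    import simplejson
    import logging
    from logging.handlers import SysLogHandler

    # Configure the root logger once, before the loop.
    logger = logging.getLogger()
    logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
    logger.addHandler(logging.FileHandler("hits.log"))

    query = urllib.urlencode({'q': 'test'})
    url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' % query
    search_results = urllib.urlopen(url)
    json = simplejson.loads(search_results.read())
    results = json['responseData']['results']

    for i in results:
        # Each URL is now logged exactly once.
        logging.warn(i['url'])
        print i['url']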


I had a similar problem, but I needed to add a new handler on every pass through the for loop, so moving the handler setup outside the loop was not an option for me.

When you create a handler as follows:

 hdl = logging.FileHandler("hits.log") 

you need to remove it as follows:

 logger.removeHandler(hdl) 
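Put together, the add/remove pattern looks like this (a sketch with placeholder data; in the question's script, urls would be the result URLs):

    import logging

    logger = logging.getLogger()
    urls = ['http://example.com/a', 'http://example.com/b']  # placeholder data

    for u in urls:
        # Register a fresh handler for this iteration only.
        hdl = logging.FileHandler("hits.log")
        logger.addHandler(hdl)
        logger.warn(u)
        # Remove and close the handler so the next iteration starts clean.
        logger.removeHandler(hdl)
        hdl.close()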

Since you have not accepted an answer yet: as Daniel said, you need

    logger = logging.getLogger('')
    logger.addHandler(logging.FileHandler("hits.log"))
    logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))

outside the for loop.


Source: https://habr.com/ru/post/1392484/

