Urllib2 with cookies

I am trying to query an RSS feed that requires a cookie using Python. I thought using urllib2 and adding the appropriate header would be enough, but the request keeps coming back as unauthorized.

I suppose this might be a problem on the remote site's side, but I am not sure. How do I use urllib2 with cookies? Is there a better package for this (e.g. httplib, mechanize, curl)?

2 answers
 import urllib2

 opener = urllib2.build_opener()
 opener.addheaders.append(('Cookie', 'cookiename=cookievalue'))
 f = opener.open("http://example.com/")
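Hard-coding the header works when you already know the cookie value. If the site sets the cookie itself (e.g. on a login or redirect), urllib2 can also manage cookies automatically with a cookie jar. A minimal sketch, using the Python 3 names (in Python 2 the same pieces are urllib2.build_opener, urllib2.HTTPCookieProcessor, and cookielib.CookieJar); the feed URL is a placeholder:

```python
import http.cookiejar
import urllib.request

# The jar stores any Set-Cookie headers the server sends and
# replays them on later requests made through the same opener.
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(jar))

# opener.open("http://example.com/rss") would now carry along
# whatever cookies the site set on earlier responses.
```

This way the opener handles expiry and path/domain matching for you instead of a single fixed header.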

I would use the requests package (see the docs); it is much easier to use than urllib2 and has a sane API.

If the response contains cookies, you can get quick access to them:

 import requests

 url = 'http://httpbin.org/cookies/set/requests-is/awesome'
 r = requests.get(url)
 print r.cookies
 # {'requests-is': 'awesome'}

To send your own cookies to the server, you can use the cookies parameter:

 url = 'http://httpbin.org/cookies'
 cookies = dict(cookies_are='working')
 r = requests.get(url, cookies=cookies)
 r.content
 # '{"cookies": {"cookies_are": "working"}}'

http://docs.python-requests.org/en/latest/user/quickstart/#cookies
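For a feed that needs the same cookie on every poll, a requests.Session keeps cookies across calls instead of passing the dict each time. A sketch, assuming requests is installed; the cookie name/value and URL are placeholders:

```python
import requests

# A Session persists cookies across requests, so a cookie set by
# one response (e.g. after a login) is sent automatically later.
s = requests.Session()

# Pre-populating the jar by hand, mirroring the urllib2 answer.
s.cookies.set('cookiename', 'cookievalue')

# Every request made through s now carries that cookie:
# s.get('http://example.com/rss')
```

Sessions also reuse the underlying connection, which helps if you poll the feed repeatedly.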


Source: https://habr.com/ru/post/905147/
