So that's it for the urllib package. As you can see, access to the standard library is more than adequate for most HTTP tasks. We haven't touched upon all of its capabilities: there are numerous handler classes that we haven't discussed, and the opener interface is extensible.
However, the API isn't the most elegant, and there have been several attempts to improve it. One of these is the very popular third-party library called Requests. It's available as the requests package on PyPI, and it can either be installed through pip or downloaded from http://docs.python-requests.org, which also hosts the documentation.
The Requests library automates and simplifies many of the tasks that we've been looking at. The quickest way to illustrate this is to try some examples.
The commands for retrieving a URL with Requests are similar to retrieving one with the urllib package, as shown here:
>>> import requests
>>> response = requests.get('http://www.debian...
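For comparison, here is a sketch of the same kind of retrieval done with the standard library's urllib package alone. To keep it self-contained and runnable without network access, it spins up a throwaway local HTTP server; the handler, port choice, and URL are illustrative, not from the original example.

```python
# Sketch: fetch a URL with urllib.request against a throwaway local server.
# The Hello handler and the loopback URL are illustrative assumptions.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b'hello'
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging so the example's output stays clean.
        pass

# Port 0 asks the OS for any free port.
server = HTTPServer(('127.0.0.1', 0), Hello)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = 'http://127.0.0.1:%d/' % server.server_port
response = urlopen(url)
print(response.status, response.read())  # 200 b'hello'
server.shutdown()
```

With Requests, the status code and body would instead be reached through `response.status_code` and `response.content`; the urllib response object exposes them as `status` and a one-shot `read()`.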