How to catch errors in urllib?

Python 3.

How do I catch all the errors that can occur during requests?
The script queries Yandex XML and pulls site addresses out of the response. Anything can happen along the way: a timeout, an unwanted HTTP status (403, for example).

I was doing it like this (it was advised somewhere), but I still get a 403 and the script crashes because it isn't caught:
import errno
import http.client
import socket
import urllib.error
import urllib.request

try:
    response = urllib.request.urlopen(url, timeout=10)
except urllib.error.HTTPError:        # bad HTTP status: 403, 404, 5xx, ...
    return False
except urllib.error.URLError:         # DNS failure, connection refused, ...
    return False
except socket.timeout:                # the timeout=10 expired
    return False
except http.client.BadStatusLine:     # server sent a malformed status line
    return False
except http.client.IncompleteRead:    # connection dropped mid-response
    return False
except OSError as e:                  # any other low-level socket error
    # OSError must come last: HTTPError and URLError are its subclasses,
    # so putting it first would intercept the 403 and re-raise it here.
    if e.errno != errno.ECONNRESET:
        raise  # not the error we are looking for
    return False


PS: Can anyone also recommend an existing Python 3 library for fetching content from a URL in a sane way (Node.js has the excellent request)?
July 8th 19 at 11:54
1 answer
July 8th 19 at 11:56
Solution
docs.python-requests.org/en/master
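
For example, a minimal sketch with requests (the fetch helper and the 10-second timeout are just placeholders, not something from your script):

import requests

def fetch(url):
    # requests.RequestException is the base class for everything the
    # library raises: timeouts, connection failures, invalid URLs, ...
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()  # turn 403/404/5xx into an exception too
    except requests.RequestException as e:
        print(type(e).__name__, e)
        return None
    return response.text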

To see which exception class is actually being raised, while letting the script keep running:
try:
    ...  # your request goes here
except Exception as e:
    print(type(e).__name__, e)  # prints the exception class and its message


keep in mind, though, that an even broader handler (a bare except: or except BaseException) would also swallow Ctrl+C; except Exception as above does not catch KeyboardInterrupt.
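
Once the log shows which classes actually come up, you can narrow the handler to just those; a sketch (the exact tuple of exceptions is only an example):

import http.client
import urllib.error
import urllib.request

try:
    response = urllib.request.urlopen(url, timeout=10)
except (urllib.error.URLError, http.client.HTTPException, OSError) as e:
    # URLError also covers HTTPError; OSError covers timeouts and reset connections
    print(type(e).__name__, e)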
