Problem description

While running my web crawler today, the program kept crashing with errors (Traceback (most recent call last): ...). The script was fairly crude and did no exception handling, so any error stopped the crawler entirely. After some debugging, I found that the target site was unstable, which caused connections to fail intermittently.

Solution

Wrap the request in a retry loop: attempt the download up to a fixed number of times, and only give up after every attempt has failed.

import requests

maxTryNum = 20  # maximum number of attempts per URL
for tries in range(maxTryNum):
    try:
        # urls[i], headers, dir_name and file_name come from the surrounding crawler code
        response = requests.get(urls[i], headers=headers, timeout=60)
        response.raise_for_status()
        with open(dir_name + '/' + file_name, 'wb') as f:
            f.write(response.content)
        break  # download succeeded, stop retrying
    except requests.RequestException:
        if tries < maxTryNum - 1:
            continue  # transient failure, try again
        else:
            print("Has tried %d times to access url %s, all failed!" % (maxTryNum, urls[i]))
            break
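As an alternative, requests can retry transparently through urllib3's Retry class mounted on a Session, so the retry logic does not have to be written by hand. Below is a minimal sketch under that approach; url and headers are placeholders standing in for the crawler's own variables.

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry up to 5 times with exponential backoff on connection errors
# and on the listed HTTP status codes.
retries = Retry(total=5, backoff_factor=1, status_forcelist=[500, 502, 503, 504])
session = requests.Session()
session.mount('http://', HTTPAdapter(max_retries=retries))
session.mount('https://', HTTPAdapter(max_retries=retries))

response = session.get(url, headers=headers, timeout=60)

With this setup, every session.get call gets the same retry behavior, which keeps the download loop itself free of retry bookkeeping.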