This article walks through a hands-on Python crawling example: scraping city weather data from Amap (高德). Many readers are unfamiliar with this, so I'm sharing it for reference; I hope you get something useful out of it. Let's take a look.
Code:

import json

import requests


def weatherlist(url1, url2, headers, proxies):
    # Fetch the city list; the JSON groups cities by their initial letter.
    response = requests.get(url=url1, headers=headers, proxies=proxies).content.decode('utf-8')
    response = json.loads(response)
    for cities in response["data"]["cityByLetter"].values():
        for city in cities:
            adcode = city["adcode"]
            name = city["name"]
            # Query the weather endpoint for this city's adcode.
            full_url = url2 + adcode
            weather_json = requests.get(url=full_url, headers=headers, proxies=proxies).content.decode('utf-8')
            weather_json = json.loads(weather_json)
            print(weather_json)
            try:
                if weather_json["data"]["data"]:
                    for item in weather_json["data"]["data"]:
                        for forecast in item['forecast_data']:
                            weather_name = forecast['weather_name']
                            temp_min = forecast['min_temp']
                            temp_max = forecast['max_temp']
                            # Append one line per forecast entry to the output file.
                            with open('weather_list.txt', 'a', encoding='utf-8') as fp:
                                fp.write("City: " + name + " Weather: " + weather_name +
                                         " High: " + temp_max + " Low: " + temp_min + '\n')
            except (KeyError, TypeError):
                # Some cities return no forecast data; skip them.
                print('empty')


if __name__ == '__main__':
    url1 = 'https://www.amap.com/service/cityList'
    url2 = 'https://www.amap.com/service/weather?adcode='
    headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36',
               'Cookie': 'BIDUPSID=F6BBCD59FE2A646812DB8DAE641A0BE5; PSTM=1573713375; BAIDUID=F6BBCD59FE2A6468D0329C1E2F60212F:FG=1; BD_UPN=12314353; BDORZ=B490B5EBF6F3CD402E515D22BCDA1598; H_PS_PSSID=1452_21098_29568_29221_26350; delPer=0; BD_CK_SAM=1; PSINO=2; H_PS_645EC=50d5uY51q2qJG%2BVlK7rlPmCgY73TcN9qKRz4sPKuBII1GIkIx4QkChitGd4; BDSVRTM=209'}
    proxies = {'http': '124.113.217.5:9999', 'https': ''}
    weatherlist(url1, url2, headers, proxies)

Note: the original code reused the names response and weather for both the outer and inner loops, which shadowed the city-list data; the version above uses distinct names. The bare except has also been narrowed to the key-lookup errors the parsing can actually raise.
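The key parsing step is walking the cityByLetter mapping, which groups city records under their initial letter. The sketch below shows that traversal in isolation, using a small made-up payload that only mimics the shape of the real cityList response (the sample cities and adcodes are illustrative, not actual API output):

import json

# Illustrative sample shaped like the cityList response;
# the real payload comes from https://www.amap.com/service/cityList.
sample = json.dumps({
    "data": {
        "cityByLetter": {
            "B": [{"adcode": "110000", "name": "Beijing"}],
            "S": [{"adcode": "310000", "name": "Shanghai"}],
        }
    }
})

data = json.loads(sample)
# Flatten every letter group into (adcode, name) pairs.
cities = [(c["adcode"], c["name"])
          for group in data["data"]["cityByLetter"].values()
          for c in group]
print(cities)  # [('110000', 'Beijing'), ('310000', 'Shanghai')]

Each adcode is then appended to the weather URL (url2 + adcode) to fetch that city's forecast, exactly as the full script does.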
That's everything for this hands-on example of scraping Amap regional weather data with a Python crawler. Thanks for reading! Hopefully you now have a working picture of the approach; if you'd like to learn more, follow the Yisu Cloud industry news channel.