How do you implement concurrency and asynchronous programming in Python? Many people with little experience are at a loss here, so this article walks through the common approaches with timed examples; hopefully it helps you solve the problem.
1. Preparation
The packages required by all of the test code are listed below:
```python
#! python3
# coding: utf-8
import socket
from concurrent import futures
from selectors import DefaultSelector, EVENT_WRITE, EVENT_READ
import asyncio
import aiohttp
import time
from time import ctime
```
To compare the different implementations, the test scenario is the kind of workload a crawler faces: issue a series of HTTP requests to a site and time how long they take, using the elapsed time to judge each implementation. Concretely, each request opens a socket to the Sina home page and reads back the page source. First, a decorator to measure a function's execution time:
```python
def tsfunc(func):
    def wrappedFunc(*args, **kargs):
        # time.clock() was removed in Python 3.8; perf_counter() is its replacement
        start = time.perf_counter()
        action = func(*args, **kargs)
        time_delta = time.perf_counter() - start
        print('[{0}] {1}() called, time delta: {2}'.format(ctime(), func.__name__, time_delta))
        return action
    return wrappedFunc
```
The output format is: current time, the function that was called, and the function's execution time.
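As a quick check that the decorator behaves as described, here is a minimal usage sketch (the `demo` function and the half-second sleep are made up for illustration; the timestamp and exact delta will differ on your machine):

```python
@tsfunc
def demo():
    # stand-in for real work; sleeps for roughly half a second
    time.sleep(0.5)

demo()
# prints something like:
# [<current time>] demo() called, time delta: 0.50...
```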
2. Blocking/non-blocking and synchronous/asynchronous
These two pairs of concepts are easy to mix up. By definition:
Blocking: during socket communication, a thread issues a request; if no result has come back yet, the thread goes to sleep and is suspended, unable to do anything else, until a result arrives or the request times out (if a timeout is set).
Non-blocking: similar, except that while waiting for the result the thread is not suspended and can do other work; if the result is not immediately available, the call does not block or suspend the current thread but returns right away.
Synchronous: synchronous looks like blocking, but they are not the same concept. Synchronous describes the logic of completing work: finish one thing, then the next, and so on.
Asynchronous: asynchronous is similar to non-blocking and is the opposite of synchronous. When an asynchronous call is issued, the caller does not get the result immediately; the component that actually performs the work notifies the caller afterwards via state, notifications or callbacks. Colloquially, asynchrony means "I'll tell you later", as the small sketch below illustrates.
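To make the "tell you later" idea concrete, here is a small self-contained sketch (not from the original article) in which the caller hands over a callback, regains control immediately, and is notified once the work finishes:

```python
import threading
import time

def work_async(on_done):
    # simulate a slow I/O operation in another thread, then notify via callback
    def run():
        time.sleep(0.1)
        on_done('result')
    threading.Thread(target=run).start()   # returns immediately

finished = threading.Event()

def on_done(result):
    print('callback received:', result)
    finished.set()

work_async(on_done)
print('caller keeps doing other things while the work is in progress')
finished.wait()   # block here only to keep this small demo deterministic
```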
1) Blocking
Back to the code. First, the blocking version of the request function:
```python
def blocking_way():
    sock = socket.socket()
    sock.connect(('www.sina.com', 80))
    request = 'GET / HTTP/1.0\r\nHOST:www.sina.com\r\n\r\n'
    sock.send(request.encode('ascii'))
    response = b''
    chunk = sock.recv(4096)
    while chunk:
        response += chunk
        chunk = sock.recv(4096)
    return response
```
Test it with no concurrency (a single thread), with multiple processes, and with multiple threads:
```python
# blocking, no concurrency
@tsfunc
def sync_way():
    res = []
    for i in range(10):
        res.append(blocking_way())
    return len(res)

# blocking, multiple processes
@tsfunc
def process_way():
    worker = 10
    with futures.ProcessPoolExecutor(worker) as executor:
        futs = {executor.submit(blocking_way) for i in range(10)}
    return len([fut.result() for fut in futs])

# blocking, multiple threads
@tsfunc
def thread_way():
    worker = 10
    with futures.ThreadPoolExecutor(worker) as executor:
        futs = {executor.submit(blocking_way) for i in range(10)}
    return len([fut.result() for fut in futs])
```
Results:
```
[Wed Dec 13 16:52:25 2017] sync_way() called, time delta: 0.06371647809425328
[Wed Dec 13 16:52:28 2017] process_way() called, time delta: 2.31437644946734
[Wed Dec 13 16:52:28 2017] thread_way() called, time delta: 0.010172946070299727
```
Compared with the non-concurrent version, starting 10 processes to complete the 10 requests took by far the longest: processes really do carry a large system overhead. Multithreading fares much better; 10 concurrent threads are roughly 6x faster than the sequential requests.
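As a small aside (not in the original article), the same thread-pool test can also consume results as each request finishes using `futures.as_completed`, which is handy when you want to start processing a response as soon as it arrives rather than after all futures have been collected:

```python
@tsfunc
def thread_as_completed_way():
    # same 10 blocking requests, but handle each one as soon as it completes
    with futures.ThreadPoolExecutor(10) as executor:
        futs = [executor.submit(blocking_way) for _ in range(10)]
        sizes = [len(fut.result()) for fut in futures.as_completed(futs)]
    return len(sizes)
```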
2) Non-blocking
The non-blocking request code differs from the blocking version in that waiting for a result no longer suspends the thread; the call returns immediately. To make sure the response is still read correctly, the crudest approach is to keep retrying in a loop until reading completes and then break out of it:
```python
def nonblocking_way():
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setblocking(False)
    try:
        sock.connect(('www.sina.com', 80))
    except BlockingIOError:
        pass
    request = 'GET / HTTP/1.0\r\nHost: www.sina.com\r\n\r\n'
    data = request.encode('ascii')
    while True:
        try:
            sock.send(data)
            break
        except OSError:
            pass
    response = b''
    while True:
        try:
            chunk = sock.recv(4096)
            while chunk:
                response += chunk
                chunk = sock.recv(4096)
            break
        except OSError:
            pass
    return response
```
Testing the single-threaded, non-blocking version:
```python
@tsfunc
def async_way():
    res = []
    for i in range(10):
        res.append(nonblocking_way())
    return len(res)
```
Results, compared against the single-threaded blocking version:
```
[Wed Dec 13 17:18:30 2017] sync_way() called, time delta: 0.07342884475822574
[Wed Dec 13 17:18:30 2017] async_way() called, time delta: 0.06509009095694886
```
The non-blocking version helps a little, but not much. The reason is clear: although the thread is no longer suspended while waiting, the time is now wasted busy-looping over send/recv until the data is ready. If most of the time is still spent spinning, the power of asynchronous programming never shows. The fix is the event-driven model discussed below.
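To see what "event driven" means here, below is a stripped-down, single-request sketch of my own (not part of the article's code, and it assumes www.sina.com is still reachable on port 80). Instead of spinning on recv, the socket is registered with a selector and the OS reports when it is actually writable or readable:

```python
def event_driven_single_request():
    sel = DefaultSelector()
    sock = socket.socket()
    sock.setblocking(False)
    try:
        sock.connect(('www.sina.com', 80))
    except BlockingIOError:
        pass
    sel.register(sock, EVENT_WRITE)
    sel.select()                      # waits, but without burning CPU in a loop
    sel.unregister(sock)
    sock.send(b'GET / HTTP/1.0\r\nHost: www.sina.com\r\n\r\n')
    response = b''
    sel.register(sock, EVENT_READ)
    while True:
        sel.select()                  # wake up only when there is data (or EOF)
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk
    sel.unregister(sock)
    sock.close()
    return response
```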
3. Callbacks, generators and coroutines
a. Callbacks
```python
# shared state used by the event loop (also part of the full listing at the end)
selector = DefaultSelector()
stopped = False
urls_todo = ['/', '/1', '/2', '/3', '/4', '/5', '/6', '/7', '/8', '/9']


class Crawler():
    def __init__(self, url):
        self.url = url
        self.sock = None
        self.response = b''

    def fetch(self):
        self.sock = socket.socket()
        self.sock.setblocking(False)
        try:
            self.sock.connect(('www.sina.com', 80))
        except BlockingIOError:
            pass
        selector.register(self.sock.fileno(), EVENT_WRITE, self.connected)

    def connected(self, key, mask):
        selector.unregister(key.fd)
        get = 'GET {0} HTTP/1.0\r\nHost:www.sina.com\r\n\r\n'.format(self.url)
        self.sock.send(get.encode('ascii'))
        selector.register(key.fd, EVENT_READ, self.read_response)

    def read_response(self, key, mask):
        global stopped
        while True:
            try:
                chunk = self.sock.recv(4096)
                if chunk:
                    self.response += chunk
                else:
                    selector.unregister(key.fd)
                    urls_todo.remove(self.url)
                    if not urls_todo:
                        stopped = True
                    break
            except:
                pass


def loop():
    while not stopped:
        events = selector.select()
        for event_key, event_mask in events:
            callback = event_key.data
            callback(event_key, event_mask)


@tsfunc
def callback_way():
    for url in urls_todo:
        crawler = Crawler(url)
        crawler.fetch()
    loop()
```
This is asynchronous programming implemented with traditional callbacks. The result:
```
[Tue Mar 27 17:52:49 2018] callback_way() called, time delta: 0.054735804048789374
```
b. Generators
```python
# Future1, connect() and read_all() are defined in the full listing at the end
class Crawler2:
    def __init__(self, url):
        self.url = url
        self.response = b''

    def fetch(self):
        global stopped
        sock = socket.socket()
        yield from connect(sock, ('www.sina.com', 80))
        get = 'GET {0} HTTP/1.0\r\nHost: www.sina.com\r\n\r\n'.format(self.url)
        sock.send(get.encode('ascii'))
        self.response = yield from read_all(sock)
        urls_todo.remove(self.url)
        if not urls_todo:
            stopped = True


class Task:
    def __init__(self, coro):
        self.coro = coro
        f = Future1()
        f.set_result(None)
        self.step(f)

    def step(self, future):
        try:
            # send() resumes the coroutine (i.e. fetch) and runs it until the next yield;
            # next_future is the Future object yielded back by the coroutine
            next_future = self.coro.send(future.result)
        except StopIteration:
            return
        next_future.add_done_callback(self.step)


def loop1():
    while not stopped:
        events = selector.select()
        for event_key, event_mask in events:
            callback = event_key.data
            callback()


@tsfunc
def generate_way():
    for url in urls_todo:
        crawler = Crawler2(url)
        Task(crawler.fetch())
    loop1()
```
The result:
```
[Tue Mar 27 17:54:27 2018] generate_way() called, time delta: 0.2914336347673473
```
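To make the Task/Future handshake easier to follow, here is a toy illustration of my own (it assumes the `Future1` and `Task` classes from the full listing at the end of this article): the coroutine suspends at `yield from` on a Future, and `Task.step` resumes it once someone calls `set_result`:

```python
waiting = []

def toy_coro():
    f = Future1()
    waiting.append(f)        # expose the pending Future so the "event" below can complete it
    result = yield from f    # suspend here until f.set_result(...) is called
    print('coroutine resumed with:', result)

Task(toy_coro())             # Task.step runs toy_coro up to the first yield
waiting[0].set_result(42)    # simulate an I/O event; prints "coroutine resumed with: 42"
```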
c. Coroutines (asyncio)
```python
host = 'http://www.sina.com'
loop = asyncio.get_event_loop()


async def fetch(url):
    async with aiohttp.ClientSession(loop=loop) as session:
        async with session.get(url) as response:
            response = await response.read()
    return response


@tsfunc
def asyncio_way():
    tasks = [fetch(host + url) for url in urls_todo]
    loop.run_until_complete(asyncio.gather(*tasks))
    return len(tasks)
```
The result:
```
[Tue Mar 27 17:56:17 2018] asyncio_way() called, time delta: 0.43688060698484166
```
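For reference, on Python 3.7+ the same measurement is usually written with `asyncio.run`, and recent aiohttp versions no longer accept a `loop` argument. A hedged sketch along those lines (reusing `host`, `urls_todo` and `tsfunc` from the article; not part of the original code):

```python
async def fetch_all(urls):
    # one shared session for all requests is the usual aiohttp pattern
    async with aiohttp.ClientSession() as session:
        async def fetch_one(url):
            async with session.get(url) as resp:
                return await resp.read()
        return await asyncio.gather(*(fetch_one(host + u) for u in urls))

@tsfunc
def asyncio_run_way():
    return len(asyncio.run(fetch_all(urls_todo)))
```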
That completes the tests of the concurrency and asynchronous programming examples. The full code is listed below for readers to run themselves; with a heavier workload the differences should be far more pronounced.
```python
#! python3
# coding: utf-8
import socket
from concurrent import futures
from selectors import DefaultSelector, EVENT_WRITE, EVENT_READ
import asyncio
import aiohttp
import time
from time import ctime


def tsfunc(func):
    def wrappedFunc(*args, **kargs):
        # time.clock() was removed in Python 3.8; perf_counter() is its replacement
        start = time.perf_counter()
        action = func(*args, **kargs)
        time_delta = time.perf_counter() - start
        print('[{0}] {1}() called, time delta: {2}'.format(ctime(), func.__name__, time_delta))
        return action
    return wrappedFunc


def blocking_way():
    sock = socket.socket()
    sock.connect(('www.sina.com', 80))
    request = 'GET / HTTP/1.0\r\nHOST:www.sina.com\r\n\r\n'
    sock.send(request.encode('ascii'))
    response = b''
    chunk = sock.recv(4096)
    while chunk:
        response += chunk
        chunk = sock.recv(4096)
    return response


def nonblocking_way():
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setblocking(False)
    try:
        sock.connect(('www.sina.com', 80))
    except BlockingIOError:
        pass
    request = 'GET / HTTP/1.0\r\nHost: www.sina.com\r\n\r\n'
    data = request.encode('ascii')
    while True:
        try:
            sock.send(data)
            break
        except OSError:
            pass
    response = b''
    while True:
        try:
            chunk = sock.recv(4096)
            while chunk:
                response += chunk
                chunk = sock.recv(4096)
            break
        except OSError:
            pass
    return response


selector = DefaultSelector()
stopped = False
urls_todo = ['/', '/1', '/2', '/3', '/4', '/5', '/6', '/7', '/8', '/9']


class Crawler():
    def __init__(self, url):
        self.url = url
        self.sock = None
        self.response = b''

    def fetch(self):
        self.sock = socket.socket()
        self.sock.setblocking(False)
        try:
            self.sock.connect(('www.sina.com', 80))
        except BlockingIOError:
            pass
        selector.register(self.sock.fileno(), EVENT_WRITE, self.connected)

    def connected(self, key, mask):
        selector.unregister(key.fd)
        get = 'GET {0} HTTP/1.0\r\nHost:www.sina.com\r\n\r\n'.format(self.url)
        self.sock.send(get.encode('ascii'))
        selector.register(key.fd, EVENT_READ, self.read_response)

    def read_response(self, key, mask):
        global stopped
        while True:
            try:
                chunk = self.sock.recv(4096)
                if chunk:
                    self.response += chunk
                else:
                    selector.unregister(key.fd)
                    urls_todo.remove(self.url)
                    if not urls_todo:
                        stopped = True
                    break
            except:
                pass


def loop():
    while not stopped:
        events = selector.select()
        for event_key, event_mask in events:
            callback = event_key.data
            callback(event_key, event_mask)


# generator-based coroutines
class Future:
    def __init__(self):
        self.result = None
        self._callbacks = []

    def add_done_callback(self, fn):
        self._callbacks.append(fn)

    def set_result(self, result):
        self.result = result
        for fn in self._callbacks:
            fn(self)


class Crawler1():
    def __init__(self, url):
        self.url = url
        self.response = b''

    def fetch(self):
        sock = socket.socket()
        sock.setblocking(False)
        try:
            sock.connect(('www.sina.com', 80))
        except BlockingIOError:
            pass
        f = Future()

        def on_connected():
            f.set_result(None)

        selector.register(sock.fileno(), EVENT_WRITE, on_connected)
        yield f
        selector.unregister(sock.fileno())
        get = 'GET {0} HTTP/1.0\r\nHost: www.sina.com\r\n\r\n'.format(self.url)
        sock.send(get.encode('ascii'))
        global stopped
        while True:
            f = Future()

            def on_readable():
                f.set_result(sock.recv(4096))

            selector.register(sock.fileno(), EVENT_READ, on_readable)
            chunk = yield f
            selector.unregister(sock.fileno())
            if chunk:
                self.response += chunk
            else:
                urls_todo.remove(self.url)
                if not urls_todo:
                    stopped = True
                break


# generator coroutines improved with yield from
class Future1:
    def __init__(self):
        self.result = None
        self._callbacks = []

    def add_done_callback(self, fn):
        self._callbacks.append(fn)

    def set_result(self, result):
        self.result = result
        for fn in self._callbacks:
            fn(self)

    def __iter__(self):
        yield self
        return self.result


def connect(sock, address):
    f = Future1()
    sock.setblocking(False)
    try:
        sock.connect(address)
    except BlockingIOError:
        pass

    def on_connected():
        f.set_result(None)

    selector.register(sock.fileno(), EVENT_WRITE, on_connected)
    yield from f
    selector.unregister(sock.fileno())


def read(sock):
    f = Future1()

    def on_readable():
        f.set_result(sock.recv(4096))

    selector.register(sock.fileno(), EVENT_READ, on_readable)
    chunk = yield from f
    selector.unregister(sock.fileno())
    return chunk


def read_all(sock):
    response = []
    chunk = yield from read(sock)
    while chunk:
        response.append(chunk)
        chunk = yield from read(sock)
    return b''.join(response)


class Crawler2:
    def __init__(self, url):
        self.url = url
        self.response = b''

    def fetch(self):
        global stopped
        sock = socket.socket()
        yield from connect(sock, ('www.sina.com', 80))
        get = 'GET {0} HTTP/1.0\r\nHost: www.sina.com\r\n\r\n'.format(self.url)
        sock.send(get.encode('ascii'))
        self.response = yield from read_all(sock)
        urls_todo.remove(self.url)
        if not urls_todo:
            stopped = True


class Task:
    def __init__(self, coro):
        self.coro = coro
        f = Future1()
        f.set_result(None)
        self.step(f)

    def step(self, future):
        try:
            # send() resumes the coroutine (i.e. fetch) and runs it until the next yield;
            # next_future is the Future object yielded back by the coroutine
            next_future = self.coro.send(future.result)
        except StopIteration:
            return
        next_future.add_done_callback(self.step)


def loop1():
    while not stopped:
        events = selector.select()
        for event_key, event_mask in events:
            callback = event_key.data
            callback()


# asyncio coroutines
host = 'http://www.sina.com'
# named event_loop here so it does not shadow the callback loop() function above
event_loop = asyncio.get_event_loop()


async def fetch(url):
    async with aiohttp.ClientSession(loop=event_loop) as session:
        async with session.get(url) as response:
            response = await response.read()
    return response


@tsfunc
def asyncio_way():
    tasks = [fetch(host + url) for url in urls_todo]
    event_loop.run_until_complete(asyncio.gather(*tasks))
    return len(tasks)


@tsfunc
def sync_way():
    res = []
    for i in range(10):
        res.append(blocking_way())
    return len(res)


@tsfunc
def process_way():
    worker = 10
    with futures.ProcessPoolExecutor(worker) as executor:
        futs = {executor.submit(blocking_way) for i in range(10)}
    return len([fut.result() for fut in futs])


@tsfunc
def thread_way():
    worker = 10
    with futures.ThreadPoolExecutor(worker) as executor:
        futs = {executor.submit(blocking_way) for i in range(10)}
    return len([fut.result() for fut in futs])


@tsfunc
def async_way():
    res = []
    for i in range(10):
        res.append(nonblocking_way())
    return len(res)


@tsfunc
def callback_way():
    for url in urls_todo:
        crawler = Crawler(url)
        crawler.fetch()
    loop()


@tsfunc
def generate_way():
    for url in urls_todo:
        crawler = Crawler2(url)
        Task(crawler.fetch())
    loop1()


if __name__ == '__main__':
    # sync_way()
    # process_way()
    # thread_way()
    # async_way()
    # callback_way()
    # generate_way()
    asyncio_way()
```
Having read the above, do you now have a handle on how to implement concurrency and asynchronous programming in Python? If you would like to pick up more skills or learn more about related topics, follow the Yisu Cloud industry news channel. Thanks for reading!