How do you use Celery in Django to handle asynchronous tasks? Many developers new to the topic find it confusing, so this article walks through the setup step by step and shows how to run your first tasks.
What is Celery?
Celery is a task queue focused on real-time processing, and it also supports task scheduling. Celery is fast, simple, highly available, and flexible.
Celery needs a message transport to send and receive messages; this can be provided by Redis or RabbitMQ.
Getting Started
Let's start by installing the Celery package in your virtualenv.
Install Celery
<span class="nv">$ </span>pip <span class="nb">install </span>celery pip install celery
Install Redis
We will use Redis as the message broker, so let's install it.
Linux / Mac users
You can download the latest release from here:
$ wget http://download.redis.io/releases/redis-4.0.8.tar.gz
$ tar xzf redis-4.0.8.tar.gz
$ cd redis-4.0.8
$ make
Windows users
Windows users can get a Redis executable from here.
After installing, check that it is working correctly:
$ redis-cli ping
It should reply:
PONG
Also install the Python package for Redis:
$ pip install redis
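As a quick sanity check from Python, you can ping the server with the redis client. A minimal sketch, assuming Redis is running locally on the default port 6379:

>>> import redis
>>> r = redis.Redis(host="localhost", port=6379)
>>> r.ping()   # returns True if Redis is reachable
True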
First Steps with Django
Now that the packages are installed, let's wire Celery into the Django project.
settings.py

Add the following configuration to your settings.py:

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = "YOUR_TIMEZONE"
Make sure you replace YOUR_TIMEZONE with your own timezone (for example, "Asia/Shanghai"). You can get the list of timezones from here.
Create a celery.py file in the main Django project directory:
- src/
  - manage.py
  - celery_project/
    - __init__.py
    - settings.py
    - urls.py
    - celery.py

celery_project/celery.py
Add the following code to the celery.py module. This module defines the Celery application instance.
Make sure to replace <your project name> with the name of your Django project.
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', '<your project name>.settings')

app = Celery('<your project name>')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

celery_project/__init__.py
Then we need to import the Celery app defined in celery.py into the main project directory's __init__.py. This ensures the app is loaded when the Django project starts.
from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']
Creating Tasks
Now let's create some tasks.
Create a new tasks.py file in any of the apps you have registered in INSTALLED_APPS:
my_app/tasks.py

from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task(name="print_msg_with_name")
def print_message(name, *args, **kwargs):
    print("Celery is working!! {} have implemented it correctly.".format(name))

@shared_task(name="add_2_numbers")
def add(x, y):
    print("Add function has been called!! with params {}, {}".format(x, y))
    return x + y
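In a real project you would usually queue these tasks from normal Django code rather than the shell. A minimal sketch of a view doing that (the file and view name below are hypothetical, not part of the tutorial's project); calling .delay() pushes the task onto the broker and returns immediately, so the request is not blocked:

my_app/views.py (hypothetical)

from django.http import JsonResponse

from .tasks import add, print_message


def enqueue_demo(request):
    # .delay() sends the task to the broker and returns an AsyncResult
    # right away; the worker process does the actual work.
    print_message.delay("Jai Singhal")
    result = add.delay(10, 20)
    return JsonResponse({"task_id": result.id, "status": "queued"})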
Starting the Worker
Open a NEW terminal, change into your main project directory (the one containing manage.py), make sure your virtualenv is activated (if you created one), and run the following command to start a Celery worker instance.
Replace <your project name> with your project name:
$ celery -A <your project name> worker -l info
Output:
 -------------- celery@root v4.1.0 (latentcall)
---- **** -----
--- * ***  * -- Linux-4.13.0-32-generic-x86_64-with-Ubuntu-17.10-artful 2018-02-17 08:09:37
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         celery_project:0x7f9039886400
- ** ---------- .> transport:   redis://localhost:6379//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . add_2_numbers
  . celery_project.celery.debug_task
  . print_msg_with_name

[2018-02-17 08:09:37,877: INFO/MainProcess] Connected to redis://localhost:6379//
[2018-02-17 08:09:37,987: INFO/MainProcess] mingle: searching for neighbors
[2018-02-17 08:09:39,084: INFO/MainProcess] mingle: all alone
[2018-02-17 08:09:39,121: WARNING/MainProcess] /home/jai/Desktop/demo/lib/python3.6/site-packages/celery/fixups/django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2018-02-17 08:09:39,121: INFO/MainProcess] celery@root ready.
Note: check the [tasks] section above; it should list the names of the tasks you created in the tasks.py module.
For more information and more detailed logs, you can also run the worker instance in DEBUG mode:
$ celery -A <your project name> worker --loglevel=DEBUG
Note: do not close this terminal; it should stay open!
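If you are on Windows, be aware that Celery 4 dropped official Windows support; a commonly used workaround (an assumption worth verifying in your own environment) is to start the worker with the solo pool:

$ celery -A <your project name> worker -l info -P solo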
Testing the Tasks
Now let's run the tasks from the Django shell. Open the Django shell:
$ python3 manage.py shell
Call the functions with the delay method:
>>> from my_app.tasks import print_message, add
>>> print_message.delay("Jai Singhal")
<AsyncResult: fe4f9787-9ee4-46da-856c-453d36556760>
>>> add.delay(10, 20)
<AsyncResult: ca5d2c50-87bc-4e87-92ad-99d6d9704c30>
When you check the second terminal, where your Celery worker instance is running, you will see output like the following, showing that the tasks were received and completed successfully:
[2018-02-17 08:12:14,375: INFO/MainProcess] Received task: my_app.tasks.print_message[fe4f9787-9ee4-46da-856c-453d36556760]
[2018-02-17 08:12:14,377: WARNING/ForkPoolWorker-4] Celery is working!! Jai Singhal have implemented it correctly.
[2018-02-17 08:12:14,382: INFO/ForkPoolWorker-4] Task my_app.tasks.print_message[fe4f9787-9ee4-46da-856c-453d36556760] succeeded in 0.004476275000342866s: None
[2018-02-17 08:12:28,344: INFO/MainProcess] Received task: my_app.tasks.add[ca5d2c50-87bc-4e87-92ad-99d6d9704c30]
[2018-02-17 08:12:28,349: WARNING/ForkPoolWorker-3] Add function has been called!! with params 10, 20
[2018-02-17 08:12:28,358: INFO/ForkPoolWorker-3] Task my_app.tasks.add[ca5d2c50-87bc-4e87-92ad-99d6d9704c30] succeeded in 0.010077004999857309s: 30
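Because CELERY_RESULT_BACKEND is configured above, you can also read a task's return value back in the Django shell. A minimal sketch (the timeout value is an arbitrary choice):

>>> result = add.delay(10, 20)
>>> result.get(timeout=10)   # blocks until the worker stores the result
30
>>> result.ready()           # True once the task has finished
True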
Having read the above, do you now know how to use Celery in Django to handle asynchronous tasks? If you want to learn more skills or read more related content, feel free to follow the 億速云 industry news channel. Thanks for reading!