I have a Celery worker running tasks.py, shown below:
```python
from celery import Celery
from kombu import Queue

app = Celery('tasks', backend='redis://', broker='pyamqp://guest:guest@localhost/')
app.conf.task_default_queue = 'default'
app.conf.task_queues = (
    Queue('queue1', routing_key='tasks.add'),
    Queue('queueA', routing_key='tasks.task_1'),
    Queue('queueB', routing_key='tasks.task_2'),
    Queue('queueC', routing_key='tasks.task_3'),
    Queue('queueD', routing_key='tasks.task_4'),
)

@app.task
def add(x, y):
    print("add(" + str(x) + "+" + str(y) + ")")
    return x + y
```
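One thing worth noting about the setup above: because `task_queues` is declared, the worker consumes only the five listed queues, and each task is delivered by matching its routing key. A sketch of a `task_routes` mapping (this is an assumption on my part, not something in the original code) that would route every `tasks.add` message to `queue1` by task name, so the caller never has to pass a queue explicitly:

```python
# Hypothetical addition to tasks.py (Celery 4.x style config):
# route by task name so that any tasks.add message, however it is
# published, lands on a queue the worker actually consumes.
app.conf.task_routes = {
    'tasks.add': {'queue': 'queue1'},
}
```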
And a tasks_canvas.py that builds a task chain, shown below:
```python
from celery import signature
from celery import chain
from tasks import *

signature('tasks.add', args=(2, 2))
result = chain(add.s(2, 2), add.s(4), add.s(8)).apply_async(queue='queue1')
print(result.status)
print(result.get())
```
However, when tasks_canvas.py runs, result.status is always PENDING and the worker never runs the whole chain. This is the output of running tasks_canvas.py:
```
C:\Users\user_\Desktop\Aida>tasks_canvas.py
PENDING
```
And this is the worker's output:
```
C:\Users\user_\Desktop\Aida>celery -A tasks worker -l info -P eventlet

 -------------- celery@User-RazerBlade v4.2.0 (windowlicker)
---- **** -----
--- * ***  * -- Windows-10-10.0.17134-SP0 2018-07-16 12:04:20
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x41d5390
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     redis://
- *** --- * --- .> concurrency: 4 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> queue1           exchange=(direct) key=tasks.add
                .> queueA           exchange=(direct) key=tasks.task_1
                .> queueB           exchange=(direct) key=tasks.task_2
                .> queueC           exchange=(direct) key=tasks.task_3
                .> queueD           exchange=(direct) key=tasks.task_4

[tasks]
  . tasks.add
  . tasks.task_1
  . tasks.task_2
  . tasks.task_3
  . tasks.task_4
  . tasks.task_5

[2018-07-16 12:04:20,334: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2018-07-16 12:04:20,351: INFO/MainProcess] mingle: searching for neighbors
[2018-07-16 12:04:21,394: INFO/MainProcess] mingle: all alone
[2018-07-16 12:04:21,443: INFO/MainProcess] celery@User-RazerBlade ready.
[2018-07-16 12:04:21,448: INFO/MainProcess] pidbox: Connected to amqp://guest:**@127.0.0.1:5672//.
[2018-07-16 12:04:23,101: INFO/MainProcess] Received task: tasks.add[e6306b5b-211f-4015-b57e-05e2d0ac2df2]
[2018-07-16 12:04:23,102: WARNING/MainProcess] add(2+2)
[2018-07-16 12:04:23,128: INFO/MainProcess] Task tasks.add[e6306b5b-211f-4015-b57e-05e2d0ac2df2] succeeded in 0.031000000000858563s: 4
```
I would like to know why this happens, since I am new to Celery, and how I can get the worker to run the whole chain.
Best answer
I solved the problem. There is a guide here on why tasks always stay in the PENDING state, but it does not cover every case. In my case, the problem was task routing: once I used the default queue, all the tasks in the chain ran immediately.
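As a sketch of the fix described above (these are assumptions about the original setup, not verified code): either drop the `queue='queue1'` argument so everything flows through the default queue, as the answer did, or pin each signature in the chain to a consumed queue with `.set(queue=...)`. The second option is motivated by the worker log in the question, which shows only the first `add` running on `queue1` while the rest of the chain never arrives, suggesting the queue passed to `apply_async()` did not carry over to the later links.

```python
from celery import chain
from tasks import add

# Option 1: rely on the default queue (what the answer actually did).
result = chain(add.s(2, 2), add.s(4), add.s(8)).apply_async()

# Option 2 (alternative sketch): route every link in the chain
# explicitly so no task is published to a queue the worker ignores.
result = chain(
    add.s(2, 2).set(queue='queue1'),
    add.s(4).set(queue='queue1'),
    add.s(8).set(queue='queue1'),
).apply_async()

print(result.get())  # should print 14 once the worker runs all three tasks
```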
This question and answer, "python-3.x - Why does the celery worker leave tasks in the PENDING state for so long?", were found on Stack Overflow: https://stackoverflow.com/questions/51365984/