python - Django Celery Beat with database scheduler not running tasks

Tags: python celery django-celery celerybeat

I have been searching extensively, and everything I have come across so far suggests my configuration is correct... I can see my tasks being registered in my celery worker container. I do not see them registered in my beat container (but should I? I assume I should only see the tasks I created there), and the tasks never run.
celery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'itapp.settings')

app = Celery("itapp")
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

if __name__ == '__main__':
    app.start()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
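
For context, the worker log further down registers monitoring.tasks.monitoring_devices, which autodiscover_tasks() picks up from a tasks.py module inside the monitoring app. A minimal sketch of what such a module typically looks like (the task body here is a hypothetical placeholder, not the question's actual code):

# monitoring/tasks.py - sketch only; body is a placeholder
from celery import shared_task

@shared_task
def monitoring_devices():
    # Poll the devices and persist the results.
    pass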
__init__.py
from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
settings.py
LANGUAGE_CODE = 'en-gb'
TIME_ZONE = 'Europe/London'
USE_TZ = True
USE_I18N = True
USE_L10N = True

# CELERY SETTINGS
CELERY_TIMEZONE = 'Europe/London'
ENABLE_UTC = True
CELERY_BROKER_URL = 'amqp://rabbitmq:5672'
CELERY_RESULT_BACKEND = 'redis://redis:6379'
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
CELERY_TASK_TIME_LIMIT = 540
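
Since config_from_object(..., namespace='CELERY') only reads settings carrying the CELERY_ prefix (an unprefixed ENABLE_UTC, for example, would simply be ignored), a quick way to confirm the scheduler and timezone settings above were actually picked up is to inspect the app configuration from inside the container. A minimal sketch, assuming the itapp layout shown above (check_conf.py is a hypothetical helper, not part of the question):

# check_conf.py - run inside the container with: python check_conf.py
from itapp.celery import app  # importing this sets DJANGO_SETTINGS_MODULE

print(app.conf.beat_scheduler)  # expect 'django_celery_beat.schedulers:DatabaseScheduler'
print(app.conf.timezone)        # expect 'Europe/London'
print(app.conf.enable_utc)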
docker-compose.yml
  celery:
    image: "app:latest"
    env_file: 
      - /Users/a/app/config/en_vars.txt
    volumes:
      - /Users/a/app/app:/app/app
    command: celery -A app worker -l INFO --concurrency 30
    depends_on:
      - rabbitmq
      - redis
  celery_beat:
    image: "app:latest"
    env_file: 
      - /Users/a/app/config/en_vars.txt
    volumes:
      - /Users/a/app/app:/app/app
    command: celery -A app beat -l DEBUG --pidfile= 
    depends_on:
      - rabbitmq
      - redis
Periodic schedule (screenshot from the Django admin)
Scheduled job (screenshot from the Django admin)
Celery worker log
 -------------- celery@a05cfcda833f v4.4.7 (cliffs)
--- ***** -----
-- ******* ---- Linux-5.4.39-linuxkit-x86_64-with-debian-9.7 2020-11-26 14:57:55
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         itapp:0x7fb8d9e80518
- ** ---------- .> transport:   amqp://guest:**@rabbitmq:5672//
- ** ---------- .> results:     redis://redis:6379/
- *** --- * --- .> concurrency: 30 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . monitoring.tasks.monitoring_devices

[2020-11-26 14:58:09,160: INFO/MainProcess] Connected to amqp://guest:**@rabbitmq:5672//
[2020-11-26 14:58:09,201: INFO/MainProcess] mingle: searching for neighbors
[2020-11-26 14:58:10,266: INFO/MainProcess] mingle: all alone
[2020-11-26 14:58:10,333: INFO/MainProcess] celery@a05cfcda833f ready.
[2020-11-26 14:58:10,417: INFO/MainProcess] Events of group {task} enabled by remote.
Beat log
celery beat v4.4.7 (cliffs) is starting.
__    -    ... __   -        _
LocalTime -> 2020-11-26 14:57:55
Configuration ->
    . broker -> amqp://guest:**@rabbitmq:5672//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> django_celery_beat.schedulers.DatabaseScheduler

    . logfile -> [stderr]@%DEBUG
    . maxinterval -> 5.00 seconds (5s)
[2020-11-26 14:57:55,321: DEBUG/MainProcess] Setting default socket timeout to 30
[2020-11-26 14:57:55,322: INFO/MainProcess] beat: Starting...
[2020-11-26 14:57:55,325: DEBUG/MainProcess] DatabaseScheduler: initial read
[2020-11-26 14:57:55,326: INFO/MainProcess] Writing entries...
[2020-11-26 14:57:55,327: DEBUG/MainProcess] DatabaseScheduler: Fetching database schedule
[2020-11-26 14:57:55,742: DEBUG/MainProcess] Current schedule:
<ModelEntry: celery.backend_cleanup celery.backend_cleanup(*[], **{}) <crontab: 0 4
         * *
          * (m/h/d/dM/MY), Europe/London>
        >
<ModelEntry: Meraki Device Monitor monitoring.tasks.monitoring_devices(*[], **{}) <freq: 2.00 minutes>>
[2020-11-26 14:57:56,537: DEBUG/MainProcess] beat: Ticking with max interval->5.00 seconds
[2020-11-26 14:57:57,094: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
[2020-11-26 14:58:02,101: DEBUG/MainProcess] beat: Synchronizing schedule...
[2020-11-26 14:58:02,102: INFO/MainProcess] Writing entries...
[2020-11-26 14:58:02,365: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
[2020-11-26 14:58:07,663: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
[2020-11-26 14:58:12,897: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
Log from running a single container with the command "celery -A app worker -l INFO --beat" and debug logging enabled:
[2020-11-26 17:36:42,710: DEBUG/MainProcess] using channel_id: 1
[2020-11-26 17:36:42,725: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,791: INFO/MainProcess] mingle: sync with 1 nodes
[2020-11-26 17:36:43,794: DEBUG/MainProcess] mingle: processing reply from celery@a05cfcda833f
[2020-11-26 17:36:43,795: INFO/MainProcess] mingle: sync complete
[2020-11-26 17:36:43,796: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,797: DEBUG/MainProcess] | Consumer: Starting Tasks
[2020-11-26 17:36:43,831: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,832: DEBUG/MainProcess] | Consumer: Starting Control
[2020-11-26 17:36:43,833: DEBUG/MainProcess] using channel_id: 2
[2020-11-26 17:36:43,837: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,858: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,859: DEBUG/MainProcess] | Consumer: Starting Gossip
[2020-11-26 17:36:43,860: DEBUG/MainProcess] using channel_id: 3
[2020-11-26 17:36:43,863: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,877: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,878: DEBUG/MainProcess] | Consumer: Starting Heart
[2020-11-26 17:36:43,879: DEBUG/MainProcess] using channel_id: 1
[2020-11-26 17:36:43,883: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,888: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,889: DEBUG/MainProcess] | Consumer: Starting event loop
[2020-11-26 17:36:43,891: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2020-11-26 17:36:43,898: WARNING/MainProcess] /usr/local/lib/python3.7/site-packages/celery/fixups/django.py:206: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  leak, never use this setting in production environments!''')
[2020-11-26 17:36:43,899: INFO/MainProcess] celery@123d2a28ed9f ready.
[2020-11-26 17:36:43,900: DEBUG/MainProcess] basic.qos: prefetch_count->12
[2020-11-26 17:36:44,139: DEBUG/MainProcess] celery@a05cfcda833f joined the party
[2020-11-26 17:36:45,832: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:36:45,834: INFO/MainProcess] Events of group {task} enabled by remote.
[2020-11-26 17:36:47,337: INFO/Beat] beat: Starting...
[2020-11-26 17:36:47,345: DEBUG/Beat] DatabaseScheduler: initial read
[2020-11-26 17:36:47,346: INFO/Beat] Writing entries...
[2020-11-26 17:36:47,347: DEBUG/Beat] DatabaseScheduler: Fetching database schedule
[2020-11-26 17:36:47,782: DEBUG/Beat] Current schedule:
<ModelEntry: celery.backend_cleanup celery.backend_cleanup(*[], **{}) <crontab: 0 4
         * *
          * (m/h/d/dM/MY), Europe/London>
        >
<ModelEntry: Meraki Device Monitor monitoring.tasks.monitoring_devices(*[], **{}) <freq: 2.00 minutes>>
[2020-11-26 17:36:48,629: DEBUG/Beat] beat: Ticking with max interval->5.00 seconds
[2020-11-26 17:36:49,125: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:36:50,835: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:36:54,093: DEBUG/Beat] beat: Synchronizing schedule...
[2020-11-26 17:36:54,095: INFO/Beat] Writing entries...
[2020-11-26 17:36:54,336: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:36:55,802: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:36:59,592: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:00,834: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:03,859: DEBUG/MainProcess] heartbeat_tick : for connection 6d96e75dc3d240809cca3a0aa738e512
[2020-11-26 17:37:03,859: DEBUG/MainProcess] heartbeat_tick : Prev sent/recv: None/None, now - 28/103, monotonic - 33966.588017603, last_heartbeat_sent - 33966.587982103, heartbeat int. - 60 for connection 6d96e75dc3d240809cca3a0aa738e512
[2020-11-26 17:37:04,876: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:05,834: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:10,131: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:10,836: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:15,364: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:15,833: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:20,773: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:20,833: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:23,827: DEBUG/MainProcess] heartbeat_tick : for connection 6d96e75dc3d240809cca3a0aa738e512
[2020-11-26 17:37:23,827: DEBUG/MainProcess] heartbeat_tick : Prev sent/recv: 28/103, now - 28/176, monotonic - 33986.590378703, last_heartbeat_sent - 33966.587982103, heartbeat int. - 60 for connection 6d96e75dc3d240809cca3a0aa738e512
Data in the live database:
>>> PeriodicTask.objects.all()
<ExtendedQuerySet [<PeriodicTask: celery.backend_cleanup: 0 4 * * * (m/h/d/dM/MY) Europe/London>, <PeriodicTask: Device Monitor: every 2 minutes>]>
>>> PeriodicTasks.objects.all()
<ExtendedQuerySet [<PeriodicTasks: PeriodicTasks object (1)>]>
>>> vars(PeriodicTasks.objects.all()[0])
{'_state': <django.db.models.base.ModelState object at 0x7fc0e1f300f0>, 'ident': 1, 'last_update': datetime.datetime(2020, 12, 7, 9, 1, 59, tzinfo=<UTC>)}
>>>
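
One thing worth checking against that data, from python manage.py shell, is whether the entry is enabled and whether beat has ever recorded a run; django-celery-beat's PeriodicTasks.update_changed() also bumps the change marker so the DatabaseScheduler re-reads the schedule on its next tick. A small diagnostic sketch (not part of the original question):

# Run inside: python manage.py shell
from django_celery_beat.models import PeriodicTask, PeriodicTasks

entry = PeriodicTask.objects.get(task='monitoring.tasks.monitoring_devices')
print(entry.enabled, entry.last_run_at, entry.total_run_count)

# Force the DatabaseScheduler to reload the schedule on its next tick.
PeriodicTasks.update_changed()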

Best Answer

Your docker-compose.yml does not have any MySQL link. Is it possible that your application cannot reach the database server?
If you suspect this is the problem, you can list the docker containers with docker ps, then open a shell inside the relevant container with docker exec -ti <containerid> bash and manually try to connect to the required database using mysql, or check with nc, curl or wget whether the database host can even be resolved. If it cannot, link to it from docker-compose.yml, which inserts it into the hosts file automatically, or do it by hand, or simply update it in your configuration file.
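
To make that check concrete on the Django side, a small script run inside the beat container can confirm whether the database the DatabaseScheduler reads from is reachable at all. A minimal sketch (check_db.py is a hypothetical helper, not part of the original answer), assuming the itapp settings module shown in the question:

# check_db.py - run inside the container, e.g.:
#   docker exec -ti <containerid> python check_db.py
import os

import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'itapp.settings')
django.setup()

from django.db import connection
from django_celery_beat.models import PeriodicTask

try:
    with connection.cursor() as cursor:
        cursor.execute('SELECT 1')
        print('Database reachable:', cursor.fetchone())
    print('Periodic tasks visible from this container:',
          list(PeriodicTask.objects.values_list('name', 'enabled')))
except Exception as exc:
    print('Database NOT reachable from this container:', exc)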

About python - Django Celery Beat with database scheduler not running tasks: a similar question was found on Stack Overflow: https://stackoverflow.com/questions/65024610/
