python - Airflow randomly sends SIGTERMs to tasks

Tags: python apache celery airflow

I am having a problem with Airflow 1.10.1. Some tasks in my DAGs receive a SIGTERM from helpers.py, which as far as I understand is meant to shut down the worker and kill all child processes. However, I only see this for roughly 2-3 out of 10 tasks in a DAG, and on re-running the DAG, the tasks that receive the signal change. Is there any particular condition that triggers these SIGTERM signals?
Log from a task that received the SIGTERM:

[2019-12-10 11:13:44,530] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 [2019-12-10 11:13:44,520] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=3600
[2019-12-10 11:13:45,489] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 [2019-12-10 11:13:45,488] {__init__.py:51} INFO - Using executor CeleryExecutor
[2019-12-10 11:13:45,934] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 [2019-12-10 11:13:45,933] {models.py:271} INFO - Filling up the DagBag from /home/centos/airflow/dags/61b6c300e82643b0f294df6f.py
[2019-12-10 11:13:46,580] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 Connected to MongoDB...
[2019-12-10 11:13:47,510] {bash_operator.py:74} INFO - Tmp dir root location:
/tmp
[2019-12-10 11:13:47,510] {bash_operator.py:87} INFO - Temporary script location: /tmp/airflowtmpal71kawr/BS_PMU2rjty_k9l
[2019-12-10 11:13:47,511] {bash_operator.py:97} INFO - Running command:
[2019-12-10 11:13:47,542] {bash_operator.py:106} INFO - Output:
[2019-12-10 11:13:47,542] {bash_operator.py:114} INFO - Command exited with return code 0
[2019-12-10 11:13:57,559] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 2019-12-10 11:13:57,556 - root - INFO - Putting xcom with return value:
[2019-12-10 11:13:57,631] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 2019-12-10 11:13:57,625 - root - INFO - WorkflowID: 61b6c300e82643b0f294df6f, RunID: 456c5bfb16556a3adc3b251a, TaskID: BS_PMU2
[2019-12-10 11:13:57,652] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 2019-12-10 11:13:57,643 - root - ERROR - Invalid key/value. Will skip setting xcom.
[2019-12-10 11:13:57,652] {base_task_runner.py:101} INFO - Job 25404: Subtask BS_PMU2 2019-12-10 11:13:57,644 - root - INFO - Done Execute
[2019-12-10 11:13:58,663] {helpers.py:240} INFO - Sending Signals.SIGTERM to GPID 9696
[2019-12-10 11:13:58,674] {helpers.py:230} INFO - Process psutil.Process(pid=9696 (terminated)) (9696) terminated with exit code 15

Best answer

If you want to stay on the same Airflow version, you can try increasing the value of AIRFLOW__CORE__KILLED_TASK_CLEANUP_TIME in your Airflow configuration.
Upgrading to Airflow >= 2.x may also help.
See the configuration reference for more details: https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#killed-task-cleanup-time
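As a sketch, the setting can be raised either in airflow.cfg or via the equivalent environment variable (the value of 120 seconds below is just an illustrative choice, not a recommended figure; this gives the task runner longer to clean up before escalating the kill):

```shell
# Option 1: environment variable, read by all Airflow components on startup.
# Format is AIRFLOW__<SECTION>__<KEY>; this maps to [core] killed_task_cleanup_time.
export AIRFLOW__CORE__KILLED_TASK_CLEANUP_TIME=120

# Option 2: the same setting in airflow.cfg:
#   [core]
#   killed_task_cleanup_time = 120

# Verify the value is exported for child processes (e.g. the scheduler/worker):
echo "killed_task_cleanup_time = $AIRFLOW__CORE__KILLED_TASK_CLEANUP_TIME"
```

Note that environment variables take precedence over airflow.cfg, so the export must be visible to the scheduler and worker processes (e.g. set it in their service unit or shell profile).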

Regarding "python - Airflow randomly sends SIGTERMs to tasks", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/59298566/
