python - Running multiple Athena queries in Airflow 2.0

Tags: python airflow boto3 amazon-athena

I am trying to create a DAG in which one task executes an Athena query using boto3. It works for a single query, but I run into problems when I try to run multiple Athena queries.

The problem can be broken down as follows:

  1. Going through this blog, one can see that Athena triggers a query with start_query_execution and that get_query_execution is used to fetch the status, the queryExecutionId, and other data about the query (Athena docs).
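
For reference, a minimal sketch of that single-query pattern; the region, database, and output location below are placeholders:

import time
import boto3

client = boto3.client('athena', region_name='us-east-1')

# start_query_execution triggers the query and returns its QueryExecutionId.
query_execution_id = client.start_query_execution(
    QueryString='SELECT 1;',
    QueryExecutionContext={'Database': 'sampledb'},
    ResultConfiguration={'OutputLocation': 's3://my-bucket/athena-results/'}
)['QueryExecutionId']

# get_query_execution reports the query status; poll until a terminal state.
while True:
    state = client.get_query_execution(
        QueryExecutionId=query_execution_id
    )['QueryExecution']['Status']['State']
    if state in ('SUCCEEDED', 'FAILED', 'CANCELLED'):
        break
    time.sleep(2)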

Following the above pattern, I have the following code:

import json
import time
import asyncio
import boto3
import logging
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator


def execute_query(client, query, database, output_location):
    response = client.start_query_execution(
        QueryString=query,
        QueryExecutionContext={
            'Database': database
        },
        ResultConfiguration={
            'OutputLocation': output_location
        }
    )

    return response['QueryExecutionId']


async def get_ids(client_athena, query, database, output_location):
    query_responses = []
    for i in range(5):
        query_responses.append(execute_query(client_athena, query, database, output_location))    

    res = await asyncio.gather(*query_responses, return_exceptions=True)

    return res

def run_athena_query(query, database, output_location, region_name, **context):
    BOTO_SESSION = boto3.Session(
        aws_access_key_id = 'YOUR_KEY',
        aws_secret_access_key = 'YOUR_ACCESS_KEY')
    client_athena = BOTO_SESSION.client('athena', region_name=region_name)

    loop = asyncio.get_event_loop()
    query_execution_ids = loop.run_until_complete(get_ids(client_athena, query, database, output_location))
    loop.close()

    repetitions = 900
    error_messages = []
    s3_uris = []

    while repetitions > 0 and len(query_execution_ids) > 0:
        repetitions = repetitions - 1
        
        query_response_list = client_athena.batch_get_query_execution(
            QueryExecutionIds=query_execution_ids)['QueryExecutions']
      
        for query_response in query_response_list:
            # batch_get_query_execution returns QueryExecution dicts directly,
            # without the extra 'QueryExecution' wrapper that
            # get_query_execution uses.
            if 'Status' in query_response and 'State' in query_response['Status']:
                state = query_response['Status']['State']

                if state in ['FAILED', 'CANCELLED']:
                    error_reason = query_response['Status'].get('StateChangeReason', '')
                    error_message = 'Final state of Athena job is {}, query_execution_id is {}. Error: {}'.format(
                            state, query_response['QueryExecutionId'], error_reason
                        )
                    error_messages.append(error_message)
                    query_execution_ids.remove(query_response['QueryExecutionId'])

                elif state == 'SUCCEEDED':
                    result_location = query_response['ResultConfiguration']['OutputLocation']
                    s3_uris.append(result_location)
                    query_execution_ids.remove(query_response['QueryExecutionId'])
                 
                    
        time.sleep(2)
    
    if error_messages:
        logging.error(error_messages)
    return s3_uris


DEFAULT_ARGS = {
    'owner': 'ubuntu',
    'depends_on_past': True,
    'start_date': datetime(2021, 6, 8),
    'retries': 0,
    'concurrency': 2
}

with DAG('resync_job_dag', default_args=DEFAULT_ARGS, schedule_interval=None) as dag:

    ATHENA_QUERY = PythonOperator(
        task_id='athena_query',
        python_callable=run_athena_query,
        provide_context=True,
        op_kwargs={
            'query': 'SELECT request_timestamp FROM "sampledb"."elb_logs" limit 10;', # query provided in the Athena tutorial
            'database':'sampledb',
            'output_location':'YOUR_BUCKET',
            'region_name':'YOUR_REGION'
        }
    )

    ATHENA_QUERY

When running the above code, I get the following error:

[2021-06-16 20:34:52,981] {taskinstance.py:1455} ERROR - An asyncio.Future, a coroutine or an awaitable is required
Traceback (most recent call last):
  File "/home/ubuntu/venv/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/home/ubuntu/venv/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/home/ubuntu/venv/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
    result = task_copy.execute(context=context)
  File "/home/ubuntu/venv/lib/python3.6/site-packages/airflow/operators/python.py", line 117, in execute
    return_value = self.execute_callable()
  File "/home/ubuntu/venv/lib/python3.6/site-packages/airflow/operators/python.py", line 128, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/home/ubuntu/iac-airflow/dags/helper/tasks.py", line 93, in run_athena_query
    query_execution_ids = loop.run_until_complete(get_ids(client_athena, query, database, output_location))
  File "/usr/lib/python3.6/asyncio/base_events.py", line 484, in run_until_complete
    return future.result()
  File "/home/ubuntu/iac-airflow/dags/helper/tasks.py", line 79, in get_ids
    res = await asyncio.gather(*query_responses, return_exceptions=True)
  File "/usr/lib/python3.6/asyncio/tasks.py", line 602, in gather
    fut = ensure_future(arg, loop=loop)
  File "/usr/lib/python3.6/asyncio/tasks.py", line 526, in ensure_future
    raise TypeError('An asyncio.Future, a coroutine or an awaitable is '
TypeError: An asyncio.Future, a coroutine or an awaitable is required

I am not able to figure out where I am going wrong. Would appreciate some hints on this problem.

Best Answer

I think what you are doing here is not really needed. Your problems are:

  1. Executing multiple queries in parallel.
  2. Being able to recover the queryExecutionId of each query.

Both problems can be easily solved by using the AWSAthenaOperator. The operator already handles everything you mentioned for you.
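
(As an aside, the TypeError in your traceback occurs because execute_query is a plain function that returns a QueryExecutionId string immediately, so query_responses ends up as a list of strings, and asyncio.gather only accepts awaitables. If you did want to keep the hand-rolled approach, the blocking boto3 call would have to be wrapped in an awaitable, e.g. via run_in_executor; a rough sketch, reusing execute_query from your code:)

import asyncio

# Rough sketch: wrap the blocking boto3 calls in futures so that
# asyncio.gather receives awaitables instead of plain strings.
async def get_ids(client_athena, query, database, output_location):
    loop = asyncio.get_event_loop()
    futures = [
        loop.run_in_executor(None, execute_query, client_athena,
                             query, database, output_location)
        for _ in range(5)
    ]
    return await asyncio.gather(*futures, return_exceptions=True)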

Example with the AWSAthenaOperator:

from airflow.models import DAG
from airflow.utils.dates import days_ago
from airflow.operators.dummy import DummyOperator
from airflow.providers.amazon.aws.operators.athena import AWSAthenaOperator


with DAG(
    dag_id="athena",
    schedule_interval='@daily',
    start_date=days_ago(1),
    catchup=False,
) as dag:

    start_op = DummyOperator(task_id="start_task")
    query_list = ["SELECT 1;", "SELECT 2;", "SELECT 3;"]

    for i, sql in enumerate(query_list):
        run_query = AWSAthenaOperator(
            task_id=f'run_query_{i}',
            query=sql,
            output_location='s3://my-bucket/my-path/',
            database='my_database'
        )
        start_op >> run_query

Simply add more queries to query_list and the Athena tasks will be created dynamically.


Note that the QueryExecutionId is pushed to XCom, so you can access it in a downstream task if needed.
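
For example, a minimal sketch of pulling it inside the same DAG (this assumes the operator's return value is pushed under the default return_value XCom key; the print_execution_id task is hypothetical):

from airflow.operators.python import PythonOperator

# Hypothetical downstream task that pulls the QueryExecutionId pushed
# by run_query_0.
def print_execution_id(ti, **context):
    query_execution_id = ti.xcom_pull(task_ids='run_query_0')
    print('Athena QueryExecutionId: {}'.format(query_execution_id))

print_id = PythonOperator(
    task_id='print_execution_id',
    python_callable=print_execution_id,
)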

Regarding python - Running multiple Athena queries in Airflow 2.0, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/68009760/
