:py:mod:`astronomer.providers.amazon.aws.triggers.batch`
========================================================

.. py:module:: astronomer.providers.amazon.aws.triggers.batch


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   astronomer.providers.amazon.aws.triggers.batch.BatchOperatorTrigger
   astronomer.providers.amazon.aws.triggers.batch.BatchSensorTrigger



.. py:class:: BatchOperatorTrigger(job_id, job_name, job_definition, job_queue, container_overrides, array_properties, parameters, waiters, tags, max_retries, status_retries, region_name, aws_conn_id = 'aws_default')

   Bases: :py:obj:`airflow.triggers.base.BaseTrigger`

   Checks the state of a previously submitted job on AWS Batch.

   BatchOperatorTrigger is fired as a deferred class with params to poll the job state in the Triggerer.

   :param job_id: the job ID, usually unknown (None) until the submit_job operation gets the jobId defined by AWS Batch
   :param job_name: the name for the job that will run on AWS Batch (templated)
   :param job_definition: the job definition name on AWS Batch
   :param job_queue: the queue name on AWS Batch
   :param container_overrides: the ``containerOverrides`` parameter for boto3 (templated)
   :param array_properties: the ``arrayProperties`` parameter for boto3
   :param parameters: the ``parameters`` for boto3 (templated)
   :param waiters: a :class:`.BatchWaiters` object (see note below); if None, polling is used with max_retries and status_retries
   :param tags: collection of tags to apply to the AWS Batch job submission; if None, no tags are submitted
   :param max_retries: exponential back-off retries, 4200 = 48 hours; polling is only used when waiters is None
   :param status_retries: number of HTTP retries to get job status, 10; polling is only used when waiters is None
   :param aws_conn_id: connection id of AWS credentials / region name. If None, the default boto3 credential strategy is used
   :param region_name: AWS region name to use. Overrides the region_name in the connection (if provided)

   .. py:method:: serialize()

      Serializes BatchOperatorTrigger arguments and classpath.


   .. py:method:: run()
      :async:

      Makes an async connection to AWS Batch using the aiobotocore library and periodically polls for the job status on the Triggerer.

      The statuses that indicate job completion are ``SUCCEEDED`` and ``FAILED``, so this polls for the transitions
      ``SUBMITTED`` > ``PENDING`` > ``RUNNABLE`` > ``STARTING`` > ``RUNNING`` > ``SUCCEEDED`` | ``FAILED``.



.. py:class:: BatchSensorTrigger(job_id, region_name, aws_conn_id = 'aws_default', poke_interval = 5)

   Bases: :py:obj:`airflow.triggers.base.BaseTrigger`

   Checks the status of a submitted job_id on AWS Batch until it reaches a failure or a success state.

   BatchSensorTrigger is fired as a deferred class with params to poll the job state in the Triggerer.

   :param job_id: the job ID to poll for completion
   :param aws_conn_id: connection id of AWS credentials / region name. If None, the default boto3 credential strategy is used
   :param region_name: AWS region name to use. Overrides the region_name in the connection (if provided)
   :param poke_interval: polling period in seconds to check for the status of the job

   .. py:method:: serialize()

      Serializes BatchSensorTrigger arguments and classpath.


   .. py:method:: run()
      :async:

      Makes an async connection to AWS Batch using the aiobotocore library and periodically polls for the Batch job status.

      The statuses that indicate job completion are ``SUCCEEDED`` and ``FAILED``.
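
Both triggers implement ``serialize()``, which returns the trigger's classpath together with its keyword arguments so the Triggerer process can reconstruct it. The following is a minimal standalone sketch of that contract; the class below mirrors BatchSensorTrigger's fields for illustration and is not the provider's actual implementation.

```python
# Sketch of the BaseTrigger.serialize() contract: a trigger serializes to a
# (classpath, kwargs) tuple so it can be rebuilt inside the Triggerer process.
# This is a hypothetical stand-in, not the real BatchSensorTrigger class.
class BatchSensorTriggerSketch:
    def __init__(self, job_id, region_name, aws_conn_id="aws_default", poke_interval=5):
        self.job_id = job_id
        self.region_name = region_name
        self.aws_conn_id = aws_conn_id
        self.poke_interval = poke_interval

    def serialize(self):
        # The classpath points at the real trigger class; the dict must match
        # the trigger's __init__ signature so it can be re-instantiated.
        return (
            "astronomer.providers.amazon.aws.triggers.batch.BatchSensorTrigger",
            {
                "job_id": self.job_id,
                "region_name": self.region_name,
                "aws_conn_id": self.aws_conn_id,
                "poke_interval": self.poke_interval,
            },
        )

trigger = BatchSensorTriggerSketch(job_id="job-123", region_name="us-east-1")
classpath, kwargs = trigger.serialize()
print(classpath)
print(kwargs["job_id"])  # job-123
```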
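
The polling behaviour both ``run()`` methods describe (query the job status asynchronously, sleep, repeat until a terminal state) can be sketched as follows. This is a self-contained illustration assuming a fake Batch client; the real triggers obtain the status from AWS Batch via an aiobotocore session, and ``FakeBatchClient`` and ``poll_job_status`` are hypothetical names.

```python
import asyncio

# Hypothetical stand-in for an aiobotocore Batch client; it replays a fixed
# sequence of job statuses instead of calling AWS.
class FakeBatchClient:
    def __init__(self, statuses):
        self._statuses = iter(statuses)

    async def describe_jobs(self, jobs):
        return {"jobs": [{"jobId": jobs[0], "status": next(self._statuses)}]}

async def poll_job_status(client, job_id, poke_interval=0.01):
    """Poll until the job reaches a terminal state (SUCCEEDED or FAILED)."""
    while True:
        response = await client.describe_jobs(jobs=[job_id])
        status = response["jobs"][0]["status"]
        if status in ("SUCCEEDED", "FAILED"):
            return status
        await asyncio.sleep(poke_interval)

# Walk through the documented transitions ending in a terminal state.
client = FakeBatchClient(["SUBMITTED", "PENDING", "RUNNABLE", "STARTING", "RUNNING", "SUCCEEDED"])
result = asyncio.run(poll_job_status(client, "job-123"))
print(result)  # SUCCEEDED
```

In the real triggers the loop also yields a ``TriggerEvent`` carrying the final status back to the worker rather than returning it directly.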