astronomer.providers.amazon.aws.hooks.batch_client

Module Contents

Classes

BatchClientHookAsync

Async client for AWS Batch services.

class astronomer.providers.amazon.aws.hooks.batch_client.BatchClientHookAsync(job_id, waiters=None, *args, **kwargs)[source]

Bases: airflow.providers.amazon.aws.hooks.batch_client.BatchClientHook, astronomer.providers.amazon.aws.hooks.base_aws.AwsBaseHookAsync

Async client for AWS Batch services.

Parameters:
  • max_retries – exponential back-off retries, 4200 = 48 hours; polling is only used when waiters is None

  • status_retries – number of HTTP retries to get job status, 10; polling is only used when waiters is None

Note

Several methods use a default random delay to check or poll for job status, i.e. random.sample(). Using a random interval helps to avoid AWS API throttle limits when many concurrent tasks request job descriptions.

To modify the global defaults for the range of jitter allowed when a random delay is used to check Batch job status, modify these defaults, e.g.:

BatchClient.DEFAULT_DELAY_MIN = 0
BatchClient.DEFAULT_DELAY_MAX = 5

When explicit delay values are used, a 1 second random jitter is applied to the delay. It is generally recommended that random jitter is added to API requests. A convenience method is provided for this, e.g. to get a random delay of 10 sec +/- 5 sec:

delay = BatchClient.add_jitter(10, width=5, minima=0)
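For example, a minimal sketch of constructing the async hook; the job ID is a placeholder, and aws_conn_id/region_name are illustrative keyword arguments forwarded through **kwargs to the underlying base hook:

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

# The job ID below is a placeholder; extra keyword arguments are passed
# through to BatchClientHook / AwsBaseHookAsync.
hook = BatchClientHookAsync(
    job_id="00000000-0000-0000-0000-000000000000",
    aws_conn_id="aws_default",
    region_name="us-east-1",
)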

async monitor_job()[source]

Monitor an AWS Batch job. monitor_job can raise an exception, or an AirflowTaskTimeout can be raised if execution_timeout is given while creating the task. These exceptions should be handled in taskinstance.py instead of here, as was done previously.

Raises:

AirflowException
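A minimal sketch of awaiting monitor_job from a standalone script rather than from a deferrable operator's trigger; the helper name and job ID are illustrative, and the return value is not used here:

import asyncio

from airflow.exceptions import AirflowException

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def track_batch_job() -> None:
    hook = BatchClientHookAsync(job_id="00000000-0000-0000-0000-000000000000")  # placeholder
    try:
        # Waits for the job to finish; raises AirflowException if the job fails.
        await hook.monitor_job()
    except AirflowException as err:
        print(f"Batch job did not succeed: {err}")

asyncio.run(track_batch_job())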

async check_job_success(job_id)[source]

Check the final status of the Batch job; return True if the job 'SUCCEEDED', else raise an AirflowException.

Parameters:

job_id (str) – a Batch job ID

Raises:

AirflowException
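A short usage sketch: check_job_success either returns True or raises, so callers typically wrap it in a try/except. The helper name is illustrative and an already-constructed hook is assumed:

from airflow.exceptions import AirflowException

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def verify_job(hook: BatchClientHookAsync, job_id: str) -> bool:
    try:
        return await hook.check_job_success(job_id)  # True when the job SUCCEEDED
    except AirflowException as err:
        print(f"Job {job_id} did not succeed: {err}")
        return False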

async static delay(delay=None)[source]

Pause execution for delay seconds.

Parameters:

delay (Union[int, float, None]) – a delay (in seconds) to pause execution via asyncio.sleep(delay); a small 1 second jitter is applied to the delay.

Note

This method uses a default random delay, i.e. random.sample(); using a random interval helps to avoid AWS API throttle limits when many concurrent tasks request job descriptions.
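For illustration, a small sketch of both forms of the call: with no argument a random delay from the default window is used, while an explicit value gets a small jitter added. The helper name and the 10 second value are illustrative:

import asyncio

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def pause_examples() -> None:
    # No argument: random delay between DEFAULT_DELAY_MIN and DEFAULT_DELAY_MAX seconds.
    await BatchClientHookAsync.delay()
    # Explicit argument: roughly 10 seconds, with about 1 second of random jitter applied.
    await BatchClientHookAsync.delay(10)

asyncio.run(pause_examples())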

async wait_for_job(job_id, delay=None)[source]

Wait for a Batch job to complete.

Parameters:
  • job_id (str) – a Batch job ID

  • delay (Union[int, float, None]) – a delay before polling for job status

Raises:

AirflowException
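A minimal sketch, assuming an already-constructed hook; wait_for_job waits (asynchronously) until the job reaches a terminal state and raises AirflowException if polling fails. The helper name and the 5 second delay are illustrative:

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def wait_for_completion(hook: BatchClientHookAsync, job_id: str) -> None:
    # Start polling after an initial 5 second delay (illustrative value).
    await hook.wait_for_job(job_id, delay=5)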

async poll_for_job_complete(job_id, delay=None)[source]

Poll for job completion. The statuses that indicate job completion are: 'SUCCEEDED' | 'FAILED'.

So the status options that this will wait for are the transitions from: 'SUBMITTED' > 'PENDING' > 'RUNNABLE' > 'STARTING' > 'RUNNING' > 'SUCCEEDED' | 'FAILED'.

Parameters:
  • job_id (str) – a Batch job ID

  • delay (Union[int, float, None]) – a delay before polling for job status

Raises:

AirflowException
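A brief sketch of polling directly for a terminal state; the helper name and the 10 second delay are illustrative:

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def wait_for_terminal_state(hook: BatchClientHookAsync, job_id: str) -> None:
    # Returns once the job status is SUCCEEDED or FAILED;
    # raises AirflowException if polling gives up first.
    await hook.poll_for_job_complete(job_id, delay=10)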

async poll_for_job_running(job_id, delay=None)[source]

Poll for job running. The statuses that indicate a job is running or already complete are: 'RUNNING' | 'SUCCEEDED' | 'FAILED'.

So the status options that this will wait for are the transitions from: 'SUBMITTED' > 'PENDING' > 'RUNNABLE' > 'STARTING' > 'RUNNING' | 'SUCCEEDED' | 'FAILED'.

The completed statuses are included for cases where the job status changes so quickly that polling never observes RUNNING, e.g. when a job moves from STARTING through RUNNING to a completed state (often a failure) between polls.

Parameters:
  • job_id (str) – a Batch job ID

  • delay (Union[int, float, None]) – a delay before polling for job status

Raises:

AirflowException
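For example, a caller might wait for the job to start before attaching to its logs; a minimal sketch with an illustrative helper name and delay value:

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def wait_until_started(hook: BatchClientHookAsync, job_id: str) -> None:
    # Returns once the job is RUNNING, or already SUCCEEDED/FAILED if it finished quickly.
    await hook.poll_for_job_running(job_id, delay=5)
    print(f"Job {job_id} has started (or already finished)")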

async get_job_description(job_id)[source]

Get job description (using status_retries).

Parameters:

job_id (str) – a Batch job ID

Raises:

AirflowException
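A hedged sketch, assuming the method returns the describe_jobs entry for the job (a dict with keys such as status and statusReason), as in the synchronous BatchClientHook; the helper name is illustrative:

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def show_job_status(hook: BatchClientHookAsync, job_id: str) -> None:
    job = await hook.get_job_description(job_id)
    # "status" and "statusReason" are standard AWS Batch describe_jobs fields.
    print(job.get("status"), job.get("statusReason"))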

async poll_job_status(job_id, match_status)[source]

Poll for job status using an exponential back-off strategy (with max_retries). The Batch job statuses polled are: 'SUBMITTED' | 'PENDING' | 'RUNNABLE' | 'STARTING' | 'RUNNING' | 'SUCCEEDED' | 'FAILED'.

Parameters:
  • job_id (str) – a Batch job ID

  • match_status (List[str]) – a list of job statuses to match

Raises:

AirflowException
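Finally, a minimal sketch of polling for an explicit set of statuses; the helper name and the status list are illustrative:

from astronomer.providers.amazon.aws.hooks.batch_client import BatchClientHookAsync

async def wait_for_statuses(hook: BatchClientHookAsync, job_id: str) -> None:
    # Waits until the job reaches one of the listed statuses;
    # raises AirflowException if max_retries is exhausted first.
    await hook.poll_job_status(job_id, match_status=["RUNNING", "SUCCEEDED", "FAILED"])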