google.cloud.triggers.bigquery

Module Contents

Classes

BigQueryInsertJobTrigger

Polls a submitted BigQuery job on the trigger worker until it finishes.

BigQueryCheckTrigger

Polls a BigQuery check job on the trigger worker and yields its result.

BigQueryGetDataTrigger

Polls a BigQuery job on the trigger worker and yields the retrieved data.

BigQueryIntervalCheckTrigger

Compares metrics from two BigQuery jobs against the given thresholds.

BigQueryValueCheckTrigger

Checks the result of a BigQuery query against a given pass_value.

BigQueryTableExistenceTrigger

Polls Google BigQuery until the given table exists.

class google.cloud.triggers.bigquery.BigQueryInsertJobTrigger(conn_id, job_id, project_id, dataset_id=None, table_id=None, poll_interval=4.0)

Bases: airflow.triggers.base.BaseTrigger

Polls a submitted BigQuery job on the trigger worker until it finishes.

Like every Airflow trigger, it can exist in two contexts:

  • Inside an Operator, when it’s passed to TaskDeferred

  • Actively running in a trigger worker

The same class is used in both situations, so every trigger must be able to return the (Airflow-JSON-encodable) arguments that let it be re-instantiated elsewhere.
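For illustration, a minimal sketch of the hand-off from a deferrable operator, assuming the usual execute()/execute_complete() convention; the method names and argument values are placeholders, not part of this module:

    from airflow.exceptions import TaskDeferred
    from google.cloud.triggers.bigquery import BigQueryInsertJobTrigger

    def execute(self, context):
        # Submit the BigQuery job first (not shown), then hand the trigger
        # to the triggerer and free the worker slot.
        raise TaskDeferred(
            trigger=BigQueryInsertJobTrigger(
                conn_id="google_cloud_default",
                job_id=self.job_id,
                project_id=self.project_id,
                poll_interval=4.0,
            ),
            method_name="execute_complete",
        )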

serialize(self)

Serializes BigQueryInsertJobTrigger arguments and classpath.
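The BaseTrigger contract is a (classpath, kwargs) tuple; a sketch of the likely shape here, with field names inferred from the constructor signature rather than quoted from the source:

    def serialize(self):
        # Classpath string plus JSON-encodable kwargs: everything a
        # triggerer process needs to re-instantiate this trigger.
        return (
            "google.cloud.triggers.bigquery.BigQueryInsertJobTrigger",
            {
                "conn_id": self.conn_id,
                "job_id": self.job_id,
                "project_id": self.project_id,
                "dataset_id": self.dataset_id,
                "table_id": self.table_id,
                "poll_interval": self.poll_interval,
            },
        )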

async run(self)

Gets the current job execution status and yields a TriggerEvent.
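A minimal sketch of the polling loop, assuming a hypothetical _get_job_status() coroutine in place of the provider's async BigQuery hook:

    import asyncio

    from airflow.triggers.base import TriggerEvent

    async def run(self):
        while True:
            # Hypothetical helper standing in for the async hook call.
            status = await self._get_job_status()
            if status in ("success", "error"):
                yield TriggerEvent({"status": status, "job_id": self.job_id})
                return
            await asyncio.sleep(self.poll_interval)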

class google.cloud.triggers.bigquery.BigQueryCheckTrigger(conn_id, job_id, project_id, dataset_id=None, table_id=None, poll_interval=4.0)

Bases: BigQueryInsertJobTrigger

Polls a BigQuery check job on the trigger worker and yields its result.

serialize(self)

Serializes BigQueryCheckTrigger arguments and classpath.

async run(self)

Gets the current job execution status and yields a TriggerEvent.

class google.cloud.triggers.bigquery.BigQueryGetDataTrigger(conn_id, job_id, project_id, dataset_id=None, table_id=None, poll_interval=4.0)

Bases: BigQueryInsertJobTrigger

Polls a BigQuery job on the trigger worker and yields the retrieved data.

serialize(self)

Serializes BigQueryGetDataTrigger arguments and classpath.

async run(self)

Gets the current job execution status and yields a TriggerEvent with the response data.
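On the operator side, the event payload can be consumed when the task resumes; a sketch assuming the usual execute_complete() convention, with illustrative payload keys ("status", "records") rather than ones quoted from the source:

    from airflow.exceptions import AirflowException

    def execute_complete(self, context, event):
        # Called by the scheduler with the TriggerEvent payload once the
        # trigger has fired.
        if event["status"] == "error":
            raise AirflowException(event.get("message", "BigQuery job failed"))
        return event.get("records")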

class google.cloud.triggers.bigquery.BigQueryIntervalCheckTrigger(conn_id, first_job_id, second_job_id, project_id, table, metrics_thresholds, date_filter_column='ds', days_back=-7, ratio_formula='max_over_min', ignore_zero=True, dataset_id=None, table_id=None, poll_interval=4.0)

Bases: BigQueryInsertJobTrigger

Compares metrics from two BigQuery jobs (first_job_id and second_job_id, days_back apart) against the given metrics_thresholds.

serialize(self)

Serializes BigQueryIntervalCheckTrigger arguments and classpath.

async run(self)

Gets the current job execution status and yields a TriggerEvent.
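For illustration, constructing the trigger with its check-specific arguments; the metrics_thresholds shape shown (a metric expression mapped to a maximum allowed ratio) mirrors Airflow's interval-check operators and should be treated as an assumption here:

    from google.cloud.triggers.bigquery import BigQueryIntervalCheckTrigger

    trigger = BigQueryIntervalCheckTrigger(
        conn_id="google_cloud_default",
        first_job_id="job-current",      # job computing today's metrics
        second_job_id="job-reference",   # job computing metrics days_back ago
        project_id="my-project",
        table="my_dataset.my_table",
        metrics_thresholds={"COUNT(*)": 1.5},  # assumed shape: metric -> max ratio
        days_back=-7,
        ratio_formula="max_over_min",
        ignore_zero=True,
    )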

class google.cloud.triggers.bigquery.BigQueryValueCheckTrigger(conn_id, sql, pass_value, job_id, project_id, tolerance=None, dataset_id=None, table_id=None, poll_interval=4.0)

Bases: BigQueryInsertJobTrigger

Checks the result of a BigQuery query against a given pass_value, within the optional tolerance.

serialize(self)

Serializes BigQueryValueCheckTrigger arguments and classpath.

async run(self)

Gets the current job execution status and yields a TriggerEvent.
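For illustration, constructing the trigger; the tolerance semantics noted in the comment (accepting results within pass_value multiplied by 1 +/- tolerance) follow Airflow's value-check operators and are an assumption here:

    from google.cloud.triggers.bigquery import BigQueryValueCheckTrigger

    trigger = BigQueryValueCheckTrigger(
        conn_id="google_cloud_default",
        sql="SELECT COUNT(*) FROM my_dataset.my_table",
        pass_value=1000,
        job_id="job-value-check",
        project_id="my-project",
        tolerance=0.1,  # assumed: accept results within pass_value * (1 +/- 0.1)
    )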

class google.cloud.triggers.bigquery.BigQueryTableExistenceTrigger(project_id, dataset_id, table_id, gcp_conn_id, hook_params, poll_interval=4.0)

Bases: airflow.triggers.base.BaseTrigger

Polls Google BigQuery until the given table exists.

serialize(self)

Serializes BigQueryTableExistenceTrigger arguments and classpath.

async run(self)

Runs until the table exists in Google BigQuery.
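A minimal sketch of that loop, assuming a hypothetical _table_exists() coroutine in place of the hook built from gcp_conn_id and hook_params:

    import asyncio

    from airflow.triggers.base import TriggerEvent

    async def run(self):
        while True:
            # Hypothetical helper standing in for the async existence check.
            if await self._table_exists(self.project_id, self.dataset_id, self.table_id):
                yield TriggerEvent({"status": "success"})
                return
            await asyncio.sleep(self.poll_interval)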