:py:mod:`astronomer.providers.databricks.triggers.databricks`
=============================================================

.. py:module:: astronomer.providers.databricks.triggers.databricks


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   astronomer.providers.databricks.triggers.databricks.DatabricksTrigger


.. py:class:: DatabricksTrigger(conn_id, task_id, run_id, retry_limit, retry_delay, polling_period_seconds, job_id = None, run_page_url = None)

   Bases: :py:obj:`airflow.triggers.base.BaseTrigger`

   Wait asynchronously for a Databricks job run to reach a terminal state.

   :param conn_id: The Databricks connection id. The default value is
       ``databricks_default``.
   :param task_id: The task id.
   :param run_id: The Databricks job run id.
   :param retry_limit: The number of times to retry if the Databricks backend is
       unreachable. Its value must be greater than or equal to 1.
   :param retry_delay: Number of seconds to wait between retries (it may be a
       floating-point number).
   :param polling_period_seconds: Controls the rate at which we poll for the result
       of this run. By default, the trigger polls every 30 seconds.
   :param job_id: The Databricks job id.
   :param run_page_url: The Databricks run page URL.

   .. py:method:: serialize()

      Serialize the DatabricksTrigger arguments and classpath.


   .. py:method:: run()
      :async:

      Make a series of asynchronous HTTP calls via a Databricks hook. Yield a
      ``TriggerEvent`` when the response is a 200 and the run state is successful;
      retry the call up to the retry limit if the error is retryable; otherwise
      raise an exception.
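
.. rubric:: Example

Airflow's ``BaseTrigger`` contract is that ``serialize()`` returns a
``(classpath, kwargs)`` tuple so the triggerer process can re-instantiate the
trigger, while ``run()`` polls asynchronously until a terminal state is seen.
The sketch below illustrates that contract with a standalone stand-in class; it
is *not* the real ``DatabricksTrigger`` (the class name, the injected
``get_run_state`` coroutine, and the state names are illustrative assumptions).

.. code-block:: python

   import asyncio


   class DatabricksTriggerSketch:
       """Illustrative stand-in for DatabricksTrigger (not the real class)."""

       def __init__(self, conn_id, task_id, run_id, retry_limit=3,
                    retry_delay=10.0, polling_period_seconds=30,
                    job_id=None, run_page_url=None):
           self.conn_id = conn_id
           self.task_id = task_id
           self.run_id = run_id
           self.retry_limit = retry_limit
           self.retry_delay = retry_delay
           self.polling_period_seconds = polling_period_seconds
           self.job_id = job_id
           self.run_page_url = run_page_url

       def serialize(self):
           # BaseTrigger contract: a (classpath, kwargs) pair persisted to the
           # metadata database so the triggerer can rebuild this object.
           return (
               "astronomer.providers.databricks.triggers.databricks.DatabricksTrigger",
               {
                   "conn_id": self.conn_id,
                   "task_id": self.task_id,
                   "run_id": self.run_id,
                   "retry_limit": self.retry_limit,
                   "retry_delay": self.retry_delay,
                   "polling_period_seconds": self.polling_period_seconds,
                   "job_id": self.job_id,
                   "run_page_url": self.run_page_url,
               },
           )

       async def run(self, get_run_state):
           # Poll the injected coroutine until a terminal state appears; the
           # real trigger would call a Databricks hook over HTTP instead.
           while True:
               state = await get_run_state(self.run_id)
               if state in ("SUCCESS", "FAILED"):
                   return state
               await asyncio.sleep(self.polling_period_seconds)

The round trip is: the deferrable operator defers with the trigger, the
triggerer rebuilds it from the serialized tuple, and ``run()`` polls until the
job run finishes.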