This article collects typical usage examples of the `airflow.operators.BaseOperator` class in Python: what `operators.BaseOperator` does, how to use it, and what real code that uses it looks like. The examples below may help; you can also explore the containing module, `airflow.operators`, for further context.
Two code examples of `operators.BaseOperator` are shown below, ordered by popularity.
Example 1: add
# Required import: from airflow import operators [as alias]
# Alternatively: from airflow.operators import BaseOperator [as alias]
def add(self, task, to=None):
    if not isinstance(task, BaseOperator):
        raise AirflowException(
            "Relationships can only be set between "
            "Operators; received {}".format(task.__class__.__name__))
    if to == 'top':
        self.top_task = self.top_task if self.top_task else task
        # Fan out: the top task runs before every task that has no upstream deps yet.
        task.set_downstream([t for t in self.tasks
                             if t.task_id != task.task_id and not t.upstream_list])
    elif to == 'bottom':
        self.bottom_task = self.bottom_task if self.bottom_task else task
        # Fan in: the bottom task runs after every task that has no downstream deps yet.
        task.set_upstream([t for t in self.tasks
                           if t.task_id != task.task_id and not t.downstream_list])
    # Once both ends exist, point every non-top task at the top task.
    if self.top_task and self.bottom_task:
        self.bottom_task.reader_task_id = self.top_task.task_id
        for t in self.tasks:
            if t.task_id != self.top_task.task_id:
                t.reader_task_id = self.top_task.task_id
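The fan-out/fan-in wiring above can be sketched without Airflow installed. In this sketch, `StubTask` is a hypothetical stand-in for `airflow.models.BaseOperator`; its `set_downstream`/`set_upstream` methods mimic the real API just enough to show which tasks get linked:

```python
# Minimal sketch of the top/bottom wiring logic, using stub tasks in place of
# real Airflow operators. StubTask is a hypothetical stand-in, not Airflow API.
class StubTask:
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream_list = []    # tasks this one depends on
        self.downstream_list = []  # tasks that depend on this one

    def set_downstream(self, tasks):
        for t in tasks:
            self.downstream_list.append(t)
            t.upstream_list.append(self)

    def set_upstream(self, tasks):
        for t in tasks:
            self.upstream_list.append(t)
            t.downstream_list.append(self)


# Three "middle" tasks with no dependencies yet, plus a top and a bottom task.
middle = [StubTask("a"), StubTask("b"), StubTask("c")]
top, bottom = StubTask("top"), StubTask("bottom")

# to='top': the top task fans out to every task without upstream dependencies.
top.set_downstream([t for t in middle if not t.upstream_list])
# to='bottom': the bottom task fans in from every task without downstream deps.
bottom.set_upstream([t for t in middle if not t.downstream_list])

print([t.task_id for t in top.downstream_list])   # ['a', 'b', 'c']
print([t.task_id for t in bottom.upstream_list])  # ['a', 'b', 'c']
```

The `not t.upstream_list` / `not t.downstream_list` filters are what make the method idempotent-ish: tasks that are already wired into the chain are left alone.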
Example 2: make_airflow_dag_for_operator
# Required import: from airflow import operators [as alias]
# Alternatively: from airflow.operators import BaseOperator [as alias]
def make_airflow_dag_for_operator(
    recon_repo,
    pipeline_name,
    operator,
    run_config=None,
    mode=None,
    dag_id=None,
    dag_description=None,
    dag_kwargs=None,
    op_kwargs=None,
    environment_dict=None,
):
    '''Construct an Airflow DAG corresponding to a given Dagster pipeline and custom operator.

    `Custom operator template <https://github.com/dagster-io/dagster/blob/master/examples/legacy_examples/dagster_examples/dagster_airflow/custom_operator.py>`_

    Tasks in the resulting DAG will execute the Dagster logic they encapsulate, run by the given
    Operator :py:class:`BaseOperator <airflow.models.BaseOperator>`. If you
    are looking for a containerized solution that provides better isolation, see
    :py:func:`make_airflow_dag_containerized` instead.

    This function should be invoked in an Airflow DAG definition file, such as one created by an
    invocation of the dagster-airflow scaffold CLI tool.

    Args:
        recon_repo (:class:`dagster.ReconstructableRepository`): Reference to a Dagster
            RepositoryDefinition that can be reconstructed in another process.
        pipeline_name (str): The name of the pipeline definition.
        operator (type): The operator to use. Must be a class that inherits from
            :py:class:`BaseOperator <airflow.models.BaseOperator>`.
        run_config (Optional[dict]): The environment config, if any, with which to compile
            the pipeline to an execution plan, as a Python dict.
        mode (Optional[str]): The mode in which to execute the pipeline.
        dag_id (Optional[str]): The id to use for the compiled Airflow DAG (passed through to
            :py:class:`DAG <airflow:airflow.models.DAG>`).
        dag_description (Optional[str]): The description to use for the compiled Airflow DAG
            (passed through to :py:class:`DAG <airflow:airflow.models.DAG>`).
        dag_kwargs (Optional[dict]): Any additional kwargs to pass to the Airflow
            :py:class:`DAG <airflow:airflow.models.DAG>` constructor, including ``default_args``.
        op_kwargs (Optional[dict]): Any additional kwargs to pass to the underlying Airflow
            operator.

    Returns:
        (airflow.models.DAG, List[airflow.models.BaseOperator]): The generated Airflow DAG, and a
        list of its constituent tasks.
    '''
    check.subclass_param(operator, 'operator', BaseOperator)
    run_config = canonicalize_run_config(run_config, environment_dict)  # backcompat
    return _make_airflow_dag(
        recon_repo=recon_repo,
        pipeline_name=pipeline_name,
        run_config=run_config,
        mode=mode,
        dag_id=dag_id,
        dag_description=dag_description,
        dag_kwargs=dag_kwargs,
        op_kwargs=op_kwargs,
        operator=operator,
    )
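The first line of the body, `check.subclass_param(...)`, rejects any `operator` argument that is not a class inheriting from `BaseOperator`. A minimal sketch of that kind of guard, with hypothetical stand-ins for both dagster's check helper and Airflow's base class (the real `dagster.check.subclass_param` raises dagster's own `CheckError`, not `TypeError`):

```python
# Sketch of a subclass guard in the style of dagster's check.subclass_param.
# BaseOperator and subclass_param here are hypothetical stand-ins, not the
# real airflow/dagster objects.
class BaseOperator:
    """Stand-in for airflow.models.BaseOperator."""


def subclass_param(obj, param_name, superclass):
    """Return obj if it is a class inheriting from superclass; raise otherwise."""
    if not (isinstance(obj, type) and issubclass(obj, superclass)):
        raise TypeError(
            "Param {!r} must be a subclass of {}; got {!r}".format(
                param_name, superclass.__name__, obj))
    return obj


class MyCustomOperator(BaseOperator):
    """A toy custom operator for the demo."""


subclass_param(MyCustomOperator, "operator", BaseOperator)  # passes silently
try:
    subclass_param(dict, "operator", BaseOperator)          # wrong base class
except TypeError as exc:
    print("rejected:", exc)
```

Failing fast on a bad `operator` class here is preferable to a confusing error later, when Airflow tries to instantiate the tasks inside `_make_airflow_dag`.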