

Python SSHHook.no_host_key_check Method Code Examples

This article collects typical usage examples of Python's airflow.contrib.hooks.ssh_hook.SSHHook.no_host_key_check. If you are unsure what SSHHook.no_host_key_check does or how to use it, the curated examples below should help. You can also explore further usage examples of airflow.contrib.hooks.ssh_hook.SSHHook.


Three code examples of the SSHHook.no_host_key_check method are shown below, ordered by popularity by default. In each one, no_host_key_check is set to True on the hook instance to disable verification of the remote host's SSH key.
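All three fixtures follow the same pattern, shown here as a minimal sketch: build an SSHHook, set no_host_key_check = True, then open SSH/SFTP connections from it. The connection id 'ssh_default' comes from the examples below; everything else is illustrative only.

from airflow.contrib.hooks.ssh_hook import SSHHook

# Minimal sketch, assuming an 'ssh_default' connection is configured in Airflow.
hook = SSHHook(ssh_conn_id='ssh_default')
hook.no_host_key_check = True          # do not verify the remote host key

ssh_client = hook.get_conn()           # paramiko SSHClient
sftp_client = ssh_client.open_sftp()   # SFTP session on the same connection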

Example 1: setUp

# Required import: from airflow.contrib.hooks.ssh_hook import SSHHook [as alias]
# Or: from airflow.contrib.hooks.ssh_hook.SSHHook import no_host_key_check [as alias]
# configuration, DAG, DEFAULT_DATE, TEST_DAG_ID, BUCKET, SFTP_PATH and S3_KEY are
# imported or defined at module level in the original test file.
    def setUp(self):
        configuration.load_test_config()
        from airflow.contrib.hooks.ssh_hook import SSHHook
        from airflow.hooks.S3_hook import S3Hook

        hook = SSHHook(ssh_conn_id='ssh_default')
        s3_hook = S3Hook('aws_default')
        hook.no_host_key_check = True  # do not verify the remote host's SSH key
        args = {
            'owner': 'airflow',
            'start_date': DEFAULT_DATE,
            'provide_context': True
        }
        dag = DAG(TEST_DAG_ID + 'test_schedule_dag_once', default_args=args)
        dag.schedule_interval = '@once'

        self.hook = hook
        self.s3_hook = s3_hook

        self.ssh_client = self.hook.get_conn()
        self.sftp_client = self.ssh_client.open_sftp()

        self.dag = dag
        self.s3_bucket = BUCKET
        self.sftp_path = SFTP_PATH
        self.s3_key = S3_KEY
Developer: Fokko, Project: incubator-airflow, Lines: 28, Source: test_s3_to_sftp_operator.py

Example 2: setUp

# Required import: from airflow.contrib.hooks.ssh_hook import SSHHook [as alias]
# Or: from airflow.contrib.hooks.ssh_hook.SSHHook import no_host_key_check [as alias]
# configuration, DAG, DEFAULT_DATE and TEST_DAG_ID are imported or defined at module
# level in the original test file.
    def setUp(self):
        configuration.load_test_config()
        from airflow.contrib.hooks.ssh_hook import SSHHook
        hook = SSHHook(ssh_conn_id='ssh_default')
        hook.no_host_key_check = True
        args = {
            'owner': 'airflow',
            'start_date': DEFAULT_DATE,
            'provide_context': True
        }
        dag = DAG(TEST_DAG_ID + 'test_schedule_dag_once', default_args=args)
        dag.schedule_interval = '@once'
        self.hook = hook
        self.dag = dag
        self.test_dir = "/tmp"
        self.test_local_dir = "/tmp/tmp2"
        self.test_remote_dir = "/tmp/tmp1"
        self.test_local_filename = 'test_local_file'
        self.test_remote_filename = 'test_remote_file'
        self.test_local_filepath = '{0}/{1}'.format(
            self.test_dir, self.test_local_filename)
        # Local filepath with intermediate directory
        self.test_local_filepath_int_dir = '{0}/{1}'.format(
            self.test_local_dir, self.test_local_filename)
        self.test_remote_filepath = '{0}/{1}'.format(
            self.test_dir, self.test_remote_filename)
        # Remote filepath with intermediate directory
        self.test_remote_filepath_int_dir = '{0}/{1}'.format(
            self.test_remote_dir, self.test_remote_filename)
Developer: apache, Project: incubator-airflow, Lines: 31, Source: test_sftp_operator.py
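
In the full test_sftp_operator.py, a hook configured like this is then handed to an SFTPOperator in the test methods. A rough sketch of that follow-up step, assuming the airflow.contrib.operators.sftp_operator API and an illustrative task id (not part of the snippet above):

    def test_sftp_put_sketch(self):
        # Sketch only: the task id and the PUT direction are illustrative.
        from airflow.contrib.operators.sftp_operator import SFTPOperator, SFTPOperation
        put_task = SFTPOperator(
            task_id='put_test_file',
            ssh_hook=self.hook,          # no_host_key_check is already True
            local_filepath=self.test_local_filepath,
            remote_filepath=self.test_remote_filepath,
            operation=SFTPOperation.PUT,
            dag=self.dag,
        )
        # The real tests run the task (e.g. via a TaskInstance) and then check
        # that the file arrived on the remote side.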

Example 3: setUp

# Required import: from airflow.contrib.hooks.ssh_hook import SSHHook [as alias]
# Or: from airflow.contrib.hooks.ssh_hook.SSHHook import no_host_key_check [as alias]
# configuration, DAG, DEFAULT_DATE and TEST_DAG_ID are imported or defined at module
# level in the original test file.
    def setUp(self):
        configuration.load_test_config()
        from airflow.contrib.hooks.ssh_hook import SSHHook
        hook = SSHHook()
        hook.no_host_key_check = True
        args = {
            'owner': 'airflow',
            'start_date': DEFAULT_DATE,
            'provide_context': True
        }
        dag = DAG(TEST_DAG_ID + 'test_schedule_dag_once', default_args=args)
        dag.schedule_interval = '@once'
        self.hook = hook
        self.dag = dag
Developer: bolkedebruin, Project: airflow, Lines: 16, Source: ssh_execute_operator.py
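
For context, a hook prepared this way was typically passed to the old airflow.contrib.operators.ssh_execute_operator.SSHExecuteOperator, which this test file exercises. A rough sketch under that assumption, with an illustrative task id and command:

    def test_ssh_execute_sketch(self):
        # Sketch only: the task id and bash_command are illustrative.
        from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator
        task = SSHExecuteOperator(
            task_id='test_ssh_execute',
            bash_command="echo 'hello from the remote host'",
            ssh_hook=self.hook,   # no_host_key_check is already True
            dag=self.dag,
        )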

