

Python JobQueue.append Method Code Examples

This article collects typical usage examples of the Python method fabric.job_queue.JobQueue.append. If you have been wondering exactly what JobQueue.append does, how to call it, or what working code that uses it looks like, the example selected below may help. You can also explore other usage examples of the fabric.job_queue.JobQueue class that the method belongs to.


The following presents the one JobQueue.append code example collected for this article, drawn from an open-source project.
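
Before the full example, here is a minimal sketch of the pattern JobQueue.append supports: build a queue with a pool size, append multiprocessing.Process objects, then close and run the queue. This assumes the Fabric 1.x JobQueue whose constructor takes only a pool size, as used in Example 1 below; the worker function and host names are hypothetical.

# Minimal JobQueue.append sketch (hypothetical worker and host names;
# Fabric 1.x JobQueue API as used in Example 1 below).
import multiprocessing

from fabric.job_queue import JobQueue

def greet(host):
    print("hello from %s" % host)

queue = JobQueue(2)  # run at most two jobs concurrently
for host in ("web1", "web2", "web3"):  # hypothetical host names
    p = multiprocessing.Process(target=greet, args=(host,))
    p.name = host    # JobQueue uses the process name as the job's identifier
    queue.append(p)  # enqueue the job; nothing starts until run()
queue.close()        # mark the queue complete: no more jobs will be appended
queue.run()          # start the jobs, honoring the pool size, and block until done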

Example 1: execute

# Required import: from fabric.job_queue import JobQueue [as alias]
# Or: from fabric.job_queue.JobQueue import append [as alias]
def execute(task, *args, **kwargs):
    """
    Execute ``task`` (callable or name), honoring host/role decorators, etc.

    ``task`` may be an actual callable object, or it may be a registered task
    name, which is used to look up a callable just as if the name had been
    given on the command line (including :ref:`namespaced tasks <namespaces>`,
    e.g. ``"deploy.migrate"``.

    The task will then be executed once per host in its host list, which is
    (again) assembled in the same manner as CLI-specified tasks: drawing from
    :option:`-H`, :ref:`env.hosts <hosts>`, the `~fabric.decorators.hosts` or
    `~fabric.decorators.roles` decorators, and so forth.

    ``host``, ``hosts``, ``role``, ``roles`` and ``exclude_hosts`` kwargs will
    be stripped out of the final call, and used to set the task's host list, as
    if they had been specified on the command line like e.g. ``fab
    taskname:host=hostname``.

    Any other arguments or keyword arguments will be passed verbatim into
    ``task`` when it is called, so ``execute(mytask, 'arg1', kwarg1='value')``
    will (once per host) invoke ``mytask('arg1', kwarg1='value')``.

    .. seealso::
        :ref:`The execute usage docs <execute>`, for an expanded explanation
        and some examples.

    .. versionadded:: 1.3
    """
    my_env = {}
    # Obtain task
    if not callable(task):
        # Assume string, set env.command to it
        my_env["command"] = task
        task = crawl(task, state.commands)
        if task is None:
            abort("%r is not callable or a valid task name" % (my_env["command"],))
    # Set env.command if we were given a real function or callable task obj
    else:
        dunder_name = getattr(task, "__name__", None)
        my_env["command"] = getattr(task, "name", dunder_name)
    # Normalize to Task instance
    if not hasattr(task, "run"):
        task = WrappedCallableTask(task)
    # Filter out hosts/roles kwargs
    new_kwargs, hosts, roles, exclude_hosts = parse_kwargs(kwargs)
    # Set up host list
    my_env["all_hosts"] = task.get_hosts(hosts, roles, exclude_hosts, state.env)

    # Get pool size for this task
    pool_size = task.get_pool_size(my_env["all_hosts"], state.env.pool_size)
    # Set up job queue in case parallel is needed
    jobs = JobQueue(pool_size)
    if state.output.debug:
        jobs._debug = True

    # Call on host list
    if my_env["all_hosts"]:
        for host in my_env["all_hosts"]:
            # Log to stdout
            if state.output.running and not hasattr(task, "return_value"):
                print("[%s] Executing task '%s'" % (host, my_env["command"]))
            # Create per-run env with connection settings
            local_env = to_dict(host)
            local_env.update(my_env)
            state.env.update(local_env)
            # Handle parallel execution
            if requires_parallel(task):
                # Import multiprocessing if needed, erroring out usefully
                # if it can't.
                try:
                    import multiprocessing
                except ImportError as e:
                    msg = "At least one task needs to be run in parallel, but the\nmultiprocessing module cannot be imported:"
                    msg += "\n\n\t%s\n\n" % e
                    msg += "Please make sure the module is installed or that the above ImportError is\nfixed."
                    abort(msg)

                # Wrap in another callable that nukes the child's cached
                # connection object, if needed, to prevent shared-socket
                # problems.
                def inner(*args, **kwargs):
                    key = normalize_to_string(state.env.host_string)
                    state.connections.pop(key, "")
                    task.run(*args, **kwargs)

                # Stuff into Process wrapper
                p = multiprocessing.Process(target=inner, args=args, kwargs=new_kwargs)
                # Name/id is host string
                p.name = local_env["host_string"]
                # Add to queue
                jobs.append(p)
            # Handle serial execution
            else:
                task.run(*args, **new_kwargs)

        # If running in parallel, block until job queue is emptied
        if jobs:
            jobs.close()
            exitcodes = jobs.run()
#......... remainder of this example omitted .........
Author: sigman78 | Project: fabric | Lines of code: 103 | Source file: tasks.py
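
For context, here is a hypothetical fabfile sketch showing how the execute() function above is typically invoked. The task bodies and host names are made up for illustration; per the docstring, execute() strips the hosts kwarg from the call and uses it to build the task's host list.

# Hypothetical fabfile.py sketch (task and host names are placeholders).
from fabric.api import execute, run, task

@task
def uptime():
    run("uptime")

@task
def deploy():
    # Runs uptime once per host; execute() strips the hosts kwarg, builds
    # the host list from it, then dispatches each host serially or in
    # parallel via JobQueue.append as shown in Example 1.
    execute(uptime, hosts=["web1.example.com", "web2.example.com"])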


Note: the fabric.job_queue.JobQueue.append example in this article was compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are selected from open-source projects and remain the copyright of their original authors; consult the corresponding project's license before using or redistributing the code, and do not repost without permission.