This article collects typical usage examples of the Python multiprocessing.Pool.count method. If you have been wondering what Pool.count does and how to use it, the hand-picked example below may help. You can also read more about its containing class, multiprocessing.Pool.
One code example of the Pool.count method is shown below.
Example 1: main
# Required module: from multiprocessing import Pool [as alias]
# Alternatively: from multiprocessing.Pool import count [as alias]
# Imports needed by this example (the original snippet omitted them):
import argparse
import glob
import itertools
from operator import itemgetter
from os import path
from multiprocessing import Pool, cpu_count

# Note: worker, sharelock and printlock are helpers defined elsewhere
# in unrpyc; they are not shown in this example.

def main():
    # python27 unrpyc.py [-c] [-d] [--python-screens|--ast-screens|--no-screens] file [file ...]
    parser = argparse.ArgumentParser(description="Decompile .rpyc files")
    parser.add_argument('-c', '--clobber', dest='clobber', action='store_true',
                        help="overwrites existing output files")
    parser.add_argument('-d', '--dump', dest='dump', action='store_true',
                        help="instead of decompiling, pretty print the ast to a file")
    parser.add_argument('-p', '--processes', dest='processes', action='store', default=cpu_count(),
                        help="use the specified number of processes to decompile")
    parser.add_argument('--sl1-as-python', dest='decompile_python', action='store_true',
                        help="Only for dumping and for decompiling screen language 1 screens. "
                             "Convert SL1 Python AST to Python code instead of dumping it or converting it to screenlang.")
    parser.add_argument('--comparable', dest='comparable', action='store_true',
                        help="Only for dumping, remove several false differences when comparing dumps. "
                             "This suppresses attributes that are different even when the code is identical, such as file modification times.")
    parser.add_argument('--no-pyexpr', dest='no_pyexpr', action='store_true',
                        help="Only for dumping, disable special handling of PyExpr objects, instead printing them as strings. "
                             "This is useful when comparing dumps from different versions of Ren'Py. "
                             "It should only be used if necessary, since it will cause loss of information such as line numbers.")
    parser.add_argument('file', type=str, nargs='+',
                        help="The filenames to decompile")
    args = parser.parse_args()

    # Expand wildcards
    files = map(glob.glob, args.file)
    # Concatenate lists
    files = list(itertools.chain(*files))

    # Check if we actually have files
    if len(files) == 0:
        parser.print_help()
        parser.error("No script files given.")

    files = map(lambda x: (args, x, path.getsize(x)), files)

    processes = int(args.processes)
    if processes > 1:
        # If a big file starts near the end, there could be a long time with
        # only one thread running, which is inefficient. Avoid this by starting
        # big files first.
        files = sorted(files, key=itemgetter(2), reverse=True)
        results = Pool(processes, sharelock, [printlock]).map(worker, files, 1)
    else:
        results = map(worker, files)

    # Check per file if everything went well and report back
    good = results.count(True)
    bad = results.count(False)
    if bad == 0:
        print "Decompilation of %d script file%s successful" % (good, 's' if good > 1 else '')
    elif good == 0:
        print "Decompilation of %d file%s failed" % (bad, 's' if bad > 1 else '')
    else:
        print "Decompilation of %d file%s successful, but decompilation of %d file%s failed" % \
              (good, 's' if good > 1 else '', bad, 's' if bad > 1 else '')
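The core pattern in the example — sharing a print lock with worker processes through the Pool initializer, then tallying True/False results with list.count — can be sketched in a minimal, self-contained form. This is a Python 3 sketch under assumed names: init_worker and is_even are hypothetical stand-ins, not part of unrpyc.

```python
from multiprocessing import Pool, Lock, cpu_count

lock = Lock()  # replaced inside each worker process by the initializer below

def init_worker(shared_lock):
    # Mirror the example's sharelock/printlock pattern: stash the lock
    # passed through the Pool initializer in a module-level global so
    # every task run in this process can use it.
    global lock
    lock = shared_lock

def is_even(n):
    # Hypothetical stand-in for the decompile worker: does some work,
    # prints under the shared lock, and reports success as True/False.
    with lock:
        print("checking", n)
    return n % 2 == 0

if __name__ == "__main__":
    with Pool(cpu_count(), init_worker, (lock,)) as pool:
        results = pool.map(is_even, range(8), 1)  # chunksize 1, as in the example
    # Tally outcomes with list.count, just like the good/bad tally above.
    print("%d even, %d odd" % (results.count(True), results.count(False)))
```

Note that `count` here is a method of the plain list returned by `Pool.map`, which is why the tally works the same way in the single-process `map` branch of the example.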