This article collects typical usage examples of the Python method fairseq.tasks.FairseqTask. If you have been wondering how tasks.FairseqTask is used in practice, the curated code example below may help. You can also explore other uses of the containing module, fairseq.tasks.
One code example of tasks.FairseqTask is shown below.
Example 1: load_diverse_ensemble_for_inference
# Required import: from fairseq import tasks [as alias]
# Or: from fairseq.tasks import FairseqTask [as alias]
from typing import List, Optional

import torch
from fairseq import tasks
from fairseq.file_io import PathManager  # PathManager's location may vary across fairseq versions


def load_diverse_ensemble_for_inference(
    filenames: List[str], task: Optional[tasks.FairseqTask] = None
):
    """Load an ensemble of diverse models for inference.

    This method is similar to fairseq.utils.load_ensemble_for_inference
    but allows loading diverse models with non-uniform args.

    Args:
        filenames: List of file names of checkpoints.
        task: Optional[FairseqTask]. If this isn't provided, we set up the task
            using the first checkpoint's model args loaded from the saved state.

    Returns:
        models, args: Tuple of lists. models contains the loaded models, args
            the corresponding configurations.
        task: Either the input task or the task created within this function
            using args.
    """
    # Load model architectures and weights from each checkpoint file,
    # remapping all tensors to CPU.
    checkpoints_data = []
    for filename in filenames:
        if not PathManager.exists(filename):
            raise IOError("Model file not found: {}".format(filename))
        with PathManager.open(filename, "rb") as f:
            checkpoints_data.append(
                torch.load(
                    f,
                    map_location=lambda s, l: torch.serialization.default_restore_location(
                        s, "cpu"
                    ),
                )
            )
    # Build the ensemble: one model per checkpoint, all sharing one task.
    ensemble = []
    if task is None:
        if hasattr(checkpoints_data[0]["args"], "mode"):
            checkpoints_data[0]["args"].mode = "eval"
        task = tasks.setup_task(checkpoints_data[0]["args"])
    for checkpoint_data in checkpoints_data:
        model = task.build_model(checkpoint_data["args"])
        model.load_state_dict(checkpoint_data["model"])
        ensemble.append(model)
    args_list = [s["args"] for s in checkpoints_data]
    return ensemble, args_list, task
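The control flow above (load every checkpoint, lazily create the task from the first checkpoint's args, then build one model per checkpoint) can be sketched without fairseq or torch installed. The `DummyTask` class and `load_ensemble` function below are hypothetical stand-ins for illustration only, not fairseq APIs:

```python
from typing import List, Optional


class DummyTask:
    """Hypothetical stand-in for a FairseqTask: builds a 'model' from args."""

    def build_model(self, args):
        # A real task would construct an nn.Module; a dict suffices here.
        return {"args": args, "weights": None}


def load_ensemble(checkpoints_data: List[dict], task: Optional[DummyTask] = None):
    # If no task is given, set one up from the first checkpoint
    # (mirrors tasks.setup_task(checkpoints_data[0]["args"]) above).
    if task is None:
        task = DummyTask()
    ensemble = []
    for data in checkpoints_data:
        model = task.build_model(data["args"])
        model["weights"] = data["model"]  # stands in for load_state_dict
        ensemble.append(model)
    # Each checkpoint keeps its own (possibly non-uniform) args.
    args_list = [d["args"] for d in checkpoints_data]
    return ensemble, args_list, task


ensemble, args_list, task = load_ensemble(
    [{"args": "cfg_a", "model": "w_a"}, {"args": "cfg_b", "model": "w_b"}]
)
print(len(ensemble), args_list)  # 2 ['cfg_a', 'cfg_b']
```

Note that all models are built by the same task instance, while each model receives its own checkpoint's args, which is what lets the ensemble mix architectures with non-uniform configurations.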