

Python PyTorch detect_anomaly Usage and Code Examples


This article briefly introduces the usage of torch.autograd.detect_anomaly in Python.

Usage:

class torch.autograd.detect_anomaly

Context manager that enables anomaly detection for the autograd engine.

This does two things:

  • Running the forward pass with detection enabled allows the backward pass to print the traceback of the forward operation that created the failing backward function.

  • Any backward computation that generates "nan" values will raise an error (a minimal sketch of this case follows the example below).

Warning

This mode should be enabled only for debugging, as the extra checks will slow down your program execution.
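Because of this overhead, one common pattern is to gate detection behind a debug switch instead of leaving it on. A minimal sketch using the function-style counterpart torch.autograd.set_detect_anomaly (the TORCH_DEBUG environment variable here is a hypothetical convention, not a PyTorch flag):

>>> import os
>>> import torch
>>> # Enable anomaly detection globally only when explicitly requested,
>>> # so normal runs do not pay the extra per-op bookkeeping cost.
>>> torch.autograd.set_detect_anomaly(os.environ.get("TORCH_DEBUG") == "1")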

Example

>>> import torch
>>> from torch import autograd
>>> class MyFunc(autograd.Function):
...     @staticmethod
...     def forward(ctx, inp):
...         return inp.clone()
...     @staticmethod
...     def backward(ctx, gO):
...         # Error during the backward pass
...         raise RuntimeError("Some error in backward")
...         return gO.clone()
>>> def run_fn(a):
...     out = MyFunc.apply(a)
...     return out.sum()
>>> inp = torch.rand(10, 10, requires_grad=True)
>>> out = run_fn(inp)
>>> out.backward()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/your/pytorch/install/torch/_tensor.py", line 93, in backward
        torch.autograd.backward(self, gradient, retain_graph, create_graph)
      File "/your/pytorch/install/torch/autograd/__init__.py", line 90, in backward
        allow_unreachable=True)  # allow_unreachable flag
      File "/your/pytorch/install/torch/autograd/function.py", line 76, in apply
        return self._forward_cls.backward(self, *args)
      File "<stdin>", line 8, in backward
    RuntimeError: Some error in backward
>>> with autograd.detect_anomaly():
...     inp = torch.rand(10, 10, requires_grad=True)
...     out = run_fn(inp)
...     out.backward()
    Traceback of forward call that caused the error:
      File "tmp.py", line 53, in <module>
        out = run_fn(inp)
      File "tmp.py", line 44, in run_fn
        out = MyFunc.apply(a)
    Traceback (most recent call last):
      File "<stdin>", line 4, in <module>
      File "/your/pytorch/install/torch/_tensor.py", line 93, in backward
        torch.autograd.backward(self, gradient, retain_graph, create_graph)
      File "/your/pytorch/install/torch/autograd/__init__.py", line 90, in backward
        allow_unreachable=True)  # allow_unreachable flag
      File "/your/pytorch/install/torch/autograd/function.py", line 76, in apply
        return self._forward_cls.backward(self, *args)
      File "<stdin>", line 8, in backward
    RuntimeError: Some error in backward
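The second behavior, nan detection, can be seen in a minimal sketch like the following (continuing the session above). The forward pass is finite, since 0 * sqrt(0) = 0, but the chain rule multiplies a zero gradient by the infinite derivative of sqrt at 0, so the backward pass computes 0 * inf = nan and detect_anomaly raises an error naming the backward node (the exact node name, e.g. SqrtBackward0, and the message format vary by PyTorch version):

>>> x = torch.zeros(1, requires_grad=True)
>>> with autograd.detect_anomaly():
...     # Forward is finite: 0 * sqrt(0) == 0
...     out = (x * torch.sqrt(x)).sum()
...     # Backward evaluates 0 * (0.5 / sqrt(0)) = 0 * inf = nan
...     out.backward()
    RuntimeError: Function 'SqrtBackward0' returned nan values in its 0th output.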

