This article briefly introduces the usage of torch.autograd.detect_anomaly in Python.
Usage:
class torch.autograd.detect_anomaly
Context manager that enables anomaly detection for the autograd engine.
This does two things:
- Running the forward pass with detection enabled allows the backward pass to print the traceback of the forward operation that created the failing backward function.
- Any backward computation that generates a "nan" value will raise an error.
Warning
This mode should be enabled only for debugging, as the various checks it performs will slow down your program's execution.
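Because of this overhead, a common pattern is to gate the checks behind a debug flag rather than leaving them enabled unconditionally. Below is a minimal sketch using the companion global switch torch.autograd.set_detect_anomaly, which is documented alongside detect_anomaly; the DEBUG flag name here is illustrative:

>>> import torch
>>> DEBUG = True  # illustrative flag; set to False for production runs
>>> # Enables or disables the anomaly checks globally instead of per block
>>> torch.autograd.set_detect_anomaly(DEBUG)
>>> # ... run the training loop here ...
>>> torch.autograd.set_detect_anomaly(False)  # turn the checks back off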
Example
>>> import torch
>>> from torch import autograd
>>> class MyFunc(autograd.Function):
...     @staticmethod
...     def forward(ctx, inp):
...         return inp.clone()
...     @staticmethod
...     def backward(ctx, gO):
...         # Error during the backward pass
...         raise RuntimeError("Some error in backward")
...         return gO.clone()
>>> def run_fn(a):
...     out = MyFunc.apply(a)
...     return out.sum()
>>> inp = torch.rand(10, 10, requires_grad=True)
>>> out = run_fn(inp)
>>> out.backward()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/your/pytorch/install/torch/_tensor.py", line 93, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/your/pytorch/install/torch/autograd/__init__.py", line 90, in backward
    allow_unreachable=True)  # allow_unreachable flag
  File "/your/pytorch/install/torch/autograd/function.py", line 76, in apply
    return self._forward_cls.backward(self, *args)
  File "<stdin>", line 8, in backward
RuntimeError: Some error in backward
>>> with autograd.detect_anomaly():
...     inp = torch.rand(10, 10, requires_grad=True)
...     out = run_fn(inp)
...     out.backward()
Traceback of forward call that caused the error:
  File "tmp.py", line 53, in <module>
    out = run_fn(inp)
  File "tmp.py", line 44, in run_fn
    out = MyFunc.apply(a)
Traceback (most recent call last):
  File "<stdin>", line 4, in <module>
  File "/your/pytorch/install/torch/_tensor.py", line 93, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/your/pytorch/install/torch/autograd/__init__.py", line 90, in backward
    allow_unreachable=True)  # allow_unreachable flag
  File "/your/pytorch/install/torch/autograd/function.py", line 76, in apply
    return self._forward_cls.backward(self, *args)
  File "<stdin>", line 8, in backward
RuntimeError: Some error in backward
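The example above exercises the first behavior: the backward error is annotated with the traceback of the forward call that created it. The second behavior, raising when a backward computation produces "nan", can be sketched with the well-known masked-logarithm pitfall below. This snippet is not from the original documentation, and the exact error message may differ across PyTorch versions:

>>> import torch
>>> from torch import autograd
>>> x = torch.tensor(0.0, requires_grad=True)
>>> with autograd.detect_anomaly():
...     # The log branch is discarded in the forward pass, so y is a finite 0.0,
...     # but backward still multiplies its zero incoming gradient by log(0) = -inf,
...     # which yields nan and triggers the anomaly check.
...     y = torch.where(x > 0, x * torch.log(x), torch.zeros_like(x))
...     y.backward()
RuntimeError: Function 'MulBackward0' returned nan values in its 0th output.

Without detect_anomaly, the same code completes silently and simply leaves x.grad as nan, which is much harder to trace back to its source.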
Note: This article is based on the original English documentation for torch.autograd.detect_anomaly on pytorch.org; the original code is copyright its authors.