This article briefly introduces the usage of torch.autograd.detect_anomaly in Python.
Usage:
class torch.autograd.detect_anomaly
Context-manager that enables anomaly detection for the autograd engine.
This does two things:
- Running the forward pass with detection enabled allows the backward pass to print the traceback of the forward operation that created the failing backward function.
- Any backward computation that generates "nan" values will raise an error.
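The example further below demonstrates only the first behavior, so here is a minimal sketch of the second (my own illustration, not from the original article): the forward pass succeeds, but the backward formula for sqrt at zero evaluates 0/0 and produces NaN, which detect_anomaly surfaces as an error instead of letting the NaN propagate silently into the gradients.

>>> import torch
>>> x = torch.zeros(1, requires_grad=True)
>>> # Forward is clean: sqrt(0) * 0 == 0. The backward of sqrt computes
>>> # grad_out / (2 * sqrt(x)), i.e. 0 / 0 = NaN at x = 0, so
>>> # detect_anomaly raises a RuntimeError naming the sqrt backward node
>>> # (the exact message text varies across PyTorch versions).
>>> with torch.autograd.detect_anomaly():
...     y = (torch.sqrt(x) * 0).sum()
...     y.backward()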
Warning
This mode should be enabled only for debugging, as the additional checks will slow down your program execution.
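Because the context manager limits this overhead to its with block, detection stays off for the rest of the program. For whole-script debugging there is also the sibling torch.autograd.set_detect_anomaly, which toggles the same flag imperatively; a minimal sketch:

>>> import torch
>>> torch.autograd.set_detect_anomaly(True)   # enable globally while debugging
>>> # ... run the failing training step here ...
>>> torch.autograd.set_detect_anomaly(False)  # turn it back off afterwards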
Example
>>> import torch
>>> from torch import autograd
>>> class MyFunc(autograd.Function):
...     @staticmethod
...     def forward(ctx, inp):
...         return inp.clone()
...     @staticmethod
...     def backward(ctx, gO):
...         # Error during the backward pass
...         raise RuntimeError("Some error in backward")
...         return gO.clone()
>>> def run_fn(a):
...     out = MyFunc.apply(a)
...     return out.sum()
>>> inp = torch.rand(10, 10, requires_grad=True)
>>> out = run_fn(inp)
>>> out.backward()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/your/pytorch/install/torch/_tensor.py", line 93, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/your/pytorch/install/torch/autograd/__init__.py", line 90, in backward
    allow_unreachable=True)  # allow_unreachable flag
  File "/your/pytorch/install/torch/autograd/function.py", line 76, in apply
    return self._forward_cls.backward(self, *args)
  File "<stdin>", line 8, in backward
RuntimeError: Some error in backward
>>> with autograd.detect_anomaly():
...     inp = torch.rand(10, 10, requires_grad=True)
...     out = run_fn(inp)
...     out.backward()
Traceback of forward call that caused the error:
  File "tmp.py", line 53, in <module>
    out = run_fn(inp)
  File "tmp.py", line 44, in run_fn
    out = MyFunc.apply(a)
Traceback (most recent call last):
  File "<stdin>", line 4, in <module>
  File "/your/pytorch/install/torch/_tensor.py", line 93, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/your/pytorch/install/torch/autograd/__init__.py", line 90, in backward
    allow_unreachable=True)  # allow_unreachable flag
  File "/your/pytorch/install/torch/autograd/function.py", line 76, in apply
    return self._forward_cls.backward(self, *args)
  File "<stdin>", line 8, in backward
RuntimeError: Some error in backward
Related usage
- Python PyTorch det usage and code examples
- Python PyTorch deform_conv2d usage and code examples
- Python PyTorch deg2rad usage and code examples
- Python PyTorch diag usage and code examples
- Python PyTorch dirac_ usage and code examples
- Python PyTorch download_url_to_file usage and code examples
- Python PyTorch download_from_url usage and code examples
- Python PyTorch dot usage and code examples
- Python PyTorch diag_embed usage and code examples
- Python PyTorch diff usage and code examples
- Python PyTorch diagflat usage and code examples
- Python PyTorch dsplit usage and code examples
- Python PyTorch div usage and code examples
- Python PyTorch diagonal usage and code examples
- Python PyTorch dist usage and code examples
- Python PyTorch dstack usage and code examples
- Python PyTorch digamma usage and code examples
- Python PyTorch frexp usage and code examples
- Python PyTorch jvp usage and code examples
- Python PyTorch cholesky usage and code examples
- Python PyTorch vdot usage and code examples
- Python PyTorch ELU usage and code examples
- Python PyTorch ScaledDotProduct.__init__ usage and code examples
- Python PyTorch gumbel_softmax usage and code examples
- Python PyTorch get_tokenizer usage and code examples
Note: This article was selected and compiled by 純淨天空 from the original English work torch.autograd.detect_anomaly on pytorch.org. Unless otherwise stated, the copyright of the original code belongs to the original author; please do not reprint or copy this translation without permission or authorization.