This article briefly introduces the usage of torch.Tensor.is_leaf in Python.
Usage:
Tensor.is_leaf
By convention, all Tensors that have requires_grad set to False will be leaf Tensors. Tensors that have requires_grad set to True will be leaf Tensors if they were created by the user. This means they are not the result of an operation, so their grad_fn is None.
Only leaf Tensors will have their grad populated during a call to backward(). To get grad populated for non-leaf Tensors, you can use retain_grad().
Example:
>>> a = torch.rand(10, requires_grad=True)
>>> a.is_leaf
True
>>> b = torch.rand(10, requires_grad=True).cuda()
>>> b.is_leaf
False
# b was created by the operation that cast a cpu Tensor into a cuda Tensor
>>> c = torch.rand(10, requires_grad=True) + 2
>>> c.is_leaf
False
# c was created by the addition operation
>>> d = torch.rand(10).cuda()
>>> d.is_leaf
True
# d does not require gradients and so has no operation creating it (that is tracked by the autograd engine)
>>> e = torch.rand(10).cuda().requires_grad_()
>>> e.is_leaf
True
# e requires gradients and has no operations creating it
>>> f = torch.rand(10, requires_grad=True, device="cuda")
>>> f.is_leaf
True
# f requires grad, has no operation creating it
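As a small follow-up sketch of the point about grad population (the variable names x, y, z here are illustrative and not part of the original), this shows that after backward() the leaf tensor's grad is filled in automatically, while a non-leaf tensor only keeps its grad because retain_grad() was called on it:
>>> import torch
>>> x = torch.rand(3, requires_grad=True)   # x is a user-created leaf tensor
>>> y = x * 2                               # y is a non-leaf tensor (result of an operation)
>>> y.retain_grad()                         # ask autograd to also populate y.grad
>>> z = (y ** 2).sum()
>>> z.backward()
>>> x.is_leaf, y.is_leaf
(True, False)
>>> x.grad is not None
True
>>> y.grad is not None
True
# y.grad is populated only because retain_grad() was called; otherwise it would stay None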
Related usage
- Python PyTorch Tensor.imag usage and code examples
- Python PyTorch Tensor.index_copy_ usage and code examples
- Python PyTorch Tensor.item usage and code examples
- Python PyTorch Tensor.index_fill_ usage and code examples
- Python PyTorch Tensor.index_add_ usage and code examples
- Python PyTorch Tensor.unflatten usage and code examples
- Python PyTorch Tensor.register_hook usage and code examples
- Python PyTorch Tensor.storage_offset usage and code examples
- Python PyTorch Tensor.to usage and code examples
- Python PyTorch Tensor.sparse_mask usage and code examples
- Python PyTorch Tensor.unfold usage and code examples
- Python PyTorch Tensor.real usage and code examples
- Python PyTorch Tensor.refine_names usage and code examples
- Python PyTorch Tensor.rename usage and code examples
- Python PyTorch Tensor.view usage and code examples
- Python PyTorch Tensor.new_empty usage and code examples
- Python PyTorch Tensor.new_tensor usage and code examples
- Python PyTorch Tensor.scatter_ usage and code examples
- Python PyTorch Tensor.fill_diagonal_ usage and code examples
- Python PyTorch Tensor.repeat usage and code examples
- Python PyTorch Tensor.tolist usage and code examples
- Python PyTorch Tensor.put_ usage and code examples
- Python PyTorch Tensor.map_ usage and code examples
- Python PyTorch Tensor.stride usage and code examples
- Python PyTorch Tensor.align_as usage and code examples
Note: This article was curated and compiled by 純淨天空 from the original English work torch.Tensor.is_leaf on pytorch.org. Unless otherwise stated, the copyright of the original code belongs to the original author; please do not reproduce or copy this translation without permission or authorization.