This article briefly introduces the usage of torch.Tensor.is_leaf in Python.
Usage:
Tensor.is_leaf
By convention, all Tensors that have requires_grad set to False will be leaf Tensors. Tensors that have requires_grad set to True will be leaf Tensors if they were created by the user. This means they are not the result of an operation, and so their grad_fn is None.

Only leaf Tensors will have their grad populated during a call to backward(). To get grad populated for non-leaf Tensors, you can use retain_grad().

Example:
>>> a = torch.rand(10, requires_grad=True)
>>> a.is_leaf
True
>>> b = torch.rand(10, requires_grad=True).cuda()
>>> b.is_leaf
False
# b was created by the operation that cast a cpu Tensor into a cuda Tensor
>>> c = torch.rand(10, requires_grad=True) + 2
>>> c.is_leaf
False
# c was created by the addition operation
>>> d = torch.rand(10).cuda()
>>> d.is_leaf
True
# d does not require gradients and so has no operation creating it
# (that is tracked by the autograd engine)
>>> e = torch.rand(10).cuda().requires_grad_()
>>> e.is_leaf
True
# e requires gradients and has no operations creating it
>>> f = torch.rand(10, requires_grad=True, device="cuda")
>>> f.is_leaf
True
# f requires grad, has no operation creating it
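To illustrate the retain_grad() point above, here is a minimal sketch (the tensor names x and y are chosen purely for illustration): after backward(), the leaf x gets its grad as usual, while the non-leaf y has a grad only because retain_grad() was called on it first.

>>> x = torch.rand(3, requires_grad=True)  # leaf: created by the user
>>> y = x * 2                              # non-leaf: result of an operation, grad_fn is MulBackward0
>>> y.retain_grad()                        # ask autograd to also populate y.grad
>>> y.sum().backward()
>>> x.grad                                 # leaf Tensor: grad populated by backward()
tensor([2., 2., 2.])
>>> y.grad                                 # non-leaf: populated only because of retain_grad()
tensor([1., 1., 1.])

Without the retain_grad() call, y.grad would be None after backward(), and recent PyTorch versions emit a warning when accessing .grad on a non-leaf Tensor.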
Related Usage
- Python PyTorch Tensor.imag Usage and Code Examples
- Python PyTorch Tensor.index_copy_ Usage and Code Examples
- Python PyTorch Tensor.item Usage and Code Examples
- Python PyTorch Tensor.index_fill_ Usage and Code Examples
- Python PyTorch Tensor.index_add_ Usage and Code Examples
- Python PyTorch Tensor.unflatten Usage and Code Examples
- Python PyTorch Tensor.register_hook Usage and Code Examples
- Python PyTorch Tensor.storage_offset Usage and Code Examples
- Python PyTorch Tensor.to Usage and Code Examples
- Python PyTorch Tensor.sparse_mask Usage and Code Examples
- Python PyTorch Tensor.unfold Usage and Code Examples
- Python PyTorch Tensor.real Usage and Code Examples
- Python PyTorch Tensor.refine_names Usage and Code Examples
- Python PyTorch Tensor.rename Usage and Code Examples
- Python PyTorch Tensor.view Usage and Code Examples
- Python PyTorch Tensor.new_empty Usage and Code Examples
- Python PyTorch Tensor.new_tensor Usage and Code Examples
- Python PyTorch Tensor.scatter_ Usage and Code Examples
- Python PyTorch Tensor.fill_diagonal_ Usage and Code Examples
- Python PyTorch Tensor.repeat Usage and Code Examples
- Python PyTorch Tensor.tolist Usage and Code Examples
- Python PyTorch Tensor.put_ Usage and Code Examples
- Python PyTorch Tensor.map_ Usage and Code Examples
- Python PyTorch Tensor.stride Usage and Code Examples
- Python PyTorch Tensor.align_as Usage and Code Examples
Note: This article was compiled by 纯净天空 from the original English work torch.Tensor.is_leaf on pytorch.org. Unless otherwise stated, copyright of the original code belongs to the original author; please do not reprint or copy this translation without permission or authorization.