This article collects typical usage examples of the torch.autograd.Variable.double method in Python. If you are wondering how Variable.double is used in practice, the selected code examples below may help. You can also explore further usage examples of the containing class, torch.autograd.Variable.
Two code examples of the Variable.double method are shown below, sorted by popularity by default.
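Before the examples, here is a minimal sketch of what Variable.double does, assuming an older PyTorch release (0.3.x/0.4.x) where torch.autograd.Variable is still a separate wrapper type: it returns a copy of the Variable whose underlying tensor has been cast to 64-bit floating point.

import torch
from torch.autograd import Variable

x = Variable(torch.randn(3))   # wraps a FloatTensor (32-bit)
y = x.double()                 # new Variable backed by a DoubleTensor (64-bit)
print(x.data.type())           # torch.FloatTensor
print(y.data.type())           # torch.DoubleTensor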
Example 1: test_module_cast
# Required imports: import torch; import torch.nn as nn; from torch.autograd import Variable
# Method demonstrated: torch.autograd.Variable.double
def test_module_cast(self):
    """Compiled modules can be cast to other data types"""
    @torch.jit.compile(nderivs=0)
    class Adder(nn.Module):
        def __init__(self):
            super(Adder, self).__init__()
            self.y = nn.Parameter(torch.randn(2, 2))

        def forward(self, x):
            return x + self.y

    x = Variable(torch.randn(2, 2).float())
    # Wrap it in a Sequential to make sure it works for submodules
    a = nn.Sequential(Adder()).float()

    def check_type(caster):
        caster(a)
        a(caster(x))
        with self.assertCompiled(a[0]):
            a(caster(x))

    check_type(lambda x: x)
    check_type(lambda x: x.double())
    if torch.cuda.is_available():
        check_type(lambda x: x.float().cuda())
        check_type(lambda x: x.double().cuda())
    self.assertEqual(a[0].hits, 4 if torch.cuda.is_available() else 2)
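The same casting pattern works for ordinary, non-compiled modules. Below is a minimal standalone sketch (not part of the test suite above, using a hypothetical nn.Linear layer) showing why the test casts module and input together: both must live in the same dtype before the forward call.

import torch
import torch.nn as nn
from torch.autograd import Variable

layer = nn.Linear(2, 2).double()          # parameters become DoubleTensor
x = Variable(torch.randn(2, 2)).double()  # input Variable cast to double as well
out = layer(x)                            # forward pass runs in double precision
print(out.data.type())                    # torch.DoubleTensor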
Example 2: test_broadcast_subspace
# Required imports: import torch; from torch.autograd import Variable
# Method demonstrated: torch.autograd.Variable.double
def test_broadcast_subspace(self):
    a = zeros((100, 100))                          # zeros() is a helper defined in the test file
    v = Variable(torch.arange(0, 100))[:, None]    # column vector of values 0..99
    b = Variable(torch.arange(99, -1, -1).long())  # reversed row indices 99..0
    a[b] = v                                       # advanced-indexing assignment: row b[i] is filled with i
    expected = b.double().unsqueeze(1).expand(100, 100)
    self.assertEqual(a, expected)
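Example 2 exercises advanced-indexing assignment plus broadcasting; Variable.double is only used to build the expected result. The sketch below reproduces the same pattern at a smaller, hypothetical 4x4 size, with torch.zeros standing in for the test file's zeros() helper.

import torch
from torch.autograd import Variable

a = Variable(torch.zeros(4, 4).double())             # target matrix
v = Variable(torch.arange(0, 4).double())[:, None]   # column vector 0..3
b = Variable(torch.arange(3, -1, -1).long())         # reversed row indices 3..0
a[b] = v                                             # row b[i] is filled with the value i
expected = b.double().unsqueeze(1).expand(4, 4)      # row i filled with b[i]
print((a.data == expected.data).all())               # all elements match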