This article collects typical usage examples of the Python method torch.autograd.Variable.exp. If you are unsure what Variable.exp does or how to use it, the curated examples below may help; you can also explore the other methods of the torch.autograd.Variable class.
The following shows 4 code examples of Variable.exp, sorted by popularity.
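Before the examples, here is a minimal sketch of what Variable.exp computes. It assumes the older, pre-0.4 PyTorch API where Variable wraps a tensor; in current PyTorch, Variable is merged into torch.Tensor and plain x.exp() or torch.exp(x) is used:

import torch
from torch.autograd import Variable

x = Variable(torch.randn(3), requires_grad=True)
y = Variable.exp(x)       # same as x.exp() or torch.exp(x)
y.sum().backward()        # d/dx exp(x) = exp(x)
print(x.grad)             # equals the values of y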
Example 1: forward
# Required import: from torch.autograd import Variable
# Or: from torch.autograd.Variable import exp
def forward(self, x):
    # Local reparameterization: sample the layer's pre-activation directly
    # from the Gaussian induced by the noisy weights.
    if self.zero_mean:
        lrt_mean = self.op_bias(x, 0.0 * self.weight)
    else:
        lrt_mean = self.op_bias(x, self.weight)
    # Per-weight variance: alpha * w^2, with alpha = exp(log_alpha)
    sigma2 = Variable.exp(self.log_alpha) * self.weight * self.weight
    if self.permute_sigma:
        # Optionally shuffle the variances across all weight positions
        sigma2 = sigma2.view(-1)[torch.randperm(self.weight.nelement()).cuda()].view(self.weight.shape)
    lrt_std = Variable.sqrt(1e-16 + self.op_nobias(x * x, sigma2))
    if self.training:
        # Reparameterization trick: output = mean + std * eps, eps ~ N(0, 1)
        eps = Variable(lrt_std.data.new(lrt_std.size()).normal_())
    else:
        eps = 0.0
    return lrt_mean + lrt_std * eps
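Example 1 implements the local reparameterization trick: rather than sampling noisy weights, it samples the layer's output from the Gaussian those weights would induce, which lowers the variance of the gradient estimator. Below is a hedged, self-contained sketch of the same idea for a plain linear layer; op_bias/op_nobias in the original correspond to the affine op with and without bias, and all names here are illustrative, not the original class's API:

import torch
import torch.nn.functional as F
from torch.autograd import Variable

x = Variable(torch.randn(8, 16))                 # batch of inputs
weight = Variable(torch.randn(32, 16))           # layer weights
log_alpha = Variable(torch.ones(32, 16) * -3.0)  # per-weight log noise ratio

mean = F.linear(x, weight)                            # E[output]
sigma2 = Variable.exp(log_alpha) * weight * weight    # per-weight variance: alpha * w^2
std = Variable.sqrt(1e-16 + F.linear(x * x, sigma2))  # std of the pre-activation
eps = Variable(std.data.new(std.size()).normal_())    # eps ~ N(0, 1)
out = mean + std * eps                                # one sample of the noisy output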
Example 2: test_exp
# Required import: from torch.autograd import Variable
# Or: from torch.autograd.Variable import exp
def test_exp(self):
    x = Variable(torch.randn(3, 4), requires_grad=True)
    # Check that x.exp() traces and exports to an ONNX Exp node
    self.assertONNX(lambda x: x.exp(), x)
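assertONNX is an internal helper from PyTorch's test suite; the public path for the same check is torch.onnx.export. A minimal sketch (the module name ExpModel is made up for illustration):

import io
import torch
from torch.autograd import Variable

class ExpModel(torch.nn.Module):
    def forward(self, x):
        return x.exp()

x = Variable(torch.randn(3, 4), requires_grad=True)
buf = io.BytesIO()
torch.onnx.export(ExpModel(), (x,), buf)  # exported graph contains an Exp node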
Example 3: kl_loguni
# Required import: from torch.autograd import Variable
# Or: from torch.autograd.Variable import exp
def kl_loguni(log_alpha):
    # Sigmoid-polynomial approximation of the negative KL divergence between
    # the Gaussian posterior and a log-uniform prior; the constants come from
    # Molchanov et al., "Variational Dropout Sparsifies Deep Neural Networks"
    k1, k2, k3 = 0.63576, 1.8732, 1.48695
    C = -k1
    mdkl = k1 * F.sigmoid(k2 + k3 * log_alpha) - 0.5 * Variable.log1p(Variable.exp(-log_alpha)) + C
    kl = -Variable.sum(mdkl)
    return kl
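A typical use is to add this penalty to the training loss, often scaled by a warm-up factor. A small usage sketch under the same old Variable API, assuming kl_loguni and its import torch.nn.functional as F are in scope (names here are illustrative):

import torch
from torch.autograd import Variable

log_alpha = Variable(torch.ones(32, 16) * -3.0, requires_grad=True)
penalty = kl_loguni(log_alpha)   # scalar Variable
penalty.backward()               # gradients flow back into log_alpha
print(penalty.data)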
Example 4: kl_ard
# Required import: from torch.autograd import Variable
# Or: from torch.autograd.Variable import exp
def kl_ard(log_alpha):
    # KL to an ARD Gaussian prior: 0.5 * sum(log(1 + 1/alpha)), alpha = exp(log_alpha)
    return 0.5 * Variable.sum(Variable.log1p(Variable.exp(-log_alpha)))
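The log1p(exp(-log_alpha)) term is a numerically stable way to compute log(1 + 1/alpha) with alpha = exp(log_alpha); a quick scalar check of that identity:

import math

log_alpha = -2.0
alpha = math.exp(log_alpha)
stable = 0.5 * math.log1p(math.exp(-log_alpha))  # form used in kl_ard
direct = 0.5 * math.log(1.0 + 1.0 / alpha)       # textbook form
assert abs(stable - direct) < 1e-12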