This article collects typical usage examples of the `torch.nn.CELU` attribute in Python. If you are unsure what `nn.CELU` does or how to use it, the selected code example below may help. You can also read further about the module it belongs to, `torch.nn`.
One code example using the `nn.CELU` attribute is shown below.
Example 1: get_AF
# Required module import: from torch import nn
# This example uses: torch.nn.CELU
def get_AF(af_str):
"""
Given the string identifier, get PyTorch-supported activation function.
"""
if af_str == 'R':
return nn.ReLU() # ReLU(x)=max(0,x)
elif af_str == 'LR':
return nn.LeakyReLU() # LeakyReLU(x)=max(0,x)+negative_slope∗min(0,x)
elif af_str == 'RR':
        return nn.RReLU() # the randomized leaky rectified linear unit function
elif af_str == 'E': # ELU(x)=max(0,x)+min(0,α∗(exp(x)−1))
return nn.ELU()
elif af_str == 'SE': # SELU(x)=scale∗(max(0,x)+min(0,α∗(exp(x)−1)))
return nn.SELU()
elif af_str == 'CE': # CELU(x)=max(0,x)+min(0,α∗(exp(x/α)−1))
return nn.CELU()
elif af_str == 'S':
return nn.Sigmoid()
elif af_str == 'SW':
#return SWISH()
raise NotImplementedError
elif af_str == 'T':
return nn.Tanh()
    elif af_str == 'ST': # a kind of normalization
        return nn.Softmax(dim=-1) # rescales the input along `dim` so the outputs lie in (0, 1) and sum to 1; nn.Softmax is the module form of F.softmax
elif af_str == 'EP':
#return Exp()
raise NotImplementedError
    else:
        raise NotImplementedError(f'unknown activation identifier: {af_str}')
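Since this page focuses on `nn.CELU`, the formula in the comment above (CELU(x) = max(0, x) + min(0, α∗(exp(x/α)−1)), with PyTorch's default α = 1.0) can be sketched in plain Python without a torch dependency; this is an illustrative re-implementation, not PyTorch's own code:

```python
import math

def celu(x, alpha=1.0):
    """CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))."""
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

# Positive inputs pass through unchanged; negative inputs
# saturate smoothly toward -alpha as x -> -inf.
print(celu(2.0))   # 2.0
print(celu(-1.0))  # alpha * (exp(-1) - 1), about -0.632
```

With α = 1 this coincides with ELU; unlike ELU, CELU stays continuously differentiable at x = 0 for every α, which is the point of the parameterization.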