This article collects typical usage examples of torch.nn.Hardshrink in Python. If you are wondering what nn.Hardshrink does, how to use it, or what real code that uses it looks like, the curated examples below may help. You can also explore further usage examples from torch.nn, the module it belongs to.
Two code examples of nn.Hardshrink are shown below, sorted by popularity by default.
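For reference, Hardshrink keeps inputs whose magnitude exceeds a threshold lambd (0.5 by default) and zeroes everything else. A minimal sketch of the behavior:

import torch
from torch import nn

hardshrink = nn.Hardshrink(lambd=0.5)
x = torch.tensor([-1.0, -0.3, 0.0, 0.4, 2.0])
print(hardshrink(x))  # tensor([-1., 0., 0., 0., 2.]) -- values with |x| <= 0.5 become 0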
Example 1: create_str_to_activations_converter
# Required module import: from torch import nn [as alias]
# Or: from torch.nn import Hardshrink [as alias]
def create_str_to_activations_converter(self):
    """Creates a dictionary which converts strings to activations"""
    str_to_activations_converter = {"elu": nn.ELU(), "hardshrink": nn.Hardshrink(), "hardtanh": nn.Hardtanh(),
                                    "leakyrelu": nn.LeakyReLU(), "logsigmoid": nn.LogSigmoid(), "prelu": nn.PReLU(),
                                    "relu": nn.ReLU(), "relu6": nn.ReLU6(), "rrelu": nn.RReLU(), "selu": nn.SELU(),
                                    "sigmoid": nn.Sigmoid(), "softplus": nn.Softplus(), "logsoftmax": nn.LogSoftmax(),
                                    "softshrink": nn.Softshrink(), "softsign": nn.Softsign(), "tanh": nn.Tanh(),
                                    "tanhshrink": nn.Tanhshrink(), "softmin": nn.Softmin(), "softmax": nn.Softmax(dim=1),
                                    "none": None}
    return str_to_activations_converter
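In practice, such a converter lets activation functions be chosen from a configuration string. A minimal sketch of a caller in the same class (the method name and arguments here are hypothetical, not part of the original):

def build_hidden_layer(self, in_dim, out_dim, activation_name):
    # Hypothetical helper: looks up the activation by its string key
    converter = self.create_str_to_activations_converter()
    activation = converter[activation_name]  # e.g. "hardshrink" -> nn.Hardshrink()
    if activation is None:  # the "none" key maps to no activation
        return nn.Linear(in_dim, out_dim)
    return nn.Sequential(nn.Linear(in_dim, out_dim), activation)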
Example 2: __init__
# Required module import: from torch import nn [as alias]
# Or: from torch.nn import Hardshrink [as alias]
# Also needed: import torch; from torch.nn import Parameter
def __init__(self, mem_dim, fea_dim, shrink_thres=0.0025):
    super(MemoryUnit, self).__init__()
    self.mem_dim = mem_dim  # number of memory slots (M)
    self.fea_dim = fea_dim  # feature dimension of each slot (C)
    self.weight = Parameter(torch.Tensor(self.mem_dim, self.fea_dim))  # learnable memory, M x C
    self.bias = None
    self.shrink_thres = shrink_thres  # threshold for hard shrinkage of the addressing weights
    # self.hard_sparse_shrink_opt = nn.Hardshrink(lambd=shrink_thres)
    self.reset_parameters()
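The commented-out nn.Hardshrink line hints at how shrink_thres is meant to be used: addressing weights over the memory slots are sparsified by hard shrinkage and then re-normalized. A minimal forward-pass sketch under that assumption (the softmax addressing and L1 re-normalization are inferred from this pattern, not shown in the original):

import torch.nn.functional as F

def forward(self, x):
    # x: N x C input features; attend over the M memory slots
    att = F.linear(x, self.weight)  # N x M similarity scores (x @ weight.T)
    att = F.softmax(att, dim=1)     # soft addressing weights
    if self.shrink_thres > 0:
        att = nn.Hardshrink(lambd=self.shrink_thres)(att)  # zero out small weights
        att = F.normalize(att, p=1, dim=1)                  # re-normalize to sum to 1
    return F.linear(att, self.weight.t())  # N x C reconstruction from memory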