

Python functions.elu method: code examples

This article collects typical usage examples of the chainer.functions.elu method in Python. If you are wondering what functions.elu does, how to call it, or what real-world uses look like, the curated code samples below should help. You can also explore further usage examples from chainer.functions, the module this method belongs to.


The sections below present 6 code examples of the functions.elu method, ordered by popularity by default.
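Before the collected examples, here is a minimal standalone sketch of calling chainer.functions.elu directly (assuming chainer and numpy are installed; the input values are purely illustrative):

import numpy as np
import chainer.functions as F

# ELU(x) = x for x > 0, alpha * (exp(x) - 1) for x <= 0
x = np.array([-2.0, -0.5, 0.0, 1.5], dtype=np.float32)
y = F.elu(x, alpha=1.0)   # returns a chainer.Variable
print(y.array)            # negative inputs are squashed toward -alpha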

Example 1: __init__

# Required import: from chainer import functions [as alias]
# Or: from chainer.functions import elu [as alias]
def __init__(self,
                 in_channels,
                 out_channels):
        super(LwopEncoderFinalBlock, self).__init__()
        with self.init_scope():
            self.pre_conv = conv1x1_block(
                in_channels=in_channels,
                out_channels=out_channels,
                use_bias=True,
                use_bn=False)
            self.body = SimpleSequential()
            with self.body.init_scope():
                for i in range(3):
                    setattr(self.body, "block{}".format(i + 1), dwsconv3x3_block(
                        in_channels=out_channels,
                        out_channels=out_channels,
                        use_bn=False,
                        dw_activation=(lambda: F.elu),   # activations are supplied as zero-argument factories
                        pw_activation=(lambda: F.elu)))
            self.post_conv = conv3x3_block(
                in_channels=out_channels,
                out_channels=out_channels,
                use_bias=True,
                use_bn=False) 
Developer: osmr, Project: imgclsmob, Lines of code: 26, Source file: lwopenpose_cmupan.py

Example 2: forward

# Required import: from chainer import functions [as alias]
# Or: from chainer.functions import elu [as alias]
def forward(self, inputs):
        # Input shape: [batch_size, num_nodes, feature_dims]
        batch_size, num_nodes = inputs.shape[:2]
        inputs = inputs.reshape(batch_size * num_nodes, -1)
        # New shape: [batch_size * num_nodes, feature_dims]

        x = F.elu(self.fc1(inputs))
        x = F.dropout(x, self.dropout_prob)
        x = F.elu(self.fc2(x))
        x = self.bn(x)

        return x.reshape(batch_size, num_nodes, -1) 
Developer: chainer, Project: models, Lines of code: 14, Source file: mlp.py

Example 3: make_q_func

# Required import: from chainer import functions [as alias]
# Or: from chainer.functions import elu [as alias]
def make_q_func(self, env):
        n_hidden_channels = 10
        return StatelessRecurrentSequential(
            L.Linear(env.observation_space.low.size, n_hidden_channels),
            F.elu,
            L.NStepRNNTanh(1, n_hidden_channels, n_hidden_channels, 0),
            L.Linear(n_hidden_channels, env.action_space.n),
            DiscreteActionValue,
        ) 
Developer: chainer, Project: chainerrl, Lines of code: 11, Source file: basetest_dqn_like.py

Example 4: forward

# Required import: from chainer import functions [as alias]
# Or: from chainer.functions import elu [as alias]
def forward(self, inputs, device):
        x, = inputs
        return functions.elu(x, alpha=self.alpha), 
Developer: chainer, Project: chainer, Lines of code: 5, Source file: test_elu.py

Example 5: __call__

# Required import: from chainer import functions [as alias]
# Or: from chainer.functions import elu [as alias]
def __call__(self, x, test=False):
        h = self.b1(F.elu(self.c1(x)), test=test)
        h = self.b2(F.elu(self.c2(h)), test=test)
        h = self.b3(F.elu(self.c3(h)), test=test)
        h = self.r1(h, test=test)
        h = self.r2(h, test=test)
        h = self.r3(h, test=test)
        h = self.r4(h, test=test)
        h = self.r5(h, test=test)
        h = self.b4(F.elu(self.d1(h)), test=test)
        h = self.b5(F.elu(self.d2(h)), test=test)
        y = self.d3(h)
        return (F.tanh(y)+1)*127.5  # rescale the tanh output from [-1, 1] to the [0, 255] pixel range
Developer: yusuketomoto, Project: chainer-fast-neuralstyle, Lines of code: 15, Source file: net.py

Example 6: selu

# Required import: from chainer import functions [as alias]
# Or: from chainer.functions import elu [as alias]
def selu(x):
    # Standard SELU constants (Klambauer et al., "Self-Normalizing Neural Networks")
    alpha = float(1.6732632423543772848170429916717)
    scale = float(1.0507009873554804934193349852946)
    return scale * F.elu(x, alpha=alpha)
Developer: Aixile, Project: chainer-gan-experiments, Lines of code: 6, Source file: ops.py
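
A minimal usage sketch for the selu helper from Example 6 (the input array is illustrative; assumes numpy and chainer are available):

import numpy as np

x = np.array([-1.0, 0.0, 2.0], dtype=np.float32)
y = selu(x)     # elementwise scale * elu(x, alpha) with the standard SELU constants
print(y.array)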

