

Python layers.Layer Code Examples

This article collects typical usage examples of lasagne.layers.Layer in Python. If you are wondering what layers.Layer is for or how to use it in practice, the hand-picked code examples below may help. You can also explore further usage examples from the lasagne.layers module, where it is defined.


Three code examples involving layers.Layer are shown below, sorted by popularity by default.

Example 1: __init__

# Required import: from lasagne import layers [as alias]
# Or: from lasagne.layers import Layer [as alias]
# Additional import used by this snippet:
import warnings

# Note: this __init__ belongs to the PaddedPool2DLayer class (a Pool2DLayer
# subclass), as the super() call below indicates.
def __init__(self, incoming, pool_size, stride=None, pad=(0, 0),
                 ignore_border=True, centered=True, **kwargs):
        """A padded pooling layer

        Parameters
        ----------
        incoming : lasagne.layers.Layer
            The input layer
        pool_size : int
            The size of the pooling
        stride : int or iterable of int
            The stride or subsampling of the pooling
        pad : int, iterable of int, ``full``, ``same`` or ``valid``
            **Ignored!** Kept for compatibility with the
            :class:`lasagne.layers.Pool2DLayer`
        ignore_border : bool
            See :class:`lasagne.layers.Pool2DLayer`
        centered : bool
            If True, the padding will be added on both sides. If False,
            the zero padding will be applied on the upper left side.
        **kwargs
            Any additional keyword arguments are passed to the Layer
            superclass
        """
        self.centered = centered
        if pad not in [0, (0, 0), [0, 0]]:
            warnings.warn('The specified padding will be ignored',
                          RuntimeWarning)
        super(PaddedPool2DLayer, self).__init__(incoming,
                                                pool_size,
                                                stride,
                                                pad,
                                                ignore_border,
                                                **kwargs)
        if self.input_shape[2:] != (None, None):
            warnings.warn('This Layer should only be used when the size of '
                          'the image is not known', RuntimeWarning) 
Author: fvisin | Project: reseg | Lines of code: 39 | Source file: padded.py
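
The snippet above only shows the constructor. Below is a minimal usage sketch, assuming the full PaddedPool2DLayer class (a Pool2DLayer subclass defined in padded.py) can be imported as shown; the import path and layer shapes are illustrative assumptions rather than code from the reseg project.

# Usage sketch (assumption: PaddedPool2DLayer lives in padded.py and
# subclasses lasagne.layers.Pool2DLayer, as the super() call suggests).
from lasagne.layers import InputLayer

from padded import PaddedPool2DLayer  # hypothetical import path

# Spatial dimensions are left as None (unknown at build time), which is
# the case this layer is intended for according to the warning above.
l_in = InputLayer((None, 3, None, None))
l_pool = PaddedPool2DLayer(l_in, pool_size=2, stride=2, centered=True)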

Example 2: get_equivalent_input_padding

# Required import: from lasagne import layers [as alias]
# Or: from lasagne.layers import Layer [as alias]
# Additional imports used by this snippet:
import lasagne
import theano.tensor as T
from lasagne.layers import get_all_layers
def get_equivalent_input_padding(layer, layers_args=None):
    """Compute the equivalent padding in the input layer

    A function to compute the equivalent padding of a sequence of
    convolutional and pooling layers. It records the padding of all the
    layers up to the first InputLayer, then computes what the equivalent
    padding would be in the layer immediately preceding the chain of
    layers under consideration.
    """
    if layers_args is None:
        layers_args = []
    # Initialize the DynamicPadding layers
    lasagne.layers.get_output(layer)
    # Loop through conv and pool to collect data
    all_layers = get_all_layers(layer)
    # while(not isinstance(layer, (InputLayer))):
    for layer in all_layers:
        # Note: stride is numerical, but pad *could* be symbolic
        try:
            pad, stride = (layer.pad, layer.stride)
            if isinstance(pad, int):
                pad = pad, pad
            if isinstance(stride, int):
                stride = stride, stride
            layers_args.append((pad, stride))
        except AttributeError:
            pass

    # Loop backward to compute the equivalent padding in the input
    # layer
    tot_pad = T.zeros(2)
    pad_factor = T.ones(2)
    while layers_args:
        pad, stride = layers_args.pop()
        tot_pad += pad * pad_factor
        pad_factor *= stride

    return tot_pad 
Author: fvisin | Project: reseg | Lines of code: 38 | Source file: padded.py
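
Below is a sketch of how get_equivalent_input_padding might be called on a small network; the layer stack and shapes are illustrative assumptions, not code from the reseg project. Since the result is built from Theano expressions, it has to be evaluated to obtain concrete numbers.

# Illustrative usage (assumed network; not taken from the reseg project).
from lasagne.layers import Conv2DLayer, InputLayer, Pool2DLayer

l_in = InputLayer((None, 3, None, None))
l_conv = Conv2DLayer(l_in, num_filters=16, filter_size=3, pad=1)
l_pool = Pool2DLayer(l_conv, pool_size=2)

# get_equivalent_input_padding (defined above) returns a symbolic Theano
# vector; eval() turns it into a concrete length-2 NumPy array with the
# per-dimension padding.
tot_pad = get_equivalent_input_padding(l_pool)
print(tot_pad.eval())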

Example 3: batch_norm

# Required import: from lasagne import layers [as alias]
# Or: from lasagne.layers import Layer [as alias]
# Additional imports used by this snippet:
from lasagne import nonlinearities
from lasagne.layers import BatchNormLayer
def batch_norm(layer, **kwargs):
    """
    Apply batch normalization to an existing layer. This is a convenience
    function modifying an existing layer to include batch normalization: It
    will steal the layer's nonlinearity if there is one (effectively
    introducing the normalization right before the nonlinearity), remove
    the layer's bias if there is one (because it would be redundant), and add
    a :class:`BatchNormLayer` and :class:`NonlinearityLayer` on top.

    Parameters
    ----------
    layer : A :class:`Layer` instance
        The layer to apply the normalization to; note that it will be
        irreversibly modified as specified above
    **kwargs
        Any additional keyword arguments are passed on to the
        :class:`BatchNormLayer` constructor.

    Returns
    -------
    BatchNormLayer or NonlinearityLayer instance
        A batch normalization layer stacked on the given modified `layer`, or
        a nonlinearity layer stacked on top of both if `layer` was nonlinear.

    Examples
    --------
    Just wrap any layer into a :func:`batch_norm` call on creating it:

    >>> from lasagne.layers import InputLayer, DenseLayer, batch_norm
    >>> from lasagne.nonlinearities import tanh
    >>> l1 = InputLayer((64, 768))
    >>> l2 = batch_norm(DenseLayer(l1, num_units=500, nonlinearity=tanh))

    This introduces batch normalization right before its nonlinearity:

    >>> from lasagne.layers import get_all_layers
    >>> [l.__class__.__name__ for l in get_all_layers(l2)]
    ['InputLayer', 'DenseLayer', 'BatchNormLayer', 'NonlinearityLayer']
    """
    nonlinearity = getattr(layer, 'nonlinearity', None)
    if nonlinearity is not None:
        layer.nonlinearity = nonlinearities.identity
    if hasattr(layer, 'b') and layer.b is not None:
        del layer.params[layer.b]
        layer.b = None
    layer = BatchNormLayer(layer, **kwargs)
    if nonlinearity is not None:
        from lasagne.layers import NonlinearityLayer
        layer = NonlinearityLayer(layer, nonlinearity)
    return layer 
Author: SBU-BMI | Project: u24_lymphocyte | Lines of code: 46 | Source file: batch_norms.py
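
Building on the doctest in the docstring, the sketch below shows one way to check what batch_norm does to the wrapped layer. It uses the batch_norm defined above with the same illustrative shapes as the docstring example; the printed parameter names assume the standard Lasagne BatchNormLayer.

# Verification sketch; uses the batch_norm defined above and the same
# illustrative shapes as the docstring example.
from lasagne.layers import DenseLayer, InputLayer, get_all_params
from lasagne.nonlinearities import tanh

l_in = InputLayer((64, 768))
l_dense = DenseLayer(l_in, num_units=500, nonlinearity=tanh)
l_bn = batch_norm(l_dense)

print(l_dense.b)             # None: the redundant bias was removed
print(l_dense.nonlinearity)  # now the identity; the tanh moved to a NonlinearityLayer
# Parameter names include 'W' plus the BatchNormLayer parameters
# ('beta', 'gamma', 'mean', 'inv_std'), assuming the standard Lasagne layer.
print([p.name for p in get_all_params(l_bn)])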


Note: The lasagne.layers.Layer examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors, and redistribution or use should follow the corresponding project's license. Do not reproduce without permission.