

Python layers.RNN Attribute Code Examples

This article collects typical usage examples of the Python keras.layers.RNN attribute. If you are wondering what layers.RNN does, or how to use it in your own code, the curated examples below may help. You can also explore further usage examples from the keras.layers module.


Five code examples of the layers.RNN attribute are shown below, ordered by popularity by default.
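
Before the project examples, here is a minimal, hedged sketch of the layer itself (not taken from any of the projects below; the input shape and unit counts are illustrative): keras.layers.RNN wraps a recurrent cell, or a list of cells, and iterates it over the time dimension of its input.

# Minimal illustration with assumed shapes
from keras.layers import Input, RNN, LSTMCell
from keras.models import Model

inputs = Input(shape=(10, 8))            # 10 timesteps, 8 features per step
outputs = RNN(LSTMCell(32))(inputs)      # last hidden state only -> shape (batch, 32)
model = Model(inputs, outputs)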

Example 1: __init__

# Required import: from keras import layers [as alias]
# Or: from keras.layers import RNN [as alias]
from keras.layers import LSTMCell, GRUCell, SimpleRNNCell  # cells used below
def __init__(self, layers, cell_type, cell_params):
        """
        Build the rnn with the given number of layers.
        :param layers: list
            list of integers. The i-th element of the list is the number of hidden neurons for the i-th layer.
        :param cell_type: 'gru', 'rnn', 'lstm'
        :param cell_params: dict
            A dictionary containing all the parameters for the RNN cell.
            See keras.layers.LSTMCell, keras.layers.GRUCell or keras.layers.SimpleRNNCell for more details.
        """
        # init params
        self.model = None
        self.horizon = None
        self.layers = layers
        self.cell_params = cell_params
        if cell_type == 'lstm':
            self.cell = LSTMCell
        elif cell_type == 'gru':
            self.cell = GRUCell
        elif cell_type == 'rnn':
            self.cell = SimpleRNNCell
        else:
            raise NotImplementedError('{0} is not a valid cell type.'.format(cell_type))
        # Build deep rnn
        self.rnn = self._build_rnn() 
Author: albertogaspar | Project: dts | Source file: Recurrent.py
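
A hedged instantiation sketch for this constructor (the class name RecurrentNN is only a placeholder for the project's wrapper class; the parameter values are illustrative): cell_type selects the Keras cell class, and cell_params is forwarded to every cell.

# Hypothetical usage; RecurrentNN stands in for the enclosing class
rnn_wrapper = RecurrentNN(
    layers=[64, 32],                 # two stacked layers with 64 and 32 hidden units
    cell_type='lstm',                # -> keras.layers.LSTMCell
    cell_params={'dropout': 0.1})    # forwarded to every cell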

Example 2: _build_rnn

# Required import: from keras import layers [as alias]
# Or: from keras.layers import RNN [as alias]
from keras.layers import RNN  # wrapper layer used below
def _build_rnn(self):
        # Build one cell per entry in self.layers: each entry is that layer's
        # hidden size (see __init__), overriding any 'units' in cell_params.
        cells = []
        for units in self.layers:
            cells.append(self.cell(**dict(self.cell_params, units=units)))
        deep_rnn = RNN(cells, return_sequences=False, return_state=False)
        return deep_rnn 
Author: albertogaspar | Project: dts | Source file: Recurrent.py
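
A hedged usage sketch of the resulting layer (input shape and variable names are assumptions, not from Recurrent.py): a list of cells passed to keras.layers.RNN is stacked into a single multi-layer recurrent layer, and with return_sequences=False it returns only the final hidden state of the last layer.

# Illustrative shapes; not part of the dts project
from keras.layers import Input, RNN, GRUCell
from keras.models import Model

inputs = Input(shape=(24, 5))                  # e.g. 24 timesteps, 5 features
deep_rnn = RNN([GRUCell(64), GRUCell(32)])     # two stacked GRU cells
last_state = deep_rnn(inputs)                  # shape (batch, 32): last layer, final timestep
model = Model(inputs, last_state)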

Example 3: __init__

# Required import: from keras import layers [as alias]
# Or: from keras.layers import RNN [as alias]
from keras.layers import LSTMCell, GRUCell  # cells used below
def __init__(self,
                 encoder_layers,
                 decoder_layers,
                 output_sequence_length,
                 dropout=0.0,
                 l2=0.01,
                 cell_type='lstm'):
        """
        :param encoder_layers: list
            encoder (RNN) architecture: [n_hidden_units_1st_layer, n_hidden_units_2nd_layer, ...]
        :param decoder_layers: list
            decoder (RNN) architecture: [n_hidden_units_1st_layer, n_hidden_units_2nd_layer, ...]
        :param output_sequence_length: int
            number of timesteps to be predicted.
        :param dropout: float
            dropout rate applied inside each recurrent cell.
        :param l2: float
            L2 regularization factor for the kernel and recurrent weights of each cell.
        :param cell_type: str
            'gru' or 'lstm'.
        """
        self.encoder_layers = encoder_layers
        self.decoder_layers = decoder_layers
        self.output_sequence_length = output_sequence_length
        self.dropout = dropout
        self.l2 = l2
        if cell_type == 'lstm':
            self.cell = LSTMCell
        elif cell_type == 'gru':
            self.cell = GRUCell
        else:
            raise ValueError('{0} is not a valid cell type. Choose between gru and lstm.'.format(cell_type)) 
Author: albertogaspar | Project: dts | Source file: Seq2Seq.py
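
A hedged instantiation sketch for this constructor (the class name Seq2SeqRNN is only a placeholder for the project's class; the values are illustrative):

# Hypothetical usage; Seq2SeqRNN stands in for the enclosing class
seq2seq = Seq2SeqRNN(
    encoder_layers=[64, 32],          # two stacked encoder cells with 64 and 32 units
    decoder_layers=[64, 32],          # mirrored decoder architecture
    output_sequence_length=24,        # e.g. predict the next 24 timesteps
    dropout=0.1,
    l2=0.01,
    cell_type='lstm')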

Example 4: _build_encoder

# Required import: from keras import layers [as alias]
# Or: from keras.layers import RNN [as alias]
from keras.layers import RNN
from keras.regularizers import l2
def _build_encoder(self):
        """
        Build the encoder multilayer RNN (stacked RNN)
        """
        # Create a list of RNN cells; passing them together to RNN stacks them
        # into a single, efficient multi-layer (stacked) RNN.
        encoder_cells = []
        for n_hidden_neurons in self.encoder_layers:
            encoder_cells.append(self.cell(units=n_hidden_neurons,
                                           dropout=self.dropout,
                                           kernel_regularizer=l2(self.l2),
                                           recurrent_regularizer=l2(self.l2)))

        self.encoder = RNN(encoder_cells, return_state=True, name='encoder') 
Author: albertogaspar | Project: dts | Source file: Seq2Seq.py
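
A hedged sketch of calling such an encoder (the input shape is an assumption, not taken from Seq2Seq.py): with return_state=True the layer returns the last output followed by the final state tensors of all stacked cells, which can later seed a decoder.

# Illustrative only
from keras.layers import Input, RNN, LSTMCell

encoder_inputs = Input(shape=(None, 6))        # variable-length sequences, 6 features
encoder = RNN([LSTMCell(32), LSTMCell(32)], return_state=True, name='encoder')
encoder_results = encoder(encoder_inputs)
encoder_output = encoder_results[0]            # output at the final timestep
encoder_states = encoder_results[1:]           # final states of the stacked cells (h and c per LSTMCell)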

Example 5: _build_decoder

# Required import: from keras import layers [as alias]
# Or: from keras.layers import RNN [as alias]
from keras.layers import RNN
from keras.regularizers import l2
def _build_decoder(self):
        decoder_cells = []
        for n_hidden_neurons in self.decoder_layers:
            decoder_cells.append(self.cell(units=n_hidden_neurons,
                                           dropout=self.dropout,
                                           kernel_regularizer=l2(self.l2),
                                           recurrent_regularizer=l2(self.l2)
                                           ))
        # return an output for EACH timestep (return_sequences=True)
        self.decoder = RNN(decoder_cells, return_sequences=True, return_state=True, name='decoder') 
Author: albertogaspar | Project: dts | Source file: Seq2Seq.py
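
A hedged sketch of how such an encoder and decoder are typically wired into a Keras model (shapes, names, and the Dense output head are illustrative assumptions; the actual model assembly in the dts project lives outside these snippets): the decoder receives the encoder's final states via initial_state and, with return_sequences=True, emits one output per decoder timestep.

# Illustrative wiring; not taken from Seq2Seq.py
from keras.layers import Input, RNN, LSTMCell, Dense, TimeDistributed
from keras.models import Model

encoder_inputs = Input(shape=(None, 6))        # illustrative shapes
decoder_inputs = Input(shape=(None, 1))

encoder = RNN([LSTMCell(32)], return_state=True, name='encoder')
decoder = RNN([LSTMCell(32)], return_sequences=True, return_state=True, name='decoder')

encoder_states = encoder(encoder_inputs)[1:]                            # keep only the final states
decoder_outputs = decoder(decoder_inputs, initial_state=encoder_states)[0]

predictions = TimeDistributed(Dense(1))(decoder_outputs)                # one value per output timestep
model = Model([encoder_inputs, decoder_inputs], predictions)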


Note: The keras.layers.RNN examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are selected from open-source projects and their copyright belongs to the original authors; consult each project's License before distributing or reusing the code. Do not reproduce this article without permission.