

Python Layer.build Method Code Examples

This article collects typical usage examples of the Python method layer.Layer.build. If you are unsure how to use Layer.build in practice, the curated code example below may help. You can also explore further usage examples of the layer.Layer class.


The following shows 1 code example of the Layer.build method.

Example 1: NeuralNetwork

# Required module: from layer import Layer [as alias]
# Alternatively: from layer.Layer import build [as alias]
import tensorflow as tf  # needed by the snippet below (TensorFlow 1.x API)
from layer import Layer
class NeuralNetwork(object):
    """
    A Neural network class with a scikit-compliant interface
    """

    cost_functions  = ['log-likelihood', 'cross-entropy']
    regularizations = ['l1', 'l2', 'none']


    ## --------------------------------------------
    def __init__(
        self,
        hidden_layers = (),  ## immutable default; a shared mutable [] would leak state across instances
        learning_algorithm='Adam',
        output_activation='softmax',
        cost_function='cross-entropy',
        regularization='none',
        reg_lambda=1.0,
        learning_rate=0.1,
        early_stopping=False,
        stagnation=10,
        n_epochs=10,
        mini_batch_size=10,
        ):
        """
        Constructor
        """

        ## List of Layer objects, the input and output layers are automatically specified
        ## using the input data and the output_activation parameter
        self.hidden_layers = hidden_layers

        ## Parameters from the constructor
        self.learning_algorithm = learning_algorithm

        assert cost_function in self.cost_functions, 'Available cost functions are {0}'.format(', '.join(self.cost_functions))
        self.cost_function      = cost_function
        self.output_activation  = output_activation

        assert regularization in self.regularizations, 'Available regularizations are {0}'.format(', '.join(self.regularizations))
        self.regularization     = regularization
        self.reg_lambda         = reg_lambda

        self.learning_rate      = learning_rate

        self.early_stopping     = early_stopping
        self.stagnation         = stagnation

        self.n_epochs           = n_epochs
        self.mini_batch_size    = mini_batch_size



    ## --------------------------------------------
    def build(self, X, y):
        """
        Builds the neural network in tensorflow
        """

        ## Start a tensorflow interactive session
        self.session = tf.InteractiveSession()

        ## First, create a placeholder for the targets
        self.targets = tf.placeholder(tf.float32, shape=[None, self.n_categories])

        ## Next, create the input layer
        self.input_layer = Layer(n_neurons=self.n_features)
        self.input_layer.build()

        ## Then create all the hidden layers
        current_input_layer = self.input_layer

        for layer in self.hidden_layers:
            layer.build(current_input_layer)
            current_input_layer = layer

        ## Create the output layer
        self.output_layer = Layer(n_neurons=self.n_categories, activation=self.output_activation)
        self.output_layer.build(current_input_layer)

        ## Define the cost function
        self.cost = None
        if self.cost_function == 'log-likelihood':
            self.cost = tf.reduce_mean(-tf.log(tf.reduce_sum(self.targets * self.output_layer.output, axis=[1])))
        else:
            ## 'cross-entropy'
            self.cost = tf.reduce_mean(-tf.reduce_sum(self.targets * tf.log(self.output_layer.output), axis=[1]))

        ## Define the regularization parameters and function
        self.reg_lambda_param = tf.placeholder(tf.float32)
        self.batch_size       = tf.placeholder(tf.float32)

        if self.regularization == 'l1':
            self.reg_term = tf.reduce_sum(tf.abs(self.output_layer.weights))
            for layer in self.hidden_layers:
                self.reg_term += tf.reduce_sum(tf.abs(layer.weights))

        elif self.regularization == 'l2':
            self.reg_term = tf.reduce_sum(self.output_layer.weights * self.output_layer.weights)
            for layer in self.hidden_layers:
                self.reg_term += tf.reduce_sum(layer.weights * layer.weights)
# ......... remainder of the method omitted .........
Author: mrunfeldt | Project: scikit-tfnn | Lines: 103 | Source file: neuralnetwork.py
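The build method above wires layers sequentially: each layer's build() receives the previously built layer, so the network chains from input to output. The following framework-free sketch illustrates that wiring pattern. MockLayer is a hypothetical stand-in for the real layer.Layer class (whose internals are not shown in the source), used only to make the chaining visible.

```python
import random

class MockLayer:
    """Hypothetical stand-in for layer.Layer, illustrating the build() chaining."""

    def __init__(self, n_neurons, activation='relu'):
        self.n_neurons = n_neurons
        self.activation = activation
        self.weights = None  # filled in by build()

    def build(self, input_layer=None):
        if input_layer is None:
            # Input layer: no weights, it only declares its width
            return
        # Dense weight matrix shaped (n_inputs, n_neurons)
        n_in = input_layer.n_neurons
        self.weights = [[random.gauss(0.0, 0.1) for _ in range(self.n_neurons)]
                        for _ in range(n_in)]

# Wire up input -> hidden -> output, mirroring NeuralNetwork.build
input_layer = MockLayer(n_neurons=4)
input_layer.build()

hidden_layers = [MockLayer(8), MockLayer(6)]
current = input_layer
for layer in hidden_layers:
    layer.build(current)
    current = layer

output_layer = MockLayer(n_neurons=3, activation='softmax')
output_layer.build(current)

print(len(output_layer.weights))     # rows: one per neuron of the last hidden layer
print(len(output_layer.weights[0]))  # columns: one per output category
```

The key design point is that `current` always references the most recently built layer, so each call to build() only needs to know its immediate predecessor.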
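The two cost functions defined in build() look different but coincide when the targets are one-hot vectors: summing targets * output picks out the predicted probability of the true class, so -log of that sum equals -sum(targets * log(output)). A small plain-Python check, written without TensorFlow and using made-up numbers for illustration:

```python
import math

targets = [0.0, 1.0, 0.0]   # one-hot: true class is index 1
output  = [0.1, 0.7, 0.2]   # softmax probabilities

# 'log-likelihood': -log(sum(t * p))
log_likelihood_cost = -math.log(sum(t * p for t, p in zip(targets, output)))

# 'cross-entropy': -sum(t * log(p)); skip t == 0 terms to avoid log(0)
cross_entropy_cost = -sum(t * math.log(p) for t, p in zip(targets, output) if t > 0)

print(round(log_likelihood_cost, 6))  # -log(0.7)
print(round(cross_entropy_cost, 6))   # same value for one-hot targets
```

For soft (non-one-hot) targets the two formulas diverge, which is why the class exposes them as separate options.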


Note: The layer.Layer.build examples in this article were compiled from open-source code and documentation platforms such as GitHub and MSDocs. The code snippets are drawn from community-contributed open-source projects; copyright remains with the original authors, and you should consult each project's License before redistribution or use. Do not republish without permission.