This article collects typical usage examples of the `biases_init` attribute of Python's blocks.bricks.recurrent.SimpleRecurrent. If you are wondering how to use SimpleRecurrent.biases_init in practice, the curated code examples below may help. You can also read further about the class blocks.bricks.recurrent.SimpleRecurrent itself.
Two code examples of SimpleRecurrent.biases_init are shown below, sorted by popularity by default.
Example 1: CategoricalCrossEntropy
# Module to import: from blocks.bricks.recurrent import SimpleRecurrent [as alias]
# Or: from blocks.bricks.recurrent.SimpleRecurrent import biases_init [as alias]
softmax_out = softmax_out.reshape(shape)
softmax_out.name = 'softmax_out'
# comparing only last time-step
cost = CategoricalCrossEntropy().apply(y[-1, :, 0], softmax_out[-1])
cost.name = 'CrossEntropy'
error_rate = MisclassificationRate().apply(y[-1, :, 0], softmax_out[-1])
error_rate.name = 'error_rate'
# Initialization
for brick in (x_to_h1, h1_to_o):
brick.weights_init = IsotropicGaussian(0.01)
brick.biases_init = Constant(0)
brick.initialize()
rnn.weights_init = Identity()
rnn.biases_init = Constant(0)
rnn.initialize()
print('Building training process...')
algorithm = GradientDescent(
cost=cost,
parameters=ComputationGraph(cost).parameters,
step_rule=learning_algorithm(learning_rate=1e-6, momentum=0.0,
clipping_threshold=1.0, algorithm='adam'))
cg = ComputationGraph(cost)
params_to_sync = {}
#cg.variables
counter = 0
print("---- cg.parameters ----")
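For context, the weight and bias initialization pattern used in Example 1 can be sketched in plain NumPy. This is a conceptual illustration only, not the Blocks API: in Blocks, `Constant` and `IsotropicGaussian` are `NdarrayInitialization` subclasses that fill each brick's parameters when `initialize()` is called.

```python
import numpy as np

def constant_init(shape, value=0.0):
    """Rough analogue of blocks.initialization.Constant:
    fill every parameter with a single value."""
    return np.full(shape, value)

def isotropic_gaussian_init(shape, std=0.01, rng=None):
    """Rough analogue of blocks.initialization.IsotropicGaussian:
    i.i.d. zero-mean normal samples with the given std."""
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, std, size=shape)

# biases_init = Constant(0) -> an all-zero bias vector
biases = constant_init((100,))
# weights_init = IsotropicGaussian(0.01) -> small random weights
weights = isotropic_gaussian_init((100, 100))
print(biases.sum(), weights.shape)
```

Note that Example 1 initializes the recurrent transition with `Identity()` rather than a Gaussian, a common choice for vanilla RNNs to ease gradient flow early in training.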
Example 2: GatedRecurrent
# Module to import: from blocks.bricks.recurrent import SimpleRecurrent [as alias]
# Or: from blocks.bricks.recurrent.SimpleRecurrent import biases_init [as alias]
#lstm = GatedRecurrent(dim=h_dim,
# activation=Tanh())
decode = Linear(name='decode',
input_dim=h_dim,
output_dim=1)
for brick in (encode, gates, decode):
brick.weights_init = IsotropicGaussian(0.01)
brick.biases_init = Constant(0.)
brick.initialize()
lstm.weights_init = IsotropicGaussian(0.01)
#lstm.weights_init = Orthogonal()
lstm.biases_init = Constant(0.)
lstm.initialize()
#ComputationGraph(encode.apply(x)).get_theano_function()(features_test)[0].shape
#ComputationGraph(lstm.apply(encoded)).get_theano_function()(features_test)
#ComputationGraph(decode.apply(hiddens[-1])).get_theano_function()(features_test)[0].shape
#ComputationGraph(SquaredError().apply(y, y_hat.flatten())).get_theano_function()(features_test, targets_test)[0].shape
encoded = encode.apply(x)
#hiddens = lstm.apply(encoded, gates.apply(x))
hiddens = lstm.apply(encoded)
y_hat = decode.apply(hiddens[-1])
cost = SquaredError().apply(y, y_hat)
cost.name = 'cost'
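The recurrence applied in Example 2 can be sketched conceptually in NumPy. This is an illustration of what a simple recurrent transition computes, not the Blocks/Theano implementation: the input term corresponds to the output of the `encode` Linear brick, and `W` plays the role of the recurrent weight matrix set by `weights_init`.

```python
import numpy as np

def simple_recurrent(inputs, W, h0=None):
    """Conceptual sketch: h_t = tanh(x_t + h_{t-1} @ W),
    with inputs already linearly transformed (as by `encode`)."""
    seq_len, batch, dim = inputs.shape
    h = np.zeros((batch, dim)) if h0 is None else h0
    hiddens = []
    for t in range(seq_len):
        h = np.tanh(inputs[t] + h @ W)
        hiddens.append(h)
    return np.stack(hiddens)  # shape: (seq_len, batch, dim)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 2, 4))           # (time, batch, h_dim) -- toy shapes
W = rng.normal(0.0, 0.01, size=(4, 4))   # weights_init = IsotropicGaussian(0.01)
hiddens = simple_recurrent(x, W)
print(hiddens.shape)
```

As in the example above, only the last time step `hiddens[-1]` is fed to the `decode` brick to produce `y_hat`.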