This page collects typical usage examples of the rbm.RBM.params['W'] method in Python. If you are unsure how RBM.params['W'] is used, or are looking for concrete examples of it, the curated code samples below may help. You can also explore further usage examples of the containing class, rbm.RBM.
This page presents 1 code example of the RBM.params['W'] method, sorted by popularity by default. You can upvote examples you like or find useful; your votes help the system recommend better Python code samples.
Example 1: pretrain_rbm_layers
# Required import: from rbm import RBM [as alias]
# Or: from rbm.RBM import params['W'] [as alias]
import numpy as np

def pretrain_rbm_layers(v, validation_v=None, n_hidden=[], gibbs_steps=[],
                        batch_size=[], num_epochs=[], learning_rate=[],
                        probe_epochs=[]):
    """
    Fake pre-training: just randomly initialise the weights of the RBM layers.
    :param v: training data, shape (n_samples, n_visible)
    :param validation_v: validation data (unused here)
    :param n_hidden: list of hidden-unit counts, one per RBM layer
    :param gibbs_steps: list of Gibbs sampling step counts, one per layer
    :param batch_size: list of batch sizes, one per layer
    :param num_epochs: list of epoch counts, one per layer
    :param learning_rate: list of learning rates, one per layer
    :param probe_epochs: list of probe epochs, one per layer
    :return: list of RBM layers with randomly initialised parameters
    """
    rbm_layers = []
    n_rbm = len(n_hidden)
    # create RBM layers
    for i in range(n_rbm):
        rbm = RBM(n_hidden=n_hidden[i],
                  gibbs_steps=gibbs_steps[i],
                  batch_size=batch_size[i],
                  num_epochs=num_epochs[i],
                  learning_rate=learning_rate[i],
                  probe_epochs=probe_epochs[i])
        rbm_layers.append(rbm)
    # "pretrain" RBM layers by random initialisation
    n_v = v.shape[1]
    for i, rbm in enumerate(rbm_layers):
        print('### pretraining RBM Layer {i}'.format(i=i))
        n_h = n_hidden[i]
        # Glorot-style uniform initialisation of the weight matrix
        initial_W = np.float32(np.random.uniform(
            low=-4 * np.sqrt(6.0 / (n_h + n_v)),
            high=4 * np.sqrt(6.0 / (n_h + n_v)),
            size=(n_v, n_h)
        ))
        rbm.params['W'] = initial_W
        rbm.params['c'] = np.zeros((n_h,), np.float32)
        # the hidden layer of this RBM is the visible layer of the next
        n_v = n_h
    return rbm_layers
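The weight initialisation in this example draws from a uniform range of ±4·sqrt(6/(n_h + n_v)), the scaled Glorot/Xavier range often recommended for sigmoid units. Below is a minimal standalone sketch of that initialisation, independent of the RBM class; the helper name `glorot_uniform` and the layer sizes are hypothetical, chosen only for illustration:

```python
import numpy as np

def glorot_uniform(n_v, n_h, scale=4.0, seed=0):
    """Sample a (n_v, n_h) float32 weight matrix from the scaled
    Glorot uniform range [-bound, bound]."""
    rng = np.random.RandomState(seed)
    bound = scale * np.sqrt(6.0 / (n_h + n_v))
    return np.float32(rng.uniform(low=-bound, high=bound, size=(n_v, n_h)))

# Example layer sizes (hypothetical): 784 visible units, 500 hidden units
W = glorot_uniform(784, 500)
bound = 4.0 * np.sqrt(6.0 / (784 + 500))
print(W.shape)   # (784, 500)
# every entry lies inside the sampling bound
print(bool(np.all(np.abs(W) <= bound)))   # True
```

Because each RBM's hidden layer becomes the next layer's visible layer, the bound shrinks as layer widths change, keeping initial activations in a reasonable range for sigmoid nonlinearities.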