This article collects typical usage examples of the Python method mlp.MLP.preEpochHook. If you are wondering how MLP.preEpochHook is used in practice, the selected code example here may help. You can also look further into usage examples of the containing class, mlp.MLP.
One code example of the MLP.preEpochHook method is shown below.
Example 1: map
# Required import: from mlp import MLP [as alias]
# Or: from mlp.MLP import preEpochHook [as alias]
elif cfg.pretrain:
    rbmstack.run(0, cfg.weight_updates, mbp)

if cfg.finetune:
    # clean up RBM parts which are not needed anymore
    # (Python 2 style: map() is used here purely for its side effects)
    map(lambda x: x.deallocPChain(), rbmstack.layers)
    map(lambda x: x.dealloc(), rbmstack.layers)
    weights = map(lambda x: x.mat, rbmstack.weights)
    biases = map(lambda x: x.bias_hi, rbmstack.weights)
    from mlp import MLP
    pymlp = MLP(cfg, weights, biases)
    # run a test-set pass every 10th epoch, before that epoch's training starts
    pymlp.preEpochHook = lambda mlp, epoch: epoch % 10 == 0 and mlp.runMLP(mbp_test, cfg.test_batchsize, epoch)
    try:
        pymlp.train(mbp, cfg.finetune_epochs, cfg.finetune_batch_size, cfg.finetune_rprop)
    except KeyboardInterrupt:
        pass
    # re-allocate the RBM buffers so the fine-tuned stack can be saved
    map(lambda x: x.alloc(), rbmstack.layers)
    map(lambda x: x.allocPChain(), rbmstack.layers)
    rbmstack.saveAllLayers("-finetune")
    pymlp.saveLastLayer()

if cfg.headless:
    cp.exitCUDA()
    sys.exit(0)

PLT_NUM = 1
import matplotlib.pyplot as plt
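In the example above, preEpochHook is assigned a callable taking (mlp, epoch) and is used to run a test-set pass every 10th epoch. Since the mlp.MLP implementation itself is not shown here, the following is only a minimal sketch of how such a hook is typically invoked by a training loop; the TinyMLP class, its runMLP placeholder, and all parameter values are illustrative assumptions, not the actual mlp.MLP API.

# Minimal sketch (assumption): how a preEpochHook-style callback is usually wired
# into a training loop. The real mlp.MLP class may differ.
class TinyMLP(object):
    def __init__(self):
        # Default hook does nothing; callers can overwrite it, as in the example above.
        self.preEpochHook = lambda mlp, epoch: None

    def runMLP(self, dataset, batchsize, epoch):
        # Placeholder for a validation/test pass (hypothetical stand-in).
        print("evaluating at epoch %d" % epoch)

    def train(self, dataset, epochs, batchsize):
        for epoch in range(epochs):
            # The hook is called once before each epoch's weight updates.
            self.preEpochHook(self, epoch)
            # ... forward/backward passes and weight updates would go here ...

net = TinyMLP()
# Evaluate on a held-out set every 10 epochs, mirroring the example above.
net.preEpochHook = lambda mlp, epoch: epoch % 10 == 0 and mlp.runMLP(None, 128, epoch)
net.train(None, epochs=30, batchsize=128)

The point of this pattern is that the hook receives the network and the epoch index, so periodic bookkeeping (evaluation, checkpointing, logging) can be attached without modifying the training loop itself.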