This article collects typical usage examples of the Python method pybrain.datasets.SupervisedDataSet.addSample. If you are wondering how SupervisedDataSet.addSample is used in practice, the hand-picked code examples here may help. You can also read more about the containing class, pybrain.datasets.SupervisedDataSet.
Below, 1 code example of the SupervisedDataSet.addSample method is shown, sorted by popularity by default.
Example 1: linspace
# Required import: from pybrain.datasets import SupervisedDataSet
# Method used: SupervisedDataSet.addSample
# Imports needed to run this example
from numpy import linspace, zeros
import pylab as pl
from pybrain.datasets import SupervisedDataSet
from pybrain.tools.shortcuts import buildNetwork
from pybrain.supervised.trainers import BackpropTrainer

# We will vary the training set so that we have 10 different sizes
sizes = linspace(10, len(X_train), 10)
train_err = zeros(len(sizes))
test_err = zeros(len(sizes))
# Build a network with 2 hidden layers
net = buildNetwork(13, 7, 3, 1)
# The dataset will have 13 input features and 1 output
ds = SupervisedDataSet(13, 1)
# Put the test set into its own dataset so we can measure test error
# (assumes X_test and y_test are defined analogously to X_train and y_train)
test_ds = SupervisedDataSet(13, 1)
for j in range(len(X_test)):
    test_ds.addSample(X_test[j], y_test[j])
for i, s in enumerate(sizes):
    # Populate the dataset for training
    ds.clear()
    for j in range(int(s)):
        ds.addSample(X_train[j], y_train[j])
    # Set up a backprop trainer
    trainer = BackpropTrainer(net, ds)
    # Train the NN for 50 epochs
    # The .train() function returns MSE over the training set
    for e in range(50):
        train_err[i] = trainer.train()
    # Find the error on the held-out test set
    test_err[i] = trainer.testOnData(test_ds)
# Plot training and test error against training set size
pl.figure()
pl.plot(sizes, train_err, lw=2, label='training error')
pl.plot(sizes, test_err, lw=2, label='test error')
pl.legend()
pl.xlabel('Training Size')
pl.ylabel('MSE')
pl.show()
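The learning-curve loop above does not depend on PyBrain specifically. As a minimal sketch of the same pattern, here is a numpy-only version that substitutes a hypothetical closed-form least-squares model for the neural network, with made-up data standing in for X_train/y_train:

import numpy as np

# Hypothetical stand-in data: a noisy linear problem with 13 features,
# split into train and test like the example above
rng = np.random.RandomState(0)
X = rng.rand(200, 13)
w_true = rng.rand(13)
y = X.dot(w_true) + 0.01 * rng.randn(200)
X_train, y_train = X[:150], y[:150]
X_test, y_test = X[150:], y[150:]

# Ten training-set sizes from 10 up to the full training set
sizes = np.linspace(10, len(X_train), 10).astype(int)
train_err = np.zeros(len(sizes))
test_err = np.zeros(len(sizes))

for i, s in enumerate(sizes):
    # Fit a least-squares linear model on the first s training samples
    w = np.linalg.lstsq(X_train[:s], y_train[:s], rcond=None)[0]
    # MSE on the samples used for fitting, and on the held-out test set
    train_err[i] = np.mean((X_train[:s].dot(w) - y_train[:s]) ** 2)
    test_err[i] = np.mean((X_test.dot(w) - y_test) ** 2)

The resulting train_err and test_err arrays can be plotted against sizes exactly as in the PyBrain example; test error typically falls as the training set grows while training error rises toward it.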