

Python Network.backpropagation Method Code Examples

This article collects typical usage examples of the Python method network.Network.backpropagation. If you are unsure what Network.backpropagation does, how to call it, or want to see it used in context, the curated examples below should help. You can also browse further usage examples of network.Network, the class this method belongs to.


The following shows 3 code examples of Network.backpropagation, sorted by popularity by default. You can upvote the examples you find useful; your feedback helps the system recommend better Python code examples.
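All three examples follow the same training pattern: compute the total cost, get per-layer gradients from Network.backpropagation, and apply a gradient-descent step to the weight matrices in nn['theta']. The minimal sketch below distills that pattern. The network module is a third-party project from the repositories cited under each example, so the call signatures are taken from the examples themselves; the layer sizes, data, and hyperparameters here are illustrative assumptions.

from network import Network

nt = Network()
nn = nt.create([4, 10, 1])   # 4 inputs, one hidden layer, 1 output (illustrative sizes)

xTrain = [[1.0, 2.3, 4.5, 5.3], [9.5, 8.1, 5.5, 3.6]]   # hypothetical samples
yTrain = [[1], [0]]                                     # one-element label rows
lamb, alf = 0.3, 0.2   # regularization parameter and learning rate (illustrative)

for step in range(100):   # bounded loop; the examples below use `while cost > 0` instead
    cost = nt.costTotal(False, nn, xTrain, yTrain, lamb)         # regularized total cost
    delta = nt.backpropagation(False, nn, xTrain, yTrain, lamb)  # one gradient array per layer
    nn['theta'] = [t - alf * d for t, d in zip(nn['theta'], delta)]
    print('cost', cost[0, 0], 'step', step)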

Example 1: range

# Required import: from network import Network [as alias]
# Or: from network.Network import backpropagation [as alias]
# The original excerpt starts mid-script; the setup below mirrors the
# companion examples and is an assumption, not part of the excerpt.
from network import Network

nt = Network()
nn = nt.create([4, 1000, 1])   # 4 inputs, one hidden layer, 1 output (mirrors Example 2)

lamb = 0.3   # regularization parameter
cost = 1     # initial value so the loop body runs at least once
alf = 0.2    # learning rate (mirrors Example 2)

xTrain = [
    # ... (earlier rows omitted in the original excerpt; yTrain below has
    # 22 labels, so 14 rows are missing here)
    [1.1, 3.5, 4.5, 7.6],
    [2.1, 3.5, 5.5, 8.6],
    [3.1, 5.5, 7.5, 9.6],
    [0.1, 1.5, 2.5, 6.6],

    [9.5, 8.1, 5.5, 3.6],
    [5.5, 4.1, 3.5, 1.6],
    [8.5, 7.1, 1.5, 1.2],
    [6.5, 3.1, 2.1, 1.9],
]
yTrain = [
    [1], [1], [0], [1], [1], [0], [1],
    [1], [0], [0], [0], [1], [0], [0],
    [1], [1], [1], [1],
    [0], [0], [0], [0],
]

xTest = [
    [0.4, 1.9, 2.5, 3.1], [1.51, 2.0, 2.4, 3.8], [2.6, 5.1, 6.2, 7.2],
    [3.23, 4.1, 4.3, 4.9], [7.1, 7.6, 8.2, 9.3],
    [5.78, 5.1, 4.5, 3.55], [6.33, 4.8, 3.4, 2.5], [7.67, 6.45, 5.8, 4.31],
    [8.22, 6.32, 5.87, 3.59], [9.1, 8.5, 7.7, 6.1],
]
yTest = [
    [1], [1], [1], [1], [1],
    [0], [0], [0], [0], [0],
]
i = 0
while cost > 0:   # cost stays positive, so this trains until interrupted
    cost = nt.costTotal(False, nn, xTrain, yTrain, lamb)
    costTest = nt.costTotal(False, nn, xTest, yTest, lamb)
    delta = nt.backpropagation(False, nn, xTrain, yTrain, lamb)
    # Gradient-descent step on every layer's weight matrix (zip avoids
    # shadowing the iteration counter i, as the original comprehension did):
    nn['theta'] = [t - alf * d for t, d in zip(nn['theta'], delta)]
    i += 1
    print('Train cost ', cost[0, 0], 'Test cost ', costTest[0, 0], 'Iteration ', i)
    print(nt.runAll(nn, xTest))
Developer: Timopheym, Project: kaggle, Lines: 32, Source: train_sec.py
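Example 1 prints the raw output of runAll on the test set each iteration. If you want a quick accuracy figure instead, a hedged one-liner follows; it assumes runAll returns one probability-like value per test sample, which the excerpt does not confirm.

preds = nt.runAll(nn, xTest)
# Threshold at 0.5 and compare to the one-element label rows in yTest:
accuracy = sum(int(float(p) >= 0.5) == t[0] for p, t in zip(preds, yTest)) / len(yTest)
print('test accuracy:', accuracy)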

Example 2: range

# Required import: from network import Network [as alias]
# Or: from network.Network import backpropagation [as alias]
from network import Network
nn = Network.create([4, 1000, 1])   # this variant calls create and the training methods at class level

lamb = 0.3   # regularization parameter
cost = 1     # initial value so the loop body runs at least once
alf = 0.2    # learning rate

xTrain = [
    [1, 2.3, 4.5, 5.3], [1.1, 1.3, 2.4, 2.4], [1.9, 1.7, 1.5, 1.3],
    [2.3, 2.9, 3.3, 4.9], [3, 5.2, 6.1, 8.2], [3.31, 2.9, 2.4, 1.5],
    [4.9, 5.7, 6.1, 6.3], [4.85, 5.0, 7.2, 8.1], [5.9, 5.3, 4.2, 3.3],
    [7.7, 5.4, 4.3, 3.9], [6.7, 5.3, 3.2, 1.4], [7.1, 8.6, 9.1, 9.9],
    [8.5, 7.4, 6.3, 4.1], [9.8, 5.3, 3.1, 2.9],
]
yTrain = [
    [1], [1], [0], [1], [1], [0], [1],
    [1], [0], [0], [0], [1], [0], [0],
]

xTest = [
    [0.4, 1.9, 2.5, 3.1], [1.51, 2.0, 2.4, 3.8], [2.6, 5.1, 6.2, 7.2],
    [3.23, 4.1, 4.3, 4.9], [7.1, 7.6, 8.2, 9.3],
    [5.78, 5.1, 4.5, 3.55], [6.33, 4.8, 3.4, 2.5], [7.67, 6.45, 5.8, 4.31],
    [8.22, 6.32, 5.87, 3.59], [9.1, 8.5, 7.7, 6.1],
]
yTest = [
    [1], [1], [1], [1], [1],
    [0], [0], [0], [0], [0],
]
                
while cost > 0:   # cost stays positive, so this trains until interrupted
    cost = Network.costTotal(False, nn, xTrain, yTrain, lamb)
    costTest = Network.costTotal(False, nn, xTest, yTest, lamb)
    delta = Network.backpropagation(False, nn, xTrain, yTrain, lamb)
    # Gradient-descent step on every layer's weight matrix:
    nn['theta'] = [t - alf * d for t, d in zip(nn['theta'], delta)]
    print('Train cost ', cost[0, 0], 'Test cost ', costTest[0, 0])
    print(Network.runAll(nn, xTest))
Developer: alexandrusenko, Project: DataAnalysis, Lines: 25, Source: test.py
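Note that cost here is a regularized total that stays strictly positive in practice, so `while cost > 0` in both examples never terminates on its own; training runs until you interrupt the process. A common alternative, sketched below with a hypothetical iteration budget and tolerance (and the instance-style calls of Example 1), is to stop after a fixed number of steps or once the cost stops improving.

max_iters = 1000    # hypothetical iteration budget
tol = 1e-6          # hypothetical minimum per-step improvement
prev = float('inf')
for it in range(max_iters):
    cost = nt.costTotal(False, nn, xTrain, yTrain, lamb)[0, 0]
    if prev - cost < tol:   # cost no longer improving: stop early
        break
    prev = cost
    delta = nt.backpropagation(False, nn, xTrain, yTrain, lamb)
    nn['theta'] = [t - alf * d for t, d in zip(nn['theta'], delta)]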

Example 3: plot_digit

# Required import: from network import Network [as alias]
# Or: from network.Network import backpropagation [as alias]
from network import Network
import numpy as np   # used by np.savetxt below

# X_train, y_train, and X_test are MNIST arrays loaded earlier in the
# source file; that part is not shown in this excerpt (see the loading
# sketch after this example).
# X_train = X_train[:100]

# http://rasbt.github.io/mlxtend/docs/data/mnist/
# def plot_digit(X, y, idx):
#     img = X[idx].reshape(28,28)
#     plt.imshow(img, cmap='Greys',  interpolation='nearest')
#     plt.title('true label: %d' % y[idx])
#     plt.show()

# plot_digit(X_train, y_train, 4)

nt = Network()
nn = nt.create([784, 100, 1])

lamb = 0.3    # regularization parameter
cost = 1      # initial value so the loop body runs at least once
alf = 0.005   # learning rate

i = 0
results = []
while cost > 0:   # cost stays positive, so this trains until interrupted
    cost = nt.costTotal(False, nn, X_train, y_train, lamb)
    delta = nt.backpropagation(False, nn, X_train, y_train, lamb)
    # Gradient-descent step on every layer's weight matrix (zip avoids
    # shadowing the iteration counter i, as the original comprehension did):
    nn["theta"] = [t - alf * d for t, d in zip(nn["theta"], delta)]
    i += 1
    print("Train cost ", cost[0, 0], "Iteration ", i)
    results = nt.runAll(nn, X_test)
    print(results)

np.savetxt("results.csv", results, delimiter=",")   # unreachable unless the loop above is given an exit condition
Developer: Timopheym, Project: kaggle, Lines: 32, Source: mnist.py
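This excerpt assumes X_train, y_train, and X_test already exist; they are loaded earlier in the source file, which is not shown. A hedged loading sketch using mlxtend (the library linked in the comments above) follows; mnist_data() ships a 5,000-sample MNIST subset, and the split point and label reshaping here are illustrative assumptions.

import numpy as np
from mlxtend.data import mnist_data   # bundled 5,000-sample MNIST subset

X, y = mnist_data()                   # X: (5000, 784) pixel rows, y: (5000,) digit labels
split = 4000                          # illustrative train/test split
X_train, X_test = X[:split], X[split:]
y_train = [[int(label)] for label in y[:split]]   # one-element label rows, matching the examples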


Note: The network.Network.backpropagation examples in this article were compiled by 纯净天空 from open-source code hosted on GitHub, MSDocs, and similar platforms. The snippets were selected from community-contributed open-source projects; copyright remains with the original authors, and any use or redistribution is subject to each project's license. Please do not republish without permission.