This article collects typical usage examples of the `adam.Adam` method in Python. If you are asking how to use `adam.Adam`, or what it looks like in practice, the curated code examples below may help. You can also explore the other methods defined alongside it in the `adam` module.
Two code examples of the `adam.Adam` method are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Python code examples.
Example 1: experiment
# Required module: import adam [as alias]
# Or: from adam import Adam [as alias]
def experiment(alg='ASNG', eta_x=0.1, eta_theta_factor=0., alpha=1.5, K=5, D=30, maxite=100000, log_file='log.csv'):
    nc = (K - 1) * D
    f = fxc1(K, D, noise=True)
    # Note: np.int is deprecated (removed in NumPy >= 1.24); use the builtin int instead
    categories = K * np.ones(D, dtype=int)
    if alg == 'ASNG':
        opt_theta = AdaptiveSNG(categories, alpha=alpha, delta_init=nc**-eta_theta_factor)
    elif alg == 'SNG':
        opt_theta = SNG(categories, delta_init=nc**-eta_theta_factor)
    elif alg == 'Adam':
        opt_theta = Adam(categories, alpha=nc**-eta_theta_factor, beta1=0.9, beta2=0.999)
    else:
        print('invalid algorithm!')
        return
    optimizer_x = torch.optim.SGD(f.parameters(), lr=eta_x, momentum=0.9, weight_decay=0., nesterov=False)
    lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer_x, maxite)
    print('{}, eta_x={}, eta_theta_factor={} alpha={}'.format(alg, eta_x, eta_theta_factor, alpha))
    run(f, opt_theta, optimizer_x, lr_scheduler=lr_scheduler, maxite=maxite, dispspan=100, log_file=log_file)
Example 2: Adam
# Required module: import adam [as alias]
# Or: from adam import Adam [as alias]
def Adam(grads, lr=0.0002, b1=0.1, b2=0.001, e=1e-8):
    return adam.Adam(grads, lr, b1, b2, e)
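The examples above wrap a project-specific `adam` module whose internals are not shown here. For readers who want to see what such an optimizer actually computes, the following is a minimal, self-contained sketch of the standard Adam update rule (exponential moving averages of the gradient and squared gradient, bias correction, then a scaled step). The function name `adam_step`, the toy quadratic objective, and the chosen hyperparameters are illustrative assumptions, not the API of the `adam` module used above.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, e=1e-8):
    """One Adam update for parameters theta given gradient grad.

    m, v are the running first and second moment estimates; t is the
    1-based step counter used for bias correction.
    """
    m = b1 * m + (1 - b1) * grad           # first moment (momentum) estimate
    v = b2 * v + (1 - b2) * grad ** 2      # second moment (uncentered variance)
    m_hat = m / (1 - b1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)              # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + e)
    return theta, m, v

# Toy usage: minimize f(x) = x^2, whose gradient is 2x.
x = np.array([5.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.05)
print(x)  # x should end up close to the minimizer 0
```

Note that the per-step displacement is bounded by roughly `lr`, which is why Adam is often described as taking adaptively rescaled, unit-free steps regardless of the raw gradient magnitude.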