This article collects typical usage examples of the Python method neon.optimizers.Schedule: what the method does, how to call it, and where it is used. The curated example below may help if you are looking for concrete usage of optimizers.Schedule; you can also explore the other members of the containing module, neon.optimizers.
One code example of optimizers.Schedule is shown below.
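Before the example, a quick note on the semantics assumed here: when Schedule is constructed with a step_config list and a change list of the same length, the learning rate is set to change[i] once epoch step_config[i] is reached. The snippet below is a minimal standalone sketch of that list-form behavior; the effective_lr helper is hypothetical, written only for illustration and not part of neon's API.

# Minimal sketch of the list-form Schedule semantics assumed in the example
# below: once epoch step_config[i] is reached, the learning rate becomes
# change[i]. effective_lr is a hypothetical helper, not neon code.
def effective_lr(base_lr, epoch, step_config, change):
    lr = base_lr
    for step, new_lr in zip(step_config, change):
        if epoch >= step:
            lr = new_lr
    return lr

s = 1e-4
for epoch in (0, 14, 15, 19, 20, 25):
    print(epoch, effective_lr(s, epoch, step_config=[15, 20],
                              change=[0.1 * s, 0.01 * s]))
# epochs 0-14 -> 1e-4, epochs 15-19 -> 1e-5, epochs 20+ -> 1e-6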
Example 1: get_args_and_hyperparameters
# Required imports (the example uses NeonArgparser and Schedule):
from neon.util.argparser import NeonArgparser
from neon.optimizers import Schedule
def get_args_and_hyperparameters():
    parser = NeonArgparser(__doc__)
    args = parser.parse_args(gen_be=False)

    # Override the save path if none was given on the command line
    if args.save_path is None:
        args.save_path = 'frcn_alexnet.pickle'

    if args.callback_args['save_path'] is None:
        args.callback_args['save_path'] = args.save_path

    if args.callback_args['serialize'] is None:
        args.callback_args['serialize'] = min(args.epochs, 10)

    # hyperparameters
    args.batch_size = 64

    hyper_params = lambda: None  # cheap mutable namespace for attribute assignment
    hyper_params.use_pre_trained_weights = True  # if True, load pre-trained weights into the model
    hyper_params.max_train_imgs = 5000  # make this smaller in small trial runs to save time
    hyper_params.max_test_imgs = 5000  # make this smaller in small trial runs to save time
    hyper_params.num_epochs = args.epochs
    hyper_params.samples_per_batch = args.batch_size  # the mini-batch size
    # The number of multi-scale samples to make for each input image. These
    # samples are then fed into the network in multiple minibatches.
    hyper_params.samples_per_img = hyper_params.samples_per_batch * 7
    hyper_params.frcn_fine_tune = False
    hyper_params.shuffle = True
    if hyper_params.use_pre_trained_weights:
        # This will typically train in 10-15 epochs. Use a small learning rate
        # and reduce it quickly every 5-10 epochs. Use a high momentum since we
        # are close to the minima.
        s = 1e-4
        hyper_params.learning_rate_scale = s
        hyper_params.learning_rate_sched = Schedule(step_config=[15, 20],
                                                    change=[0.1 * s, 0.01 * s])
        hyper_params.momentum = 0.9
    else:
        # Be less aggressive about reducing the learning rate if the model is
        # not pre-trained.
        s = 1e-2
        hyper_params.learning_rate_scale = s
        hyper_params.learning_rate_sched = Schedule(step_config=[8, 14, 18, 20],
                                                    change=[0.5 * s, 0.1 * s, 0.05 * s, 0.01 * s])
        hyper_params.momentum = 0.1
    hyper_params.class_score_threshold = 0.000001
    hyper_params.score_exponent = 5
    return args, hyper_params
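For context on how these values are consumed downstream (not shown in this excerpt), here is a hedged sketch: in neon a Schedule is typically passed to an optimizer such as GradientDescentMomentum via its schedule argument. The variable names mirror the example above, but the wiring itself is an assumption about the surrounding script.

# Hedged sketch of how the returned hyperparameters might be wired into an
# optimizer; this wiring is assumed, not shown in the excerpt above.
from neon.optimizers import GradientDescentMomentum

args, hyper_params = get_args_and_hyperparameters()
opt = GradientDescentMomentum(hyper_params.learning_rate_scale,
                              hyper_params.momentum,
                              schedule=hyper_params.learning_rate_sched)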