

Python optimizers.Schedule Method Code Examples

This article collects typical usage examples of the Python method neon.optimizers.Schedule. If you are wondering what optimizers.Schedule does, how to use it, or would like to see it in context, the curated code examples below should help. You can also explore further usage examples from the neon.optimizers module.


The section below presents 1 code example of the optimizers.Schedule method.
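Before the full example, here is a minimal sketch of how Schedule is typically paired with an optimizer in neon. The epoch steps and learning-rate values below are illustrative assumptions, not taken from the example that follows.

from neon.optimizers import GradientDescentMomentum, Schedule

# Illustrative values (assumed): drop the learning rate to 1e-3 after
# epoch 10 and to 1e-4 after epoch 20.
lr_sched = Schedule(step_config=[10, 20], change=[1e-3, 1e-4])
opt = GradientDescentMomentum(learning_rate=1e-2, momentum_coef=0.9,
                              schedule=lr_sched)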

Example 1: get_args_and_hyperparameters

# Required imports (NeonArgparser lives in neon.util.argparser in the neon library):
from neon.util.argparser import NeonArgparser
from neon.optimizers import Schedule
def get_args_and_hyperparameters():
    parser = NeonArgparser(__doc__)
    args = parser.parse_args(gen_be=False)
    
    # Override save path if None
    if args.save_path is None:
        args.save_path = 'frcn_alexnet.pickle'
    
    if args.callback_args['save_path'] is None:
        args.callback_args['save_path'] = args.save_path
    
    if args.callback_args['serialize'] is None:
        args.callback_args['serialize'] = min(args.epochs, 10)
    
    
    # hyperparameters
    args.batch_size = 64
    hyper_params = lambda: None  # bare lambda used as a simple attribute container
    hyper_params.use_pre_trained_weights = True # If true, load pre-trained weights to the model
    hyper_params.max_train_imgs = 5000 # Make this smaller in small trial runs to save time
    hyper_params.max_test_imgs = 5000 # Make this smaller in small trial runs to save time
    hyper_params.num_epochs = args.epochs
    hyper_params.samples_per_batch = args.batch_size # The mini-batch size
    # The number of multi-scale samples to make for each input image. These
    # samples are then fed into the network in multiple minibatches.
    hyper_params.samples_per_img = hyper_params.samples_per_batch*7 
    hyper_params.frcn_fine_tune = False
    hyper_params.shuffle = True
    if hyper_params.use_pre_trained_weights:
        # This will typically train in 10-15 epochs. Use a small learning rate
        # and quickly reduce every 5-10 epochs. Use a high momentum since we
        # are close to the minima.
        s = 1e-4
        hyper_params.learning_rate_scale = s
        hyper_params.learning_rate_sched = Schedule(step_config=[15, 20], 
                                        change=[0.1*s, 0.01*s])
        hyper_params.momentum = 0.9
    else: # need to be less aggressive with reducing learning rate if the model is not pre-trained
        s = 1e-2
        hyper_params.learning_rate_scale = s
        hyper_params.learning_rate_sched = Schedule(step_config=[8, 14, 18, 20], 
                                        change=[0.5*s, 0.1*s, 0.05*s, 0.01*s])
        hyper_params.momentum = 0.1
    hyper_params.class_score_threshold = 0.000001
    hyper_params.score_exponent = 5
    return args, hyper_params 
Developer: NervanaSystems | Project: ModelZoo | Lines of code: 49 | Source file: transfer_learning.py
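In the full script, the returned hyper_params would then be used to build the training optimizer. A hedged sketch of that step (the actual call in transfer_learning.py may pass extra arguments such as weight decay):

from neon.optimizers import GradientDescentMomentum

args, hyper_params = get_args_and_hyperparameters()

# Feed the learning-rate schedule and momentum from hyper_params into momentum SGD.
opt = GradientDescentMomentum(hyper_params.learning_rate_scale,
                              hyper_params.momentum,
                              schedule=hyper_params.learning_rate_sched)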


Note: The neon.optimizers.Schedule examples above were compiled from open-source code and documentation hosted on platforms such as GitHub and MSDocs. The snippets were selected from community-contributed open-source projects; copyright remains with the original authors, and any redistribution or use should follow the corresponding project's License. Please do not republish without permission.