

Python Model.batch_meta Method Code Examples

This article collects typical usage examples of the Python method neon.models.Model.batch_meta. If you are wondering how Model.batch_meta is used in practice, or what it looks like in real code, the selected examples below may help. You can also explore further usage examples of the containing class, neon.models.Model.


One code example of the Model.batch_meta method is shown below; examples are sorted by popularity by default.

Example 1: RandomEMDataIterator

# Required import: from neon.models import Model [as alias]
# Or: from neon.models.Model import batch_meta [as alias]
# Also requires: import numpy as np (used for the batch_meta counters below)
                                  dim_ordering=args.dim_ordering, batch_range=args.test_range, name='test', 
                                  isTest=True, concatenate_batches=True, NBUF=args.nbebuf,
                                  image_in_size=args.image_in_size) if args.callback_args['eval_freq'] else None
        else:
            # make dummy random data just for testing model inits
            train = RandomEMDataIterator(name='train')
            test = RandomEMDataIterator(name='test')
    
        if not args.model_file:
            # create the model based on the architecture specified via command line
            arch = EMModelArchitecture.init_model_arch(args.model_arch, train.parser.nclass, 
                                                       not train.parser.independent_labels)
            model = Model(layers=arch.layers)
            
            # allocate saving counts for training labels here so they can be saved with convnet checkpoints
            model.batch_meta = {'prior_train_count':np.zeros((train.parser.nclass,),dtype=np.int64), 
                                'prior_total_count':np.zeros((1,),dtype=np.int64)}

        if hasattr(model,'batch_meta'):
            train.parser.batch_meta['prior_train_count'] = model.batch_meta['prior_train_count']
            train.parser.batch_meta['prior_total_count'] = model.batch_meta['prior_total_count']
    
        assert( train.nmacrobatches > 0 )    # no training batches specified and not in write_output mode
        macro_epoch = model.epoch_index//train.nmacrobatches+1
        macro_batch = model.epoch_index%train.nmacrobatches+1
        if args.data_config and macro_batch > train.batch_range[0]:
            print('Model loaded at model epoch %d, setting to training batch %d' % (model.epoch_index,macro_batch,))
            train.reset_batchnum(macro_batch)
        
        # print out epoch and batch as they were in cuda-convnets2, starting at 1
        print('Training from epoch %d to %d with %d/%d training/testing batches per epoch, %d examples/batch' \
            % (macro_epoch, args.epochs, train.nmacrobatches, test.nmacrobatches if test else 0, 
Developer: elhuhdron, Project: emdrp, Lines: 34, Source: emneon.py
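
For context, here is a minimal, self-contained sketch of the same batch_meta pattern. It assumes neon and numpy are installed; the toy Affine layer and the class count nclass are hypothetical stand-ins used only so that Model() can be constructed, and are not part of the original emneon.py.

import numpy as np
from neon.backends import gen_backend
from neon.initializers import Gaussian
from neon.layers import Affine
from neon.models import Model
from neon.transforms import Softmax

be = gen_backend(backend='cpu', batch_size=32)  # CPU backend, just for the sketch
nclass = 3                                      # hypothetical number of label classes

# a toy one-layer network so that Model() has layers to wrap
model = Model(layers=[Affine(nout=nclass, init=Gaussian(scale=0.01),
                             activation=Softmax())])

# batch_meta is an ordinary attribute holding numpy counters; per the comment in
# the example above, attaching it to the model lets the label counts be saved and
# restored together with convnet checkpoints
model.batch_meta = {'prior_train_count': np.zeros((nclass,), dtype=np.int64),
                    'prior_total_count': np.zeros((1,), dtype=np.int64)}

# a data parser can then point at the same arrays (as train.parser.batch_meta does
# in the example), so any counts it accumulates are checkpointed with the model
parser_batch_meta = {'prior_train_count': model.batch_meta['prior_train_count'],
                     'prior_total_count': model.batch_meta['prior_total_count']}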


Note: The neon.models.Model.batch_meta examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The code snippets are selected from open-source projects contributed by their authors; copyright remains with the original authors, and distribution or reuse should follow the corresponding project's license. Do not reproduce without permission.