

Python optimizers.AdaDelta Method Code Examples

This article collects typical usage examples of the Python method chainer.optimizers.AdaDelta. If you are wondering what optimizers.AdaDelta does, or how to use it in practice, the curated code examples below may help. You can also explore further usage examples from the chainer.optimizers module, where this method lives.


Below are 4 code examples of the optimizers.AdaDelta method, sorted by popularity by default.
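Before the collected examples, here is a minimal sketch of the typical AdaDelta workflow in Chainer; the toy model, dummy data, and shapes are illustrative and not taken from the examples below:

import numpy as np
import chainer.functions as F
from chainer import links, optimizers

model = links.Linear(3, 2)            # toy two-class model
optimizer = optimizers.AdaDelta()     # rho and eps left at their defaults
optimizer.setup(model)                # attach optimizer state to the model's parameters

x = np.random.rand(5, 3).astype(np.float32)   # dummy batch
t = np.zeros(5, dtype=np.int32)               # dummy labels
loss = F.softmax_cross_entropy(model(x), t)

model.cleargrads()
loss.backward()
optimizer.update()                    # one AdaDelta step; no learning rate required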

Example 1: setUp

# Required import: from chainer import optimizers [as alias]
# Or: from chainer.optimizers import AdaDelta [as alias]
# This snippet additionally uses: os, tempfile, chainer, chainer.link, chainer.links
def setUp(self):
        fd, path = tempfile.mkstemp()
        os.close(fd)
        self.temp_file_path = path

        child = link.Chain()
        with child.init_scope():
            child.linear = links.Linear(2, 3)
            child.Wc = chainer.Parameter(shape=(2, 3))

        self.parent = link.Chain()
        with self.parent.init_scope():
            self.parent.child = child
            self.parent.Wp = chainer.Parameter(shape=(2, 3))

        self.optimizer = optimizers.AdaDelta()
        self.optimizer.setup(self.parent)

        self.parent.cleargrads()
        self.optimizer.update()  # update on zero gradients just to initialize optimizer state
Developer: chainer, Project: chainer, Lines: 22, Source: test_hdf5.py
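The snippet above comes from Chainer's HDF5 serializer tests. For context, a minimal sketch of round-tripping AdaDelta state through HDF5 (the filename is illustrative, and h5py must be installed):

from chainer import links, optimizers, serializers

model = links.Linear(2, 3)
optimizer = optimizers.AdaDelta()
optimizer.setup(model)

serializers.save_hdf5('optimizer_state.h5', optimizer)   # persist per-parameter state
serializers.load_hdf5('optimizer_state.h5', optimizer)   # restore into a matching setup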

Example 2: create

# Required import: from chainer import optimizers [as alias]
# Or: from chainer.optimizers import AdaDelta [as alias]
def create(self):
        return optimizers.AdaDelta(eps=1e-5) 
Developer: chainer, Project: chainer, Lines: 4, Source: test_optimizers_by_linear_model.py
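The factory above overrides only eps. As a hedged note, Chainer's AdaDelta constructor also accepts rho, the decay rate of its running averages; the defaults are rho=0.95 and eps=1e-06 in recent releases, but verify against your installed version:

from chainer import optimizers

opt = optimizers.AdaDelta(rho=0.9, eps=1e-5)   # both hyperparameters overridden
print(opt.rho, opt.eps)                        # hyperparameters are exposed as attributes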

Example 3: setUp

# Required import: from chainer import optimizers [as alias]
# Or: from chainer.optimizers import AdaDelta [as alias]
# This snippet additionally uses: os, tempfile, six, numpy, chainer, chainer.link, chainer.links
def setUp(self):
        if self.file_type == 'filename':
            fd, path = tempfile.mkstemp()
            os.close(fd)
            self.file = path
        elif self.file_type == 'bytesio':
            self.file = six.BytesIO()
        else:
            assert False

        child = link.Chain()
        with child.init_scope():
            child.linear = links.Linear(2, 3)
            child.Wc = chainer.Parameter(shape=(2, 3))

        self.parent = link.Chain()
        with self.parent.init_scope():
            self.parent.child = child
            self.parent.Wp = chainer.Parameter(shape=(2, 3))

        self.optimizer = optimizers.AdaDelta()
        self.optimizer.setup(self.parent)

        self.parent.cleargrads()
        self.optimizer.update()  # update on zero gradients just to initialize all optimizer states

        self.savez = numpy.savez_compressed if self.compress else numpy.savez 
Developer: chainer, Project: chainer, Lines: 29, Source: test_npz.py
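Analogously to the HDF5 case, here is a minimal sketch of the NPZ round trip that test_npz.py exercises (the filename is illustrative; save_npz compresses by default via numpy.savez_compressed):

from chainer import links, optimizers, serializers

model = links.Linear(2, 3)
optimizer = optimizers.AdaDelta()
optimizer.setup(model)

serializers.save_npz('optimizer_state.npz', optimizer)   # compressed .npz by default
serializers.load_npz('optimizer_state.npz', optimizer)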

Example 4: set_params

# Required import: from chainer import optimizers [as alias]
# Or: from chainer.optimizers import AdaDelta [as alias]
# This snippet additionally uses: chainer (for chainer.optimizer.GradientClipping)
def set_params(self, params):

        self.gpu = params.get('gpu', False)
        self.learning_rate = params.get('learning_rate', 0.00025)
        self.decay_rate = params.get('decay_rate', 0.95)
        self.discount = params.get('discount', 0.95)
        self.clip_err = params.get('clip_err', False)
        self.target_net_update = params.get('target_net_update', 10000)
        self.double_DQN = params.get('double_DQN', False)

        # set up the requested gradient update algorithm
        opt = params.get('optim_name', 'ADAM')
        if opt == 'RMSprop':
            self.optimizer = optimizers.RMSprop(lr=self.learning_rate, alpha=self.decay_rate)

        elif opt == 'ADADELTA':
            # AdaDelta adapts its own per-parameter step sizes, so the learning rate is ignored
            print("Supplied learning rate not used with ADADELTA gradient update method")
            self.optimizer = optimizers.AdaDelta()

        elif opt == 'ADAM':
            self.optimizer = optimizers.Adam(alpha=self.learning_rate)

        elif opt == 'SGD':
            self.optimizer = optimizers.SGD(lr=self.learning_rate)

        else:
            print('The requested optimizer is not supported!!!')
            exit()

        if self.clip_err is not False:
            self.optimizer.add_hook(chainer.optimizer.GradientClipping(self.clip_err))

        self.optim_name = params['optim_name']
Developer: sisl, Project: Chimp, Lines: 35, Source: chainer_backend.py
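A hypothetical call site for set_params; the learner object and the parameter values below are illustrative, not taken from the Chimp source:

params = {
    'optim_name': 'ADADELTA',   # the supplied learning rate is ignored for this choice
    'clip_err': 1.0,            # any value other than False enables GradientClipping
}
learner.set_params(params)      # 'learner' stands in for the Chimp backend object
assert isinstance(learner.optimizer, optimizers.AdaDelta)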


Note: The chainer.optimizers.AdaDelta method examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are selected from open-source projects contributed by their respective authors, and copyright remains with the original authors; consult each project's license before distributing or using the code. Do not reproduce without permission.