

Java IUpdater Class Code Examples

This article collects typical usage examples of the Java class org.nd4j.linalg.learning.config.IUpdater. If you are wondering what the IUpdater class is for, or how to use it in Java, the curated class code examples below may help.


The IUpdater class belongs to the org.nd4j.linalg.learning.config package. Fifteen code examples of the class are shown below, sorted by popularity by default.
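Before the examples, here is a minimal sketch of how an IUpdater implementation is typically supplied to a DL4J network configuration. The choice of Adam and the layer sizes are illustrative assumptions, not taken from the examples below:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.learning.config.IUpdater;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class UpdaterConfigSketch {
    public static MultiLayerConfiguration build() {
        // Any IUpdater implementation (Adam, Nesterovs, Sgd, ...) can be passed here
        IUpdater updater = new Adam(1e-3);
        return new NeuralNetConfiguration.Builder()
                .updater(updater)   // used by all layers unless a layer overrides it
                .list()
                .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build())
                .layer(1, new OutputLayer.Builder()
                        .lossFunction(LossFunctions.LossFunction.MSE)
                        .nIn(10).nOut(10).build())
                .build();
    }
}
```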

Example 1: setLearningRate

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
private static void setLearningRate(MultiLayerNetwork net, int layerNumber, double newLr, ISchedule newLrSchedule, boolean refreshUpdater) {

    Layer l = net.getLayer(layerNumber).conf().getLayer();
    if (l instanceof BaseLayer) {
        BaseLayer bl = (BaseLayer) l;
        IUpdater u = bl.getIUpdater();
        if (u != null && u.hasLearningRate()) {
            if (newLrSchedule != null) {
                u.setLrAndSchedule(Double.NaN, newLrSchedule);
            } else {
                u.setLrAndSchedule(newLr, null);
            }
        }

        //Need to refresh the updater - if we change the LR (or schedule) we may rebuild the updater blocks, which are
        // built by creating blocks of params with the same configuration
        if (refreshUpdater) {
            refreshUpdater(net);
        }
    }
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 22, Source: NetworkUtils.java
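The private helper above is the backend of the public learning-rate setters in NetworkUtils. A hedged usage sketch, assuming the public overloads of org.deeplearning4j.util.NetworkUtils and an already-initialized network:

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.NetworkUtils;
import org.nd4j.linalg.schedule.ExponentialSchedule;
import org.nd4j.linalg.schedule.ScheduleType;

public class LearningRateSketch {
    // 'net' is assumed to be an already-initialized MultiLayerNetwork
    static void adjust(MultiLayerNetwork net) {
        // New fixed learning rate for every layer whose updater has one:
        NetworkUtils.setLearningRate(net, 1e-4);

        // Or give layer 0 a decaying schedule instead of a fixed value
        // (initial LR 1e-3, decay factor 0.99 per iteration):
        NetworkUtils.setLearningRate(net, 0,
                new ExponentialSchedule(ScheduleType.ITERATION, 1e-3, 0.99));
    }
}
```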

Example 2: getGraphConfCNN

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
private static ComputationGraphConfiguration getGraphConfCNN(int seed, IUpdater updater) {
    Nd4j.getRandom().setSeed(seed);
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.XAVIER).updater(updater).seed(seed).graphBuilder()
                    .addInputs("in")
                    .addLayer("0", new ConvolutionLayer.Builder().nOut(3).kernelSize(2, 2).stride(1, 1)
                                    .padding(0, 0).activation(Activation.TANH).build(), "in")
                    .addLayer("1", new ConvolutionLayer.Builder().nOut(3).kernelSize(2, 2).stride(1, 1)
                                    .padding(0, 0).activation(Activation.TANH).build(), "0")
                    .addLayer("2", new OutputLayer.Builder().lossFunction(LossFunctions.LossFunction.MSE).nOut(10)
                                    .build(), "1")
                    .setOutputs("2").setInputTypes(InputType.convolutional(10, 10, 3)).pretrain(false)
                    .backprop(true).build();
    return conf;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 17, Source: TestCompareParameterAveragingSparkVsSingleMachine.java

Example 3: testUpdaters

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
@Test
public void testUpdaters() {
    SparkDl4jMultiLayer sparkNet = getBasicNetwork();
    MultiLayerNetwork netCopy = sparkNet.getNetwork().clone();

    netCopy.fit(data);
    IUpdater expectedUpdater = ((BaseLayer) netCopy.conf().getLayer()).getIUpdater();
    double expectedLR = ((Nesterovs)((BaseLayer) netCopy.conf().getLayer()).getIUpdater()).getLearningRate();
    double expectedMomentum = ((Nesterovs)((BaseLayer) netCopy.conf().getLayer()).getIUpdater()).getMomentum();

    IUpdater actualUpdater = ((BaseLayer) sparkNet.getNetwork().conf().getLayer()).getIUpdater();
    sparkNet.fit(sparkData);
    double actualLR = ((Nesterovs)((BaseLayer) sparkNet.getNetwork().conf().getLayer()).getIUpdater()).getLearningRate();
    double actualMomentum = ((Nesterovs)((BaseLayer) sparkNet.getNetwork().conf().getLayer()).getIUpdater()).getMomentum();

    assertEquals(expectedUpdater, actualUpdater);
    assertEquals(expectedLR, actualLR, 0.01);
    assertEquals(expectedMomentum, actualMomentum, 0.01);

}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 21, Source: TestSparkMultiLayerParameterAveraging.java

Example 4: use

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
@OptionMetadata(
  displayName = "updater",
  description = "The updater to use (default = SGD).",
  commandLineParamName = "updater",
  commandLineParamSynopsis = "-updater <string>",
  displayOrder = 12
)
public IUpdater getUpdater() {
  return iUpdater;
}
 
Developer: Waikato, Project: wekaDeeplearning4j, Lines: 11, Source: NeuralNetConfiguration.java

Example 5: updaterConfigurationsEquals

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
public static boolean updaterConfigurationsEquals(Layer layer1, String param1, Layer layer2, String param2) {
    org.deeplearning4j.nn.conf.layers.Layer l1 = layer1.conf().getLayer();
    org.deeplearning4j.nn.conf.layers.Layer l2 = layer2.conf().getLayer();
    IUpdater u1 = l1.getUpdaterByParam(param1);
    IUpdater u2 = l2.getUpdaterByParam(param2);

    //For updaters to be equal (and hence combinable), we require that:
    //(a) The updater-specific configurations are equal (inc. LR, LR/momentum schedules etc)
    //(b) If one or more of the params are pretrainable params, they are in the same layer
    //    This last point is necessary as we don't want to modify the pretrain gradient/updater state during
    //    backprop, or modify the pretrain gradient/updater state of one layer while training another
    if (!u1.equals(u2)) {
        //Different updaters or different config
        return false;
    }

    boolean isPretrainParam1 = layer1.conf().getLayer().isPretrainParam(param1);
    boolean isPretrainParam2 = layer2.conf().getLayer().isPretrainParam(param2);
    if (isPretrainParam1 || isPretrainParam2) {
        //One or both of params are pretrainable.
        //Either layers differ -> don't want to combine a pretrain updaters across layers
        //Or one is pretrain and the other isn't -> don't want to combine pretrain updaters within a layer
        return layer1 == layer2 && isPretrainParam1 && isPretrainParam2;
    }

    return true;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 28, Source: UpdaterUtils.java
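Condition (a) in the example above reduces to value equality of the updater configuration objects. A small illustration, assuming the standard value-based equals of the nd4j updater config classes:

```java
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.learning.config.IUpdater;

public class UpdaterEqualitySketch {
    public static void main(String[] args) {
        IUpdater a = new Adam(1e-3);
        IUpdater b = new Adam(1e-3);  // same LR, same default betas/epsilon
        IUpdater c = new Adam(1e-4);  // different LR

        System.out.println(a.equals(b)); // identical configuration -> candidates for combining
        System.out.println(a.equals(c)); // different configuration -> not combinable
    }
}
```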

Example 6: getUpdaterByParam

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
@Override
public IUpdater getUpdaterByParam(String paramName) {
    switch (paramName) {
        case BatchNormalizationParamInitializer.BETA:
        case BatchNormalizationParamInitializer.GAMMA:
            return iUpdater;
        case BatchNormalizationParamInitializer.GLOBAL_MEAN:
        case BatchNormalizationParamInitializer.GLOBAL_VAR:
            return new NoOp();
        default:
            throw new IllegalArgumentException("Unknown parameter: \"" + paramName + "\"");
    }
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 14, Source: BatchNormalization.java

Example 7: getUpdaterByParam

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
/**
 * Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this
 * is not necessarily the case
 *
 * @param paramName    Parameter name
 * @return             IUpdater for the parameter
 */
@Override
public IUpdater getUpdaterByParam(String paramName) {
    if(biasUpdater != null && initializer().isBiasParam(this, paramName)){
        return biasUpdater;
    }
    return iUpdater;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 15, Source: BaseLayer.java
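The lookup above means a layer can carry two updaters at once. A configuration sketch (the updater choices are illustrative): with this builder, the getUpdaterByParam logic shown in the example resolves bias parameters to the Sgd instance and everything else to Adam:

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.learning.config.Sgd;

public class BiasUpdaterSketch {
    public static NeuralNetConfiguration.Builder builder() {
        return new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3))      // weights and other parameters
                .biasUpdater(new Sgd(1e-2));  // biases only
    }
}
```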

Example 8: getUpdaterByParam

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
@Override
public IUpdater getUpdaterByParam(String paramName) {
    // center loss utilizes alpha directly for this so any updater can be used for other layers
    switch (paramName) {
        case CenterLossParamInitializer.CENTER_KEY:
            return new NoOp();
        default:
            return iUpdater;
    }
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 11, Source: CenterLossOutputLayer.java

Example 9: getConf

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
private static MultiLayerConfiguration getConf(int seed, IUpdater updater) {
    Nd4j.getRandom().setSeed(seed);
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.XAVIER).updater(updater).seed(seed).list()
                    .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build()).layer(1, new OutputLayer.Builder()
                                    .lossFunction(LossFunctions.LossFunction.MSE).nIn(10).nOut(10).build())
                    .pretrain(false).backprop(true).build();
    return conf;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 11, Source: TestCompareParameterAveragingSparkVsSingleMachine.java

Example 10: getConfCNN

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
private static MultiLayerConfiguration getConfCNN(int seed, IUpdater updater) {
    Nd4j.getRandom().setSeed(seed);
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.XAVIER).updater(updater).seed(seed).list()
                    .layer(0, new ConvolutionLayer.Builder().nOut(3).kernelSize(2, 2).stride(1, 1).padding(0, 0)
                                    .activation(Activation.TANH).build())
                    .layer(1, new ConvolutionLayer.Builder().nOut(3).kernelSize(2, 2).stride(1, 1).padding(0, 0)
                                    .activation(Activation.TANH).build())
                    .layer(2, new OutputLayer.Builder().lossFunction(LossFunctions.LossFunction.MSE).nOut(10)
                                    .build())
                    .setInputType(InputType.convolutional(10, 10, 3)).pretrain(false).backprop(true).build();
    return conf;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 15, Source: TestCompareParameterAveragingSparkVsSingleMachine.java

Example 11: getGraphConf

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
private static ComputationGraphConfiguration getGraphConf(int seed, IUpdater updater) {
    Nd4j.getRandom().setSeed(seed);
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.XAVIER).updater(updater).seed(seed).graphBuilder()
                    .addInputs("in")
                    .addLayer("0", new DenseLayer.Builder().nIn(10).nOut(10).build(), "in").addLayer("1",
                                    new OutputLayer.Builder().lossFunction(LossFunctions.LossFunction.MSE).nIn(10)
                                                    .nOut(10).build(),
                                    "0")
                    .setOutputs("1").pretrain(false).backprop(true).build();
    return conf;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 14, Source: TestCompareParameterAveragingSparkVsSingleMachine.java

Example 12: setUpdater

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
public void setUpdater(IUpdater updater) {
  iUpdater = updater;
}
 
Developer: Waikato, Project: wekaDeeplearning4j, Lines: 4, Source: NeuralNetConfiguration.java

Example 13: copyConfigToLayer

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
private void copyConfigToLayer(String layerName, Layer layer) {

    if (layer.getIDropout() == null)
        layer.setIDropout(idropOut);

    if (layer instanceof BaseLayer) {
        BaseLayer bLayer = (BaseLayer) layer;
        if (Double.isNaN(bLayer.getL1()))
            bLayer.setL1(l1);
        if (Double.isNaN(bLayer.getL2()))
            bLayer.setL2(l2);
        if (bLayer.getActivationFn() == null)
            bLayer.setActivationFn(activationFn);
        if (bLayer.getWeightInit() == null)
            bLayer.setWeightInit(weightInit);
        if (Double.isNaN(bLayer.getBiasInit()))
            bLayer.setBiasInit(biasInit);

        //Configure weight noise:
        if (weightNoise != null && bLayer.getWeightNoise() == null) {
            bLayer.setWeightNoise(weightNoise.clone());
        }

        //Configure updaters:
        if (iUpdater != null && bLayer.getIUpdater() == null) {
            bLayer.setIUpdater(iUpdater);
        }
        if (biasUpdater != null && bLayer.getBiasUpdater() == null) {
            bLayer.setBiasUpdater(biasUpdater);
        }

        if (bLayer.getIUpdater() == null && iUpdater == null && bLayer.initializer().numParams(bLayer) > 0) {
            //No updater set anywhere
            IUpdater u = new Sgd();
            bLayer.setIUpdater(u);
            log.warn("*** No updater configuration is set for layer {} - defaulting to {} ***", layerName, u);
        }

        if (bLayer.getGradientNormalization() == null)
            bLayer.setGradientNormalization(gradientNormalization);
        if (Double.isNaN(bLayer.getGradientNormalizationThreshold()))
            bLayer.setGradientNormalizationThreshold(gradientNormalizationThreshold);
    }

    if (layer instanceof ActivationLayer) {
        ActivationLayer al = (ActivationLayer) layer;
        if (al.getActivationFn() == null)
            al.setActivationFn(activationFn);
    }
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 51, Source: NeuralNetConfiguration.java

Example 14: getUpdaterByParam

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
@Override
public IUpdater getUpdaterByParam(String paramName) {
    return null;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 5, Source: FrozenLayer.java

Example 15: biasUpdater

import org.nd4j.linalg.learning.config.IUpdater; // import the required package/class
/**
 * Gradient updater configuration, for the biases only. If not set, biases will use the updater as
 * set by {@link #updater(IUpdater)}
 *
 * @param updater Updater to use for bias parameters
 */
public Builder biasUpdater(IUpdater updater){
    this.biasUpdater = updater;
    return this;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 11, Source: NeuralNetConfiguration.java


Note: The org.nd4j.linalg.learning.config.IUpdater class examples in this article were compiled from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their authors, and copyright of the source code remains with the original authors. Please consult each project's license before distributing or using the code; do not reproduce this article without permission.