

Java Nd4j.ENFORCE_NUMERICAL_STABILITY field: code examples

This article collects typical usage examples of the Nd4j.ENFORCE_NUMERICAL_STABILITY field from the Java class org.nd4j.linalg.factory.Nd4j. If you are wondering what Nd4j.ENFORCE_NUMERICAL_STABILITY does or how to use it, the curated examples below may help. You can also explore further usage examples of the enclosing class, org.nd4j.linalg.factory.Nd4j.


Below are 2 code examples of the Nd4j.ENFORCE_NUMERICAL_STABILITY field, sorted by popularity.
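ENFORCE_NUMERICAL_STABILITY is a public static boolean on the Nd4j class: a global switch that, when enabled, asks ND4J to guard numerically sensitive operations (for example, clamping values before a logarithm so that log(0) does not produce -Infinity). As a minimal, self-contained sketch of this global-flag pattern, the stand-in class below mirrors the idea without requiring ND4J on the classpath; the class name, the clamp threshold, and the safeLog helper are illustrative assumptions, not ND4J internals:

```java
// Stand-in illustrating the global-flag pattern behind Nd4j.ENFORCE_NUMERICAL_STABILITY.
// The real flag lives on org.nd4j.linalg.factory.Nd4j; this sketch only mirrors the idea.
public class StabilityDemo {
    // Analogous to Nd4j.ENFORCE_NUMERICAL_STABILITY: a mutable global switch.
    public static boolean ENFORCE_NUMERICAL_STABILITY = false;

    // When the flag is on, clamp the argument away from zero before taking the log,
    // the kind of guard a numerical-stability mode applies internally.
    static double safeLog(double x) {
        if (ENFORCE_NUMERICAL_STABILITY && x < 1e-10) {
            x = 1e-10;
        }
        return Math.log(x);
    }

    public static void main(String[] args) {
        // Flag off: log(0) is -Infinity.
        System.out.println(Double.isInfinite(safeLog(0.0)));
        ENFORCE_NUMERICAL_STABILITY = true;
        // Flag on: the argument is clamped, so the result is finite.
        System.out.println(Double.isInfinite(safeLog(0.0)));
    }
}
```

Both examples below simply set the flag to true once, before building and training a network, which is also how the flag is typically used in practice.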

Example 1: main

public static void main(String[] args) throws IOException {
    Nd4j.MAX_SLICES_TO_PRINT = -1;
    Nd4j.MAX_ELEMENTS_PER_SLICE = -1;
    Nd4j.ENFORCE_NUMERICAL_STABILITY = true;
    final int numRows = 4;
    final int numColumns = 1;
    int outputNum = 10;
    int numSamples = 150;
    int batchSize = 150;
    int iterations = 100;
    int seed = 123;
    int listenerFreq = iterations/2;

    log.info("Load data....");
    DataSetIterator iter = new IrisDataSetIterator(batchSize, numSamples);
    
    DataSet iris = iter.next();

    iris.normalizeZeroMeanZeroUnitVariance();

    log.info("Build model....");
    NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
            .regularization(true)
            .miniBatch(true)
            .layer(new RBM.Builder().l2(1e-1).l1(1e-3)
                    .nIn(numRows * numColumns)
                    .nOut(outputNum)
                    .activation("relu")
                    .weightInit(WeightInit.RELU)
                    .lossFunction(LossFunctions.LossFunction.RECONSTRUCTION_CROSSENTROPY).k(3)
                    .hiddenUnit(HiddenUnit.RECTIFIED).visibleUnit(VisibleUnit.GAUSSIAN)
                    .updater(Updater.ADAGRAD).gradientNormalization(GradientNormalization.ClipL2PerLayer)
                    .build())
            .seed(seed)
            .iterations(iterations)
            .learningRate(1e-3)
            .optimizationAlgo(OptimizationAlgorithm.LBFGS)
            .build();
    Layer model = LayerFactories.getFactory(conf.getLayer()).create(conf);
    model.setListeners(new ScoreIterationListener(listenerFreq));

    log.info("Evaluate weights....");
    INDArray w = model.getParam(DefaultParamInitializer.WEIGHT_KEY);
    log.info("Weights: " + w);
    log.info("Scaling the dataset");
    iris.scale();
    log.info("Train model....");
    for(int i = 0; i < 20; i++) {
        log.info("Epoch "+i+":");
        model.fit(iris.getFeatureMatrix());
    }

}
Developer ID: PacktPublishing, Project: Deep-Learning-with-Hadoop, Lines: 54, Source: RBM.java

Example 2: testFromSvmLightBackprop

@Test
public void testFromSvmLightBackprop() throws Exception {
    JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc.sc(), new ClassPathResource("iris_svmLight_0.txt").getTempFileFromArchive().getAbsolutePath()).toJavaRDD().map(new Function<LabeledPoint, LabeledPoint>() {
        @Override
        public LabeledPoint call(LabeledPoint v1) throws Exception {
            return new LabeledPoint(v1.label(), Vectors.dense(v1.features().toArray()));
        }
    }).cache();
    Nd4j.ENFORCE_NUMERICAL_STABILITY = true;

    DataSet d = new IrisDataSetIterator(150,150).next();
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(123)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .iterations(10)
            .list()
            .layer(0, new DenseLayer.Builder()
                    .nIn(4).nOut(100)
                    .weightInit(WeightInit.XAVIER)
                    .activation("relu")
                    .build())
            .layer(1, new org.deeplearning4j.nn.conf.layers.OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                    .nIn(100).nOut(3)
                    .activation("softmax")
                    .weightInit(WeightInit.XAVIER)
                    .build())
            .backprop(true)
            .build();

    MultiLayerNetwork network = new MultiLayerNetwork(conf);
    network.init();
    System.out.println("Initializing network");

    SparkDl4jMultiLayer master = new SparkDl4jMultiLayer(sc,conf,new ParameterAveragingTrainingMaster(true,numExecutors(),1,5,1,0));

    MultiLayerNetwork network2 = master.fitLabeledPoint(data);
    Evaluation evaluation = new Evaluation();
    evaluation.eval(d.getLabels(), network2.output(d.getFeatureMatrix()));
    System.out.println(evaluation.stats());
}
Developer ID: PacktPublishing, Project: Deep-Learning-with-Hadoop, Lines: 44, Source: TestSparkMultiLayerParameterAveraging.java


Note: the org.nd4j.linalg.factory.Nd4j.ENFORCE_NUMERICAL_STABILITY examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers; copyright in the source code remains with the original authors. For distribution and use, please refer to each project's license. Do not republish without permission.