

Java ScoreIterationListener Class Code Examples

This article compiles typical usage examples of the Java class org.deeplearning4j.optimize.listeners.ScoreIterationListener. If you have been wondering what the ScoreIterationListener class does, how to use it, or where to find examples of it, the curated class code examples below may help.


The ScoreIterationListener class belongs to the org.deeplearning4j.optimize.listeners package. 15 code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
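As a quick orientation before the examples: ScoreIterationListener prints the network score every N iterations, where N is the frequency passed to its constructor. The following is a minimal, dependency-free sketch of that every-N-iterations gating; the class and method names here are illustrative, not part of DL4J.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Minimal, dependency-free sketch of the every-N-iterations gating that
 * ScoreIterationListener applies when deciding whether to print the score.
 * The class and method names are illustrative, not part of DL4J.
 */
public class FrequencyGateSketch {

    /** Returns the iteration indices at which a listener with the given frequency would log. */
    static List<Integer> loggedIterations(int totalIterations, int frequency) {
        List<Integer> logged = new ArrayList<>();
        for (int iter = 0; iter < totalIterations; iter++) {
            // Log whenever the iteration count is a multiple of the frequency
            if (frequency > 0 && iter % frequency == 0) {
                logged.add(iter);
            }
        }
        return logged;
    }

    public static void main(String[] args) {
        // A frequency of 3 over 10 iterations logs at iterations 0, 3, 6 and 9
        System.out.println(loggedIterations(10, 3));
    }
}
```

Note that a frequency of 0 logs nothing, which is why computing the frequency from a small iteration count (as in Example 12 below) needs care.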

Example 1: createNetwork

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
private MultiLayerNetwork createNetwork(int numLabels) {
  MultiLayerNetwork network = null;
  switch (modelType) {
    case "LeNet":
      network = lenetModel(numLabels);
      break;
    case "AlexNet":
      network = alexnetModel(numLabels);
      break;
    case "custom":
      network = customModel(numLabels);
      break;
    default:
      throw new InvalidInputTypeException("Incorrect model provided.");
  }
  network.init();
  network.setListeners(new ScoreIterationListener(listenerFreq));
  return network;
}
 
Developer: MyRobotLab, Project: myrobotlab, Lines: 20, Source: Deeplearning4j.java

Example 2: testNoImprovementNEpochsTermination

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
public void testNoImprovementNEpochsTermination() {
    //Idea: terminate training if score (test set loss) does not improve for 5 consecutive epochs
    //Simulate this by setting LR = 0.0

    Nd4j.getRandom().setSeed(12345);
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().seed(12345)
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .updater(new Sgd(0.0)).weightInit(WeightInit.XAVIER).list()
                    .layer(0, new OutputLayer.Builder().nIn(4).nOut(3)
                                    .lossFunction(LossFunctions.LossFunction.MCXENT).build())
                    .pretrain(false).backprop(true).build();
    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.setListeners(new ScoreIterationListener(1));

    DataSetIterator irisIter = new IrisDataSetIterator(150, 150);

    EarlyStoppingModelSaver<MultiLayerNetwork> saver = new InMemoryModelSaver<>();
    EarlyStoppingConfiguration<MultiLayerNetwork> esConf =
                    new EarlyStoppingConfiguration.Builder<MultiLayerNetwork>()
                                    .epochTerminationConditions(new MaxEpochsTerminationCondition(100),
                                                    new ScoreImprovementEpochTerminationCondition(5))
                                    .iterationTerminationConditions(
                                                    new MaxTimeIterationTerminationCondition(3, TimeUnit.SECONDS),
                                                    new MaxScoreIterationTerminationCondition(7.5)) //Initial score is ~2.5
                                    .scoreCalculator(new DataSetLossCalculator(irisIter, true)).modelSaver(saver)
                                    .build();

    IEarlyStoppingTrainer trainer = new EarlyStoppingTrainer(esConf, net, irisIter);
    EarlyStoppingResult result = trainer.fit();

    //Expect no score change due to 0 LR -> terminate after 6 total epochs
    assertEquals(6, result.getTotalEpochs());
    assertEquals(0, result.getBestModelEpoch());
    assertEquals(EarlyStoppingResult.TerminationReason.EpochTerminationCondition, result.getTerminationReason());
    String expDetails = new ScoreImprovementEpochTerminationCondition(5).toString();
    assertEquals(expDetails, result.getTerminationDetails());
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 39, Source: TestEarlyStopping.java
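The test above relies on ScoreImprovementEpochTerminationCondition stopping training once the score has failed to improve for 5 consecutive epochs. A minimal, dependency-free sketch of that rule follows; it is an illustration, not the DL4J source.

```java
/**
 * Minimal, dependency-free sketch of the "stop when the score has not improved
 * for N consecutive epochs" rule that ScoreImprovementEpochTerminationCondition
 * implements. Illustrative only; this is not the DL4J source.
 */
public class NoImprovementSketch {

    /** Returns the zero-based epoch index at which training terminates. */
    static int stoppingEpoch(double[] epochScores, int patience) {
        double best = Double.MAX_VALUE;
        int epochsWithoutImprovement = 0;
        for (int epoch = 0; epoch < epochScores.length; epoch++) {
            if (epochScores[epoch] < best) {
                best = epochScores[epoch];  // new best score: reset the counter
                epochsWithoutImprovement = 0;
            } else {
                epochsWithoutImprovement++; // no improvement this epoch
            }
            if (epochsWithoutImprovement >= patience) {
                return epoch;               // patience exhausted: terminate
            }
        }
        return epochScores.length - 1;      // ran out of epochs first
    }

    public static void main(String[] args) {
        // A constant score (as produced by the zero learning rate above): the first
        // epoch sets the best, then five non-improving epochs follow.
        double[] constant = {2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5};
        System.out.println(stoppingEpoch(constant, 5)); // stops at epoch index 5
    }
}
```

Stopping at zero-based epoch 5 means six epochs ran in total, which matches the assertEquals(6, result.getTotalEpochs()) assertion in the test.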

Example 3: testSetParams

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
public void testSetParams() {
    NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.LINE_GRADIENT_DESCENT)
                    .updater(new Sgd(1e-1))
                    .layer(new org.deeplearning4j.nn.conf.layers.OutputLayer.Builder().nIn(4).nOut(3)
                                    .weightInit(WeightInit.ZERO).activation(Activation.SOFTMAX)
                                    .lossFunction(LossFunctions.LossFunction.MCXENT).build())
                    .build();

    int numParams = conf.getLayer().initializer().numParams(conf);
    INDArray params = Nd4j.create(1, numParams);
    OutputLayer l = (OutputLayer) conf.getLayer().instantiate(conf,
                    Collections.<IterationListener>singletonList(new ScoreIterationListener(1)), 0, params, true);
    params = l.params();
    l.setParams(params);
    assertEquals(params, l.params());
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 19, Source: OutputLayerTest.java

Example 4: testAutoEncoder

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
public void testAutoEncoder() throws Exception {

    MnistDataFetcher fetcher = new MnistDataFetcher(true);
    NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.LINE_GRADIENT_DESCENT).updater(new Sgd(0.1))
                    .layer(new org.deeplearning4j.nn.conf.layers.AutoEncoder.Builder().nIn(784).nOut(600)
                                    .corruptionLevel(0.6)
                                    .lossFunction(LossFunctions.LossFunction.RECONSTRUCTION_CROSSENTROPY).build())
                    .build();


    fetcher.fetch(100);
    DataSet d2 = fetcher.next();

    INDArray input = d2.getFeatureMatrix();
    int numParams = conf.getLayer().initializer().numParams(conf);
    INDArray params = Nd4j.create(1, numParams);
    AutoEncoder da = (AutoEncoder) conf.getLayer().instantiate(conf,
                    Arrays.<IterationListener>asList(new ScoreIterationListener(1)), 0, params, true);
    assertEquals(da.params(), da.params());
    assertEquals(471784, da.params().length());
    da.setParams(da.params());
    da.setBackpropGradientsViewArray(Nd4j.create(1, params.length()));
    da.fit(input);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 27, Source: AutoEncoderTest.java

Example 5: testMNISTConfig

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
@Ignore //Should be run manually
public void testMNISTConfig() throws Exception {
    int batchSize = 64; // Test batch size
    DataSetIterator mnistTrain = new MnistDataSetIterator(batchSize, true, 12345);

    ComputationGraph net = getCNNMnistConfig();
    net.init();
    net.setListeners(new ScoreIterationListener(1));

    for (int i = 0; i < 50; i++) {
        net.fit(mnistTrain.next());
        Thread.sleep(1000);
    }

    Thread.sleep(100000);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 18, Source: CenterLossOutputLayerTest.java

Example 6: testIterationListener

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
public void testIterationListener() {
    MultiLayerNetwork model1 = new MultiLayerNetwork(getConf());
    model1.init();
    model1.setListeners(Collections.singletonList((IterationListener) new ScoreIterationListener(1)));

    MultiLayerNetwork model2 = new MultiLayerNetwork(getConf());
    model2.setListeners(Collections.singletonList((IterationListener) new ScoreIterationListener(1)));
    model2.init();

    Layer[] l1 = model1.getLayers();
    for (int i = 0; i < l1.length; i++)
        assertTrue(l1[i].getListeners() != null && l1[i].getListeners().size() == 1);

    Layer[] l2 = model2.getLayers();
    for (int i = 0; i < l2.length; i++)
        assertTrue(l2[i].getListeners() != null && l2[i].getListeners().size() == 1);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 19, Source: MultiLayerNeuralNetConfigurationTest.java

Example 7: testCifarDataSetIteratorReset

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Ignore // use when checking cifar dataset iterator
@Test
public void testCifarDataSetIteratorReset() {
    int epochs = 2;
    Nd4j.getRandom().setSeed(12345);
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .weightInit(WeightInit.XAVIER).seed(12345L).list()
                    .layer(0, new DenseLayer.Builder().nIn(400).nOut(50).activation(Activation.RELU).build())
                    .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(50).nOut(10).build())
                    .pretrain(false).backprop(true)
                    .inputPreProcessor(0, new CnnToFeedForwardPreProcessor(20, 20, 1)).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    net.setListeners(new ScoreIterationListener(1));

    MultipleEpochsIterator ds =
                    new MultipleEpochsIterator(epochs, new CifarDataSetIterator(10, 20, new int[] {20, 20, 1}));
    net.fit(ds);
    assertEquals(epochs, ds.epochs);
    assertEquals(2, ds.batch);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 24, Source: MultipleEpochsIteratorTest.java

Example 8: testRemoteFull

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
@Ignore
public void testRemoteFull() throws Exception {
    //Use this in conjunction with startRemoteUI()

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).list()
                    .layer(0, new DenseLayer.Builder().activation(Activation.TANH).nIn(4).nOut(4).build())
                    .layer(1, new OutputLayer.Builder().lossFunction(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(4).nOut(3).build())
                    .pretrain(false).backprop(true).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    StatsStorageRouter ssr = new RemoteUIStatsStorageRouter("http://localhost:9000");
    net.setListeners(new StatsListener(ssr), new ScoreIterationListener(1));

    DataSetIterator iter = new IrisDataSetIterator(150, 150);

    for (int i = 0; i < 500; i++) {
        net.fit(iter);
        Thread.sleep(100);
    }

}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 27, Source: TestRemoteReceiver.java

Example 9: testNetwork

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
public void testNetwork() {
    DatasetFacade df = DatasetFacade.dataRows(sqlContext.read().json("src/test/resources/dl4jnetwork"));
    Pipeline p = new Pipeline().setStages(new PipelineStage[] {getAssembler(new String[] {"x", "y"}, "features")});
    DatasetFacade part2 = DatasetFacade.dataRows(p.fit(df.get()).transform(df.get()).select("features", "label"));

    ParamSerializer ps = new ParamHelper();
    MultiLayerConfiguration mc = getNNConfiguration();
    Collection<IterationListener> il = new ArrayList<>();
    il.add(new ScoreIterationListener(1));

    SparkDl4jNetwork sparkDl4jNetwork =
                    new SparkDl4jNetwork(mc, 2, ps, 1, il, true).setFeaturesCol("features").setLabelCol("label");

    SparkDl4jModel sm = sparkDl4jNetwork.fit(part2.get());
    MultiLayerNetwork mln = sm.getMultiLayerNetwork();
    Assert.assertNotNull(mln);
    DatasetFacade transformed = DatasetFacade.dataRows(sm.transform(part2.get()));
    List<?> rows = transformed.get().collectAsList();
    Assert.assertNotNull(sm.getTrainingStats());
    Assert.assertNotNull(rows);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 23, Source: SparkDl4jNetworkTest.java

Example 10: testNetworkLoader

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
@Test
public void testNetworkLoader() throws Exception {
    DatasetFacade df = DatasetFacade.dataRows(sqlContext.read().json("src/test/resources/dl4jnetwork"));
    Pipeline p = new Pipeline().setStages(new PipelineStage[] {getAssembler(new String[] {"x", "y"}, "features")});
    DatasetFacade part2 = DatasetFacade.dataRows(p.fit(df.get()).transform(df.get()).select("features", "label"));

    ParamSerializer ps = new ParamHelper();
    MultiLayerConfiguration mc = getNNConfiguration();
    Collection<IterationListener> il = new ArrayList<>();
    il.add(new ScoreIterationListener(1));

    SparkDl4jNetwork sparkDl4jNetwork =
                    new SparkDl4jNetwork(mc, 2, ps, 1, il, true).setFeaturesCol("features").setLabelCol("label");

    String fileName = UUID.randomUUID().toString();
    SparkDl4jModel sm = sparkDl4jNetwork.fit(part2.get());
    sm.write().overwrite().save(fileName);
    SparkDl4jModel spdm = SparkDl4jModel.load(fileName);
    Assert.assertNotNull(spdm);

    File file1 = new File(fileName);
    File file2 = new File(fileName + "_metadata");
    FileUtils.deleteDirectory(file1);
    FileUtils.deleteDirectory(file2);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 26, Source: SparkDl4jNetworkTest.java

Example 11: logScore

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
public Builder logScore() {
    if (!m_network.isModelInitialized()) {
        throw new RuntimeException("Model is not yet initialized. You can initialize the model first with e.g. configuration() or loadFromFile()");
    }

    m_network.setListeners(new ScoreIterationListener(1));
    return this;
}
 
Developer: braeunlich, Project: anagnostes, Lines: 9, Source: NeuralNetwork.java
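Example 11 demonstrates a guard-then-chain builder step: fail fast if the model is uninitialized, otherwise attach the listener and return this. A small self-contained sketch of that pattern follows; BuilderGuardSketch and its methods are hypothetical stand-ins, not the anagnostes API.

```java
/**
 * Self-contained sketch of the guard-then-chain builder step shown in
 * Example 11. BuilderGuardSketch and its methods are hypothetical, not
 * the anagnostes API.
 */
public class BuilderGuardSketch {
    private boolean initialized = false;
    private boolean scoreLogged = false;

    public BuilderGuardSketch configuration() {
        initialized = true;     // stands in for building/initializing the network
        return this;
    }

    public BuilderGuardSketch logScore() {
        if (!initialized) {
            // Guard: attaching a listener to an uninitialized model is a programming error
            throw new IllegalStateException(
                    "Model is not yet initialized. Call configuration() or loadFromFile() first.");
        }
        scoreLogged = true;     // stands in for setListeners(new ScoreIterationListener(1))
        return this;            // returning this keeps the builder chainable
    }

    public boolean isScoreLogged() {
        return scoreLogged;
    }
}
```

Returning this from each step keeps calls chainable (e.g. builder.configuration().logScore()), while the guard turns a silent misuse into an immediate, descriptive failure.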

Example 12: main

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
public static void main(String[] args) throws Exception {
    final int numRows = 28;
    final int numColumns = 28;
    int seed = 123;
    int numSamples = MnistDataFetcher.NUM_EXAMPLES;
    int batchSize = 1000;
    int iterations = 1;
    int listenerFreq = Math.max(1, iterations / 5); // guard: with iterations = 1, iterations/5 would be 0

    log.info("Load data....");
    DataSetIterator iter = new MnistDataSetIterator(batchSize,numSamples,true);

    log.info("Build model....");
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(seed)
            .iterations(iterations)
            .optimizationAlgo(OptimizationAlgorithm.LINE_GRADIENT_DESCENT)
            .list(8)
            .layer(0, new RBM.Builder().nIn(numRows * numColumns).nOut(2000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(1, new RBM.Builder().nIn(2000).nOut(1000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(2, new RBM.Builder().nIn(1000).nOut(500).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(3, new RBM.Builder().nIn(500).nOut(30).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(4, new RBM.Builder().nIn(30).nOut(500).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build()) 
            .layer(5, new RBM.Builder().nIn(500).nOut(1000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(6, new RBM.Builder().nIn(1000).nOut(2000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(7, new OutputLayer.Builder(LossFunctions.LossFunction.MSE).activation(Activation.SIGMOID).nIn(2000).nOut(numRows*numColumns).build())
            .pretrain(true).backprop(true)
            .build();

    MultiLayerNetwork model = new MultiLayerNetwork(conf);
    model.init();

    model.setListeners(new ScoreIterationListener(listenerFreq));

    log.info("Train model....");
    while(iter.hasNext()) {
        DataSet next = iter.next();
        model.fit(new DataSet(next.getFeatureMatrix(),next.getFeatureMatrix()));
    }
}
 
Developer: PacktPublishing, Project: Deep-Learning-with-Hadoop, Lines: 41, Source: DeepAutoEncoder.java

Example 13: main

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
public static void main(String[] args) throws IOException {
    Nd4j.getMemoryManager().setAutoGcWindow(GC_WINDOW);
    // create dictionaries
    Dictionaries.createDictionary(CORPUS_FILENAME, ROW_SIZE);
    // try to load network file
    File networkFile = new File(obtainFilePath(MODEL_FILENAME));
    int offset = 0;
    if (networkFile.exists()) {
        System.out.println("Loading the existing network...");
        net = ModelSerializer.restoreComputationGraph(networkFile);
        System.out.print("Enter d to start dialog or a number to continue training from that minibatch: ");
        String input;
        try (Scanner scanner = new Scanner(System.in)) {
            input = scanner.nextLine();
            if (input.equalsIgnoreCase("d")) {
                startDialog(scanner);
            } else {
                offset = Integer.valueOf(input);
                test();
            }
        }
    } else {
        System.out.println("Creating a new network...");
        net = ConstructGraph.createComputationGraph(Dictionaries.dict);
    }
    System.out.println("Number of parameters: " + net.numParams());
    net.setListeners(new ScoreIterationListener(1));
    train(networkFile, offset);
}
 
Developer: IsaacChanghau, Project: NeuralNetworksLite, Lines: 30, Source: DialogueLSTMRNN.java

Example 14: main

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
public static void main(String[] args){

        //Generate the training data
        DataSetIterator iterator = getTrainingData(batchSize,rng);

        //Create the network
        int numInput = 2;
        int numOutputs = 1;
        int nHidden = 10;
        MultiLayerNetwork net = new MultiLayerNetwork(new NeuralNetConfiguration.Builder()
                .seed(seed)
                .iterations(iterations)
                .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                .learningRate(learningRate)
                .weightInit(WeightInit.XAVIER)
                .updater(Updater.NESTEROVS).momentum(0.9)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(numInput).nOut(nHidden)
                        .activation(Activation.TANH)
                        .build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                        .activation(Activation.IDENTITY)
                        .nIn(nHidden).nOut(numOutputs).build())
                .pretrain(false).backprop(true).build()
        );
        net.init();
        net.setListeners(new ScoreIterationListener(1));


        //Train the network on the full data set, and evaluate it periodically
        for( int i=0; i<nEpochs; i++ ){
            iterator.reset();
            net.fit(iterator);
        }
        // Test the addition of 2 numbers (Try different numbers here)
        final INDArray input = Nd4j.create(new double[] { 0.111111, 0.3333333333333 }, new int[] { 1, 2 });
        INDArray out = net.output(input, false);
        System.out.println(out);

    }
 
Developer: IsaacChanghau, Project: NeuralNetworksLite, Lines: 41, Source: RegressionSum.java

Example 15: main

import org.deeplearning4j.optimize.listeners.ScoreIterationListener; //import the required package/class
public static void main(final String[] args){

        //Switch these two options to do different functions with different networks
        final MathFunction fn = new SinXDivXMathFunction();
        final MultiLayerConfiguration conf = getDeepDenseLayerNetworkConfiguration();

        //Generate the training data
        final INDArray x = Nd4j.linspace(-10,10,nSamples).reshape(nSamples, 1);
        final DataSetIterator iterator = getTrainingData(x,fn,batchSize,rng);

        //Create the network
        final MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        net.setListeners(new ScoreIterationListener(1));


        //Train the network on the full data set, and evaluate it periodically
        final INDArray[] networkPredictions = new INDArray[nEpochs/ plotFrequency];
        for( int i=0; i<nEpochs; i++ ){
            iterator.reset();
            net.fit(iterator);
            if((i+1) % plotFrequency == 0) networkPredictions[i/ plotFrequency] = net.output(x, false);
        }

        //Plot the target data and the network predictions
        plot(fn,x,fn.getFunctionValues(x),networkPredictions);
    }
 
Developer: IsaacChanghau, Project: NeuralNetworksLite, Lines: 28, Source: RegressionMathFunctions.java


Note: The org.deeplearning4j.optimize.listeners.ScoreIterationListener class examples in this article were compiled by 纯净天空 from GitHub, MSDocs, and other open-source code and documentation platforms. The code snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors. For distribution and use, please refer to the License of the corresponding project. Do not reproduce without permission.