This article collects typical usage examples of the Java property org.nd4j.linalg.factory.Nd4j.MAX_ELEMENTS_PER_SLICE. If you are wondering what Nd4j.MAX_ELEMENTS_PER_SLICE is for and how to use it, the curated example below may help. (This static field controls how many elements ND4J prints per slice when rendering an INDArray; setting it to -1 disables truncation.) You can also look at the containing class, org.nd4j.linalg.factory.Nd4j, for further usage.
One code example of Nd4j.MAX_ELEMENTS_PER_SLICE is shown below.
Example 1: main
import java.io.IOException;

import org.deeplearning4j.datasets.iterator.DataSetIterator;
import org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator;
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.api.OptimizationAlgorithm;
import org.deeplearning4j.nn.conf.GradientNormalization;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.Updater;
import org.deeplearning4j.nn.conf.layers.RBM;
import org.deeplearning4j.nn.conf.layers.RBM.HiddenUnit;
import org.deeplearning4j.nn.conf.layers.RBM.VisibleUnit;
import org.deeplearning4j.nn.layers.factory.LayerFactories;
import org.deeplearning4j.nn.params.DefaultParamInitializer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Class wrapper, logger declaration, and imports added for completeness;
// they assume the DL4J 0.4.x-era API that this example targets.
public class RBMIrisExample {

    private static final Logger log = LoggerFactory.getLogger(RBMIrisExample.class);

    public static void main(String[] args) throws IOException {
        // -1 disables ND4J's print truncation so full arrays are logged
        Nd4j.MAX_SLICES_TO_PRINT = -1;
        Nd4j.MAX_ELEMENTS_PER_SLICE = -1;
        Nd4j.ENFORCE_NUMERICAL_STABILITY = true;

        final int numRows = 4;       // Iris has 4 input features
        final int numColumns = 1;
        int outputNum = 10;          // number of hidden units
        int numSamples = 150;
        int batchSize = 150;
        int iterations = 100;
        int seed = 123;
        int listenerFreq = iterations / 2;

        log.info("Load data....");
        DataSetIterator iter = new IrisDataSetIterator(batchSize, numSamples);
        DataSet iris = iter.next();
        iris.normalizeZeroMeanZeroUnitVariance();

        log.info("Build model....");
        NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                .regularization(true)
                .miniBatch(true)
                .layer(new RBM.Builder().l2(1e-1).l1(1e-3)
                        .nIn(numRows * numColumns)
                        .nOut(outputNum)
                        .activation("relu")
                        .weightInit(WeightInit.RELU)
                        .lossFunction(LossFunctions.LossFunction.RECONSTRUCTION_CROSSENTROPY)
                        .k(3)   // contrastive divergence steps
                        .hiddenUnit(HiddenUnit.RECTIFIED)
                        .visibleUnit(VisibleUnit.GAUSSIAN)
                        .updater(Updater.ADAGRAD)
                        .gradientNormalization(GradientNormalization.ClipL2PerLayer)
                        .build())
                .seed(seed)
                .iterations(iterations)
                .learningRate(1e-3)
                .optimizationAlgo(OptimizationAlgorithm.LBFGS)
                .build();
        Layer model = LayerFactories.getFactory(conf.getLayer()).create(conf);
        model.setListeners(new ScoreIterationListener(listenerFreq));

        log.info("Evaluate weights....");
        INDArray w = model.getParam(DefaultParamInitializer.WEIGHT_KEY);
        log.info("Weights: " + w);

        log.info("Scaling the dataset");
        iris.scale();

        log.info("Train model....");
        for (int i = 0; i < 20; i++) {
            log.info("Epoch " + i + ":");
            model.fit(iris.getFeatureMatrix());
        }
    }
}