This article collects typical usage examples of Java's Optimizable.ByCombiningBatchGradient, a nested interface of cc.mallet.optimize.Optimizable. If you are unsure what Optimizable.ByCombiningBatchGradient is for, how to use it, or where to find working code, the curated examples below may help. You can also explore the enclosing class cc.mallet.optimize.Optimizable for further usage examples.
The following shows 3 code examples of Optimizable.ByCombiningBatchGradient, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
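For orientation, here is roughly what the interface used in these examples declares. This is a sketch paraphrased from the MALLET source; exact signatures may vary slightly between MALLET versions, so treat it as a guide rather than authoritative API documentation:

package cc.mallet.optimize;

import java.util.Collection;

public interface Optimizable {
    // ... core parameter accessors: getNumParameters(), getParameters(double[]),
    // setParameters(double[]), getParameter(int), setParameter(int, double) ...

    /**
     * An objective whose value and gradient are computed per batch and then
     * combined, which makes the per-batch work straightforward to parallelize.
     */
    public interface ByCombiningBatchGradient extends Optimizable {
        public int getNumBatches();
        public double getBatchValue(int batchIndex, int[] batchAssignments);
        public void getBatchValueGradient(double[] buffer, int batchIndex, int[] batchAssignments);
        public void combineGradients(Collection<double[]> batchGradients, double[] buffer);
    }
}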
Example 1: ThreadedOptimizable
import cc.mallet.optimize.Optimizable; // import the package/class this example depends on
/**
 * Initializes the optimizable and starts new threads.
 *
 * @param optimizable Optimizable to be parallelized
 * @param trainingSet Training instances; their indices are divided among the batches
 * @param numFactors Number of factors in the model's parameters, used to
 *        initialize the gradient
 * @param cacheIndicator Determines when the cached value/gradient become stale
 */
public ThreadedOptimizable(Optimizable.ByCombiningBatchGradient optimizable,
                           InstanceList trainingSet, int numFactors,
                           CacheStaleIndicator cacheIndicator) {
    // set up
    this.trainingSet = trainingSet;
    this.optimizable = optimizable;

    int numBatches = optimizable.getNumBatches();
    assert(numBatches > 0) : "Invalid number of batches: " + numBatches;
    batchCachedValue = new double[numBatches];
    batchCachedGradient = new ArrayList<double[]>(numBatches);
    for (int i = 0; i < numBatches; ++i) {
        batchCachedGradient.add(new double[numFactors]);
    }

    this.cacheIndicator = cacheIndicator;
    logger.info("Creating " + numBatches + " threads for updating gradient...");
    // one worker thread per batch
    executor = (ThreadPoolExecutor) Executors.newFixedThreadPool(numBatches);
    this.createTasks();
}
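As a follow-up, here is a minimal sketch of how this constructor is typically wired together, modeled on MALLET's CRFTrainerByThreadedLabelLikelihood. ThreadedCRFTrainingSketch is a hypothetical wrapper class written for illustration; it assumes ThreadedOptimizable and CRFCacheStaleIndicator live in cc.mallet.fst, as in recent MALLET releases, and that trainingData is an InstanceList of labeled sequences you already have:

import cc.mallet.fst.CRF;
import cc.mallet.fst.CRFCacheStaleIndicator;
import cc.mallet.fst.CRFOptimizableByBatchLabelLikelihood;
import cc.mallet.fst.ThreadedOptimizable;
import cc.mallet.optimize.LimitedMemoryBFGS;
import cc.mallet.optimize.Optimizable;
import cc.mallet.types.InstanceList;

public class ThreadedCRFTrainingSketch {
    /** Trains a linear-chain CRF using one gradient worker per batch. */
    public static CRF train(InstanceList trainingData, int numBatches) {
        CRF crf = new CRF(trainingData.getPipe(), null);
        crf.addFullyConnectedStatesForLabels();

        Optimizable.ByCombiningBatchGradient batchOptimizable =
            new CRFOptimizableByBatchLabelLikelihood(crf, trainingData, numBatches);
        ThreadedOptimizable threaded = new ThreadedOptimizable(
            batchOptimizable, trainingData,
            crf.getParameters().getNumFactors(),
            new CRFCacheStaleIndicator(crf));

        // ThreadedOptimizable implements Optimizable.ByGradientValue,
        // so a standard gradient-based optimizer can drive it.
        LimitedMemoryBFGS optimizer = new LimitedMemoryBFGS(threaded);
        try {
            optimizer.optimize();
        } finally {
            threaded.shutdown(); // stop the worker threads
        }
        return crf;
    }
}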
Example 2: newCRFOptimizable
import cc.mallet.optimize.Optimizable; // import the package/class this example depends on
public Optimizable.ByCombiningBatchGradient newCRFOptimizable(CRF crf, InstanceList trainingData, int numBatches) {
    return new CRFOptimizableByBatchLabelLikelihood(crf, trainingData, numBatches);
}
Author: kostagiolasn, Project: NucleosomePatternClassifier, Lines: 4, Source: CRFOptimizableByBatchLabelLikelihood.java
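If you want to implement the interface yourself instead of reusing CRFOptimizableByBatchLabelLikelihood, the hypothetical ToyBatchObjective below (not part of MALLET) shows the general shape of such an implementation. It maximizes f(x) = -Σ_i (x_i - t_i)^2 with the coordinates partitioned across batches, and it assumes batchAssignments carries a [start, end) index range for the batch, which appears to be the convention CRFOptimizableByBatchLabelLikelihood uses:

import java.util.Arrays;
import java.util.Collection;
import cc.mallet.optimize.Optimizable;

/** Toy objective: maximize f(x) = -sum_i (x_i - t_i)^2, coordinates split into batches. */
public class ToyBatchObjective implements Optimizable.ByCombiningBatchGradient {
    private final double[] params;
    private final double[] targets;
    private final int numBatches;

    public ToyBatchObjective(double[] targets, int numBatches) {
        this.targets = targets;
        this.params = new double[targets.length];
        this.numBatches = numBatches;
    }

    // --- Optimizable.ByCombiningBatchGradient ---
    public int getNumBatches() { return numBatches; }

    // batchAssignments is assumed to hold the [start, end) coordinate range.
    public double getBatchValue(int batchIndex, int[] batchAssignments) {
        double value = 0;
        for (int i = batchAssignments[0]; i < batchAssignments[1]; ++i) {
            double d = params[i] - targets[i];
            value -= d * d;
        }
        return value;
    }

    public void getBatchValueGradient(double[] buffer, int batchIndex, int[] batchAssignments) {
        Arrays.fill(buffer, 0); // buffer has full parameter length
        for (int i = batchAssignments[0]; i < batchAssignments[1]; ++i) {
            buffer[i] = -2 * (params[i] - targets[i]);
        }
    }

    // Batches cover disjoint coordinates, so combining is simple summation.
    public void combineGradients(Collection<double[]> batchGradients, double[] buffer) {
        Arrays.fill(buffer, 0);
        for (double[] g : batchGradients)
            for (int i = 0; i < buffer.length; ++i)
                buffer[i] += g[i];
    }

    // --- Optimizable ---
    public int getNumParameters() { return params.length; }
    public void getParameters(double[] out) { System.arraycopy(params, 0, out, 0, params.length); }
    public void setParameters(double[] in) { System.arraycopy(in, 0, params, 0, params.length); }
    public double getParameter(int i) { return params[i]; }
    public void setParameter(int i, double v) { params[i] = v; }
}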
Example 3: getOptimizable
import cc.mallet.optimize.Optimizable; // import the package/class this example depends on
public Optimizable.ByCombiningBatchGradient getOptimizable() {
    return optimizable;
}