

Java SloppyMath.logAdd Method Code Examples

This article collects typical usage examples of the Java method edu.berkeley.nlp.math.SloppyMath.logAdd. If you are wondering what SloppyMath.logAdd does, how to call it, or what real uses of it look like, the curated examples below should help. You can also explore further usage examples of the enclosing class, edu.berkeley.nlp.math.SloppyMath.


Three code examples of the SloppyMath.logAdd method are shown below, sorted by popularity by default.
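Before the examples, it helps to know what SloppyMath.logAdd computes: given an array of log-domain values, it returns log(exp(x₁) + … + exp(xₙ)) without overflowing. The sketch below is our own minimal standalone version (class name `LogAddSketch` is ours, not from the library); it assumes the Berkeley NLP implementation uses the standard max-shift ("log-sum-exp") trick, which is how such methods are conventionally written.

```java
// Standalone sketch of the log-sum-exp computation behind
// SloppyMath.logAdd (assumption: the library uses the usual
// max-shift trick shown here).
public class LogAddSketch {

	/** Returns log(exp(x[0]) + ... + exp(x[n-1])) without overflow. */
	public static double logAdd(double[] logValues) {
		// Find the maximum term so we can factor it out.
		double max = Double.NEGATIVE_INFINITY;
		for (double v : logValues) {
			if (v > max) max = v;
		}
		if (max == Double.NEGATIVE_INFINITY) return max; // all terms are exp(-inf) = 0
		// Each shifted exponent v - max is <= 0, so Math.exp never overflows.
		double sum = 0.0;
		for (double v : logValues) {
			sum += Math.exp(v - max);
		}
		return max + Math.log(sum);
	}

	public static void main(String[] args) {
		// log(e^0 + e^0) = log 2 ≈ 0.6931
		System.out.println(logAdd(new double[] { 0.0, 0.0 }));
		// Works even where naive exp() would overflow:
		System.out.println(logAdd(new double[] { 1000.0, 1000.0 })); // 1000 + log 2
	}
}
```

Note that a naive `Math.log(Math.exp(1000.0) + Math.exp(1000.0))` would return infinity; the max-shift keeps every exponent at or below zero.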

Example 1: getLogProbabilities

import edu.berkeley.nlp.math.SloppyMath; // import the package/class this method depends on
/**
 * Calculate the log probabilities of each class, for the given datum
 * (feature bundle).
 */
public <F, L> double[] getLogProbabilities(EncodedDatum datum,
		double[] weights, Encoding<F, L> encoding,
		IndexLinearizer indexLinearizer) {
	// Compute unnormalized log probabilities
	int numSubLabels = encoding.getNumSubLabels();
	double[] logProbabilities = DoubleArrays.constantArray(0.0,
			numSubLabels);
	for (int i = 0; i < datum.getNumActiveFeatures(); i++) {
		int featureIndex = datum.getFeatureIndex(i);
		double featureCount = datum.getFeatureCount(i);
		for (int j = 0; j < numSubLabels; j++) {
			int index = indexLinearizer.getLinearIndex(featureIndex, j);
			double weight = weights[index];
			logProbabilities[j] += weight * featureCount;
		}
	}
	// Normalize
	double logNormalizer = SloppyMath.logAdd(logProbabilities);
	for (int i = 0; i < numSubLabels; i++) {
		logProbabilities[i] -= logNormalizer;
	}

	return logProbabilities;
}
 
Developer ID: text-machine-lab, Project: CliRel, Lines of code: 29, Source: ProperNameObjectiveFunction.java
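The key step in the example above is the normalization: subtracting `SloppyMath.logAdd(logProbabilities)` from every entry makes the exponentiated values sum to 1, turning unnormalized log scores into proper log probabilities. The standalone demo below (class and method names are ours, for illustration) inlines the log-sum-exp so it runs without the Berkeley NLP jar.

```java
// Hypothetical standalone demo of the normalization step used in
// getLogProbabilities: subtract logAdd of the scores so that the
// exponentiated results form a probability distribution.
public class LogNormalizeDemo {

	/** Returns log probabilities: scores[i] - log(sum_j exp(scores[j])). */
	public static double[] normalizeInLogSpace(double[] scores) {
		// Inlined log-sum-exp (the role SloppyMath.logAdd plays above).
		double max = Double.NEGATIVE_INFINITY;
		for (double s : scores) max = Math.max(max, s);
		double sum = 0.0;
		for (double s : scores) sum += Math.exp(s - max);
		double logNormalizer = max + Math.log(sum);

		double[] out = new double[scores.length];
		for (int i = 0; i < scores.length; i++) {
			out[i] = scores[i] - logNormalizer;
		}
		return out;
	}

	public static void main(String[] args) {
		double[] logProbs = normalizeInLogSpace(new double[] { 2.0, 1.0, 0.0 });
		double total = 0.0;
		for (double lp : logProbs) total += Math.exp(lp);
		System.out.println(total); // ≈ 1.0
	}
}
```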

Example 2: getLogProbabilities

import edu.berkeley.nlp.math.SloppyMath; // import the package/class this method depends on
/**
 * Calculate the log probabilities of each class, for the given datum
 * (feature bundle). Note that the weighted votes (referred to as
 * activations) are *almost* log probabilities, but need to be normalized.
 */
private static <F, L> double[] getLogProbabilities(EncodedDatum datum,
		double[] weights, Encoding<F, L> encoding,
		IndexLinearizer indexLinearizer) {

	double[] logProbabilities = new double[encoding.getNumLabels()];
	for (int labelIndex = 0; labelIndex < encoding.getNumLabels(); ++labelIndex) {
		for (int num = 0; num < datum.getNumActiveFeatures(); ++num) {
			int featureIndex = datum.getFeatureIndex(num);
			double featureCount = datum.getFeatureCount(num);
			int linearFeatureIndex = indexLinearizer.getLinearIndex(
					featureIndex, labelIndex);
			logProbabilities[labelIndex] += weights[linearFeatureIndex]
					* featureCount;
		}
	}

	double logSumProb = SloppyMath.logAdd(logProbabilities);
	for (int labelIndex = 0; labelIndex < encoding.getNumLabels(); ++labelIndex) {
		logProbabilities[labelIndex] -= logSumProb;
	}

	return logProbabilities;
}
 
Developer ID: text-machine-lab, Project: CliRel, Lines of code: 29, Source: MaximumEntropyClassifier.java

Example 3: makeProbsFromLogScoresInPlace

import edu.berkeley.nlp.math.SloppyMath; // import the package/class this method depends on
/**
 * Convert the given counter of log-domain scores into probabilities, in
 * place, by normalizing with SloppyMath.logAdd and exponentiating.
 * 
 * @param logScores the counter of log scores to convert; overwritten
 *            with the resulting probabilities
 * @param <K> the key type of the counter
 */
public static <K> void makeProbsFromLogScoresInPlace(Counter<K> logScores) {
	double logSum = SloppyMath.logAdd(logScores);
	for (Map.Entry<K, Double> entry : logScores.entrySet()) {
		double logScore = entry.getValue();
		double prob = Math.exp(logScore - logSum);
		entry.setValue(prob);
	}
	logScores.setDirty(true);
}
 
Developer ID: text-machine-lab, Project: CliRel, Lines of code: 15, Source: Counters.java
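Example 3 mutates a `Counter` in place, replacing each log score with `exp(logScore - logSum)`. Since the `Counter` class is not shown here, the sketch below performs the same conversion on a plain `java.util.Map` (the class name `MakeProbsDemo` and the Map-based signature are our substitutions, not the project's API); the probability math is identical.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of makeProbsFromLogScoresInPlace using a plain Map in place
// of the project's Counter class (an assumption for self-containment).
public class MakeProbsDemo {

	/** Replaces each log score with exp(score - logSum), in place. */
	public static <K> void makeProbsFromLogScoresInPlace(Map<K, Double> logScores) {
		// Inlined log-sum-exp, the role SloppyMath.logAdd plays above.
		double max = Double.NEGATIVE_INFINITY;
		for (double v : logScores.values()) max = Math.max(max, v);
		double sum = 0.0;
		for (double v : logScores.values()) sum += Math.exp(v - max);
		double logSum = max + Math.log(sum);

		// Overwrite each entry with its normalized probability.
		for (Map.Entry<K, Double> entry : logScores.entrySet()) {
			entry.setValue(Math.exp(entry.getValue() - logSum));
		}
	}

	public static void main(String[] args) {
		Map<String, Double> scores = new LinkedHashMap<>();
		scores.put("a", 0.0);
		scores.put("b", 0.0);
		makeProbsFromLogScoresInPlace(scores);
		System.out.println(scores); // {a=0.5, b=0.5}
	}
}
```

The `setDirty(true)` call in the original presumably invalidates a cached total inside `Counter`; a plain `Map` has no such cache, so the sketch omits it.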


Note: the edu.berkeley.nlp.math.SloppyMath.logAdd method examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs; the snippets were selected from open-source projects contributed by various developers. Copyright of the source code remains with the original authors; consult each project's license before distributing or using it. Do not reproduce without permission.