This article collects typical usage examples of the Java method edu.berkeley.nlp.math.SloppyMath.logAdd. If you are unsure what SloppyMath.logAdd does or how to call it, the curated examples below may help; you can also explore the enclosing class edu.berkeley.nlp.math.SloppyMath for more context.
Three code examples of SloppyMath.logAdd are shown below, ordered by popularity.
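Before the examples, it helps to know what `SloppyMath.logAdd` computes over an array: the log of the sum of the exponentials of its inputs (log-sum-exp), typically implemented with a max-shift for numerical stability. The standalone sketch below (class name `LogAddSketch` is illustrative, not the Berkeley NLP source) shows the assumed behavior:

```java
public class LogAddSketch {
    // Computes log(sum_i exp(logV[i])), shifting by the max to avoid
    // overflow/underflow when the inputs have large magnitude.
    public static double logAdd(double[] logV) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logV) {
            max = Math.max(max, v);
        }
        if (max == Double.NEGATIVE_INFINITY) {
            return Double.NEGATIVE_INFINITY; // log of an empty/zero sum
        }
        double sum = 0.0;
        for (double v : logV) {
            sum += Math.exp(v - max);
        }
        return max + Math.log(sum);
    }
}
```

For example, `logAdd({log 2, log 3})` returns `log 5`, since exp(log 2) + exp(log 3) = 5.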
Example 1: getLogProbabilities
import edu.berkeley.nlp.math.SloppyMath; // import the package/class this method depends on
/**
 * Calculate the log probabilities of each class, for the given datum
 * (feature bundle).
 */
public <F, L> double[] getLogProbabilities(EncodedDatum datum,
        double[] weights, Encoding<F, L> encoding,
        IndexLinearizer indexLinearizer) {
    // Compute unnormalized log probabilities
    int numSubLabels = encoding.getNumSubLabels();
    double[] logProbabilities = DoubleArrays.constantArray(0.0, numSubLabels);
    for (int i = 0; i < datum.getNumActiveFeatures(); i++) {
        int featureIndex = datum.getFeatureIndex(i);
        double featureCount = datum.getFeatureCount(i);
        for (int j = 0; j < numSubLabels; j++) {
            int index = indexLinearizer.getLinearIndex(featureIndex, j);
            double weight = weights[index];
            logProbabilities[j] += weight * featureCount;
        }
    }
    // Normalize
    double logNormalizer = SloppyMath.logAdd(logProbabilities);
    for (int i = 0; i < numSubLabels; i++) {
        logProbabilities[i] -= logNormalizer;
    }
    return logProbabilities;
}
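The normalization step above uses `SloppyMath.logAdd` rather than summing `Math.exp` values directly because large-magnitude activations overflow `double`. The sketch below (class and method names are illustrative) contrasts the naive computation with the max-shifted one that `logAdd` is assumed to perform:

```java
public class WhyLogAdd {
    // Naive log-sum: Math.exp overflows to +Infinity for large inputs.
    public static double naiveLogSum(double[] x) {
        double s = 0.0;
        for (double v : x) {
            s += Math.exp(v);
        }
        return Math.log(s);
    }

    // Max-shifted log-sum: stable for any finite inputs, because every
    // exponent is <= 0 after subtracting the maximum.
    public static double stableLogSum(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) {
            max = Math.max(max, v);
        }
        double s = 0.0;
        for (double v : x) {
            s += Math.exp(v - max);
        }
        return max + Math.log(s);
    }
}
```

With activations like `{1000.0, 1000.0}`, the naive version returns positive infinity while the shifted version returns the correct `1000 + log 2`.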
Example 2: getLogProbabilities
import edu.berkeley.nlp.math.SloppyMath; // import the package/class this method depends on
/**
 * Calculate the log probabilities of each class, for the given datum
 * (feature bundle). Note that the weighted votes (referred to as
 * activations) are *almost* log probabilities, but need to be normalized.
 */
private static <F, L> double[] getLogProbabilities(EncodedDatum datum,
        double[] weights, Encoding<F, L> encoding,
        IndexLinearizer indexLinearizer) {
    double[] logProbabilities = new double[encoding.getNumLabels()];
    for (int labelIndex = 0; labelIndex < encoding.getNumLabels(); ++labelIndex) {
        for (int num = 0; num < datum.getNumActiveFeatures(); ++num) {
            int featureIndex = datum.getFeatureIndex(num);
            double featureCount = datum.getFeatureCount(num);
            int linearFeatureIndex = indexLinearizer.getLinearIndex(
                    featureIndex, labelIndex);
            logProbabilities[labelIndex] += weights[linearFeatureIndex]
                    * featureCount;
        }
    }
    double logSumProb = SloppyMath.logAdd(logProbabilities);
    for (int labelIndex = 0; labelIndex < encoding.getNumLabels(); ++labelIndex) {
        logProbabilities[labelIndex] -= logSumProb;
    }
    return logProbabilities;
}
Example 3: makeProbsFromLogScoresInPlace
import edu.berkeley.nlp.math.SloppyMath; // import the package/class this method depends on
/**
 * Convert the given log scores to probabilities, in place, by normalizing
 * with logAdd and exponentiating.
 *
 * @param logScores counter of log-domain scores; overwritten with probabilities
 * @param <K> the key type of the counter
 */
public static <K> void makeProbsFromLogScoresInPlace(Counter<K> logScores) {
    double logSum = SloppyMath.logAdd(logScores);
    for (Map.Entry<K, Double> entry : logScores.entrySet()) {
        double logScore = entry.getValue();
        double prob = Math.exp(logScore - logSum);
        entry.setValue(prob);
    }
    logScores.setDirty(true);
}