This article collects typical usage examples of the Java method cc.mallet.types.Instance.unLock. If you are wondering what Instance.unLock does, how to use it, or would like to see concrete examples, the curated method samples below may help. You can also read further about the enclosing class, cc.mallet.types.Instance.
Two code examples of the Instance.unLock method are shown below, ordered by popularity.
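Before the examples, here is a minimal self-contained sketch of the lock/unLock pattern that cc.mallet.types.Instance follows: once an instance is locked, mutators refuse changes until unLock is called. The class below is a hypothetical stand-in written for illustration, not MALLET code.

```java
// Hypothetical stand-in illustrating the lock/unLock pattern of
// cc.mallet.types.Instance: mutation is refused while the object is locked.
public class LockableInstance {
    private Object data;
    private boolean locked = false;

    public LockableInstance(Object data) { this.data = data; }

    public void lock()        { locked = true; }
    public void unLock()      { locked = false; }
    public boolean isLocked() { return locked; }

    public void setData(Object data) {
        if (locked)
            throw new IllegalStateException("Instance is locked; call unLock() first");
        this.data = data;
    }

    public Object getData() { return data; }

    public static void main(String[] args) {
        LockableInstance inst = new LockableInstance("original");
        inst.lock();
        try {
            inst.setData("changed");        // refused while locked
        } catch (IllegalStateException e) {
            System.out.println("blocked");
        }
        inst.unLock();
        inst.setData("changed");            // allowed after unLock
        System.out.println(inst.getData()); // prints "changed"
    }
}
```

Both examples below rely on exactly this behavior: they unLock an instance before calling a setter on it.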
Example 1: convolution
import cc.mallet.types.Instance; // import the package/class this method depends on
/**
 * Construct word co-occurrence features from the original sequence.
 * Does the combinatoric n choose 2; can be extended to n choose 3.
public void convolution() {
    int fi = -1;
    int pre = -1;
    int i, j;
    int curLen = length;
    for (i = 0; i < curLen - 1; i++) {
        for (j = i + 1; j < curLen; j++) {
            pre = features[i];
            fi = features[j];
            Object preO = dictionary.lookupObject(pre);
            Object curO = dictionary.lookupObject(fi);
            Object coO = preO.toString() + "_" + curO.toString();
            add(coO);
        }
    }
}*/
public Instance pipe (Instance carrier)
{
    FeatureSequence fseq = (FeatureSequence) carrier.getData();
    FeatureSequence ret =
        new FeatureSequence ((Alphabet) getDataAlphabet());
    int i, j, curLen;
    curLen = fseq.getLength();
    // first, add fseq to ret
    for (i = 0; i < curLen; i++) {
        ret.add(fseq.getObjectAtPosition(i));
    }
    // second, add word co-occurrence features
    int pre, cur;
    Object coO;
    for (i = 0; i < curLen - 1; i++) {
        for (j = i + 1; j < curLen; j++) {
            pre = fseq.getIndexAtPosition(i);
            cur = fseq.getIndexAtPosition(j);
            coO = pre + "_" + cur;
            ret.add(coO);
        }
    }
    if (carrier.isLocked()) {
        carrier.unLock();
    }
    carrier.setData(ret);
    return carrier;
}
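The nested loop above enumerates every unordered index pair (i, j) with i < j and joins the two feature indices with an underscore. That pairing step can be sketched in isolation as plain Java, without any MALLET dependency (the class and method names below are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class CooccurrencePairs {
    // Build "i_j" co-occurrence tokens for all index pairs with i < j,
    // mirroring the nested loop in the pipe method above.
    public static List<String> pairFeatures(int[] indices) {
        List<String> pairs = new ArrayList<>();
        for (int i = 0; i < indices.length - 1; i++) {
            for (int j = i + 1; j < indices.length; j++) {
                pairs.add(indices[i] + "_" + indices[j]);
            }
        }
        return pairs;
    }

    public static void main(String[] args) {
        // n choose 2 = 3 pairs for a three-feature sequence
        System.out.println(pairFeatures(new int[]{4, 7, 9}));
        // prints [4_7, 4_9, 7_9]
    }
}
```

Note that for a sequence of length n this produces n(n-1)/2 extra features, so the pipe can grow the feature set quadratically on long sequences.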
Example 2: train
import cc.mallet.types.Instance; // import the package/class this method depends on
public NaiveBayes train (InstanceList trainingSet)
{
    // Get a classifier trained on the labeled examples only
    NaiveBayes c = (NaiveBayes) nbTrainer.newClassifierTrainer().train (trainingSet);
    double prevLogLikelihood = 0, logLikelihood = 0;
    boolean converged = false;
    int iteration = 0;
    while (!converged) {
        // Make a new trainingSet that has some labels set
        InstanceList trainingSet2 = new InstanceList (trainingSet.getPipe());
        for (int ii = 0; ii < trainingSet.size(); ii++) {
            Instance inst = trainingSet.get(ii);
            if (inst.getLabeling() != null)
                trainingSet2.add(inst, 1.0);
            else {
                Instance inst2 = inst.shallowCopy();
                inst2.unLock();
                inst2.setLabeling(c.classify(inst).getLabeling());
                inst2.lock();
                trainingSet2.add(inst2, unlabeledDataWeight);
            }
        }
        c = (NaiveBayes) nbTrainer.newClassifierTrainer().train (trainingSet2);
        logLikelihood = c.dataLogLikelihood (trainingSet2);
        System.err.println ("Loglikelihood = " + logLikelihood);
        // Wait for a change in log-likelihood of less than 0.01% and at least 10 iterations
        if (Math.abs((logLikelihood - prevLogLikelihood) / logLikelihood) < 0.0001)
            converged = true;
        prevLogLikelihood = logLikelihood;
        iteration++;
    }
    return c;
}
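The stopping rule in this self-training loop terminates when the relative change in data log-likelihood falls below 0.01%. That check in isolation can be sketched as follows (a standalone illustration with made-up values, not MALLET code):

```java
public class ConvergenceCheck {
    // Relative-change convergence test mirroring the loop above: stop when
    // |(L - L_prev) / L| < 1e-4, i.e., less than a 0.01% change.
    public static boolean converged(double prevLogLikelihood, double logLikelihood) {
        return Math.abs((logLikelihood - prevLogLikelihood) / logLikelihood) < 0.0001;
    }

    public static void main(String[] args) {
        System.out.println(converged(-1000.0, -999.95)); // ~0.005% change -> true
        System.out.println(converged(-1000.0, -990.0));  // ~1% change    -> false
    }
}
```

One caveat worth knowing: on the very first iteration prevLogLikelihood is 0, so this relative test can behave oddly until a real previous value has been recorded, which is presumably why the comment in the source mentions running at least a few iterations.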