

Java InfoGainAttributeEval.buildEvaluator Method Code Examples

This article collects typical usage examples of the Java method weka.attributeSelection.InfoGainAttributeEval.buildEvaluator. If you are wondering what InfoGainAttributeEval.buildEvaluator does, how to call it, or what it looks like in real code, the curated examples below may help. You can also browse further usage examples of the enclosing class, weka.attributeSelection.InfoGainAttributeEval.


Three code examples of the InfoGainAttributeEval.buildEvaluator method are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
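Before the project examples, here is a minimal, self-contained sketch of the typical call sequence: load a dataset, set its class attribute, build the evaluator, and query the information gain of each attribute. The ARFF path is a placeholder, and the class name is made up for illustration; this sketch is not taken from any of the projects below.

import weka.attributeSelection.InfoGainAttributeEval;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class InfoGainSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder path: replace with your own dataset.
        Instances data = DataSource.read("dataset.arff");
        // buildEvaluator needs a class attribute to compute information gain against;
        // here we assume the class is the last attribute.
        data.setClassIndex(data.numAttributes() - 1);

        InfoGainAttributeEval eval = new InfoGainAttributeEval();
        eval.buildEvaluator(data);

        // Print the information gain of every non-class attribute.
        for (int i = 0; i < data.numAttributes(); i++) {
            if (i == data.classIndex()) continue;
            System.out.println(data.attribute(i).name() + "\t" + eval.evaluateAttribute(i));
        }
    }
}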

Example 1: InformationGain

import weka.attributeSelection.InfoGainAttributeEval; // import the package/class this method depends on
public InformationGain(Instances dataset) throws Exception {
	data = dataset;
	// Build the information-gain evaluator on the full dataset.
	infoG = new InfoGainAttributeEval();
	infoG.buildEvaluator(data);
	// Prepare an attribute-selection filter and a Ranker search that keeps
	// every attribute whose information gain is above the 0.0 threshold.
	filter = new AttributeSelection();
	search = new Ranker();
	search.setThreshold(0.0);
}
 
Developer ID: a-n-d-r-e-i, Project: seagull, Lines of code: 9, Source file: InformationGain.java
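Example 1 only constructs the evaluator, filter, and Ranker; the sketch below shows how these pieces are typically wired together with Weka's supervised AttributeSelection filter. The field names (data, infoG, filter, search) mirror Example 1, but applying the filter this way is an assumption about how the class is meant to be used, not code from the seagull project.

// Hypothetical follow-up inside the InformationGain class from Example 1.
filter.setEvaluator(infoG);   // use the information-gain evaluator built above
filter.setSearch(search);     // Ranker with threshold 0.0 drops attributes with no gain
filter.setInputFormat(data);
Instances reduced = weka.filters.Filter.useFilter(data, filter);
System.out.println("Attributes kept: " + reduced.numAttributes());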

Example 2: classify

import weka.attributeSelection.InfoGainAttributeEval; // import the package/class this method depends on
public void classify(String trainingFile,String testingFile) {

        try {
            initTrainingSet(trainingFile);
            initTestingSet(testingFile);



            // Train a J48 decision tree (the original comment said NaiveBayes) on the training set.
            J48 cModel = new J48();
            cModel.buildClassifier(TrainingSet);

            // Classify the test instances in order; the previous instance's predicted
            // class label is written into the first attribute of the current instance
            // before it is classified.
            Instance current;
            double pred = 0;
            for (int i = 0; i < TestingSet.numInstances(); i++) {
                current = TestingSet.get(i);
                String prevLabel = featureVectorClassValues.get((int) pred);
                if (prevLabel.equalsIgnoreCase("Statement")
                        || prevLabel.equalsIgnoreCase("Backchannel Question")
                        || prevLabel.equalsIgnoreCase("Yes-No Question")
                        || prevLabel.equalsIgnoreCase("Open Question")) {
                    current.setValue(featureVectorAttributes.get(0), prevLabel);
                    System.out.println(pred + "  :  " + prevLabel);
                    System.out.println(current.toString());
                }
                pred = cModel.classifyInstance(current);
            }

//            J48 cModel = new J48();
//            cModel.setUnpruned(true);
//            cModel.buildClassifier(TrainingSet);

            Evaluation eTest = new Evaluation(TrainingSet);
            eTest.evaluateModel(cModel, TestingSet);


            //print out the results
            System.out.println("=====================================================================");
            System.out.println("Results for "+this.getClass().getSimpleName());
            String strSummary = eTest.toSummaryString();
            System.out.println(strSummary);

            System.out.println("F-measure : "+eTest.weightedFMeasure());
            System.out.println("precision : "+eTest.weightedPrecision());
            System.out.println("recall : "+eTest.weightedRecall());
            System.out.println("=====================================================================");


            // Rank the attributes of the training set by information gain.
            InfoGainAttributeEval infoGainAttributeEval = new InfoGainAttributeEval();
            infoGainAttributeEval.buildEvaluator(TrainingSet);

            // Print each attribute name with its information-gain score (the last attribute is skipped).
            for (int i = 0; i < featureVectorAttributes.size() - 1; i++) {
                double v = infoGainAttributeEval.evaluateAttribute(i);
                System.out.print(featureVectorAttributes.get(i).name() + "\t\t");
                System.out.println(v);
            }

        } catch (Exception e) {
            e.printStackTrace();
        }

    }
 
Developer ID: catchsudheera, Project: sanwada, Lines of code: 62, Source file: FeatureSet05.java
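Examples 2 and 3 print information-gain scores in attribute order. If a ranking sorted by score is preferred, the evaluator can instead be combined with a Ranker through Weka's weka.attributeSelection.AttributeSelection helper, as in the sketch below. The variable TrainingSet stands in for the training data used above; this is an illustrative alternative, not code from the sanwada project.

// Hypothetical alternative: let Weka sort the attributes by information gain.
weka.attributeSelection.AttributeSelection selector = new weka.attributeSelection.AttributeSelection();
selector.setEvaluator(new InfoGainAttributeEval());
selector.setSearch(new weka.attributeSelection.Ranker());
selector.SelectAttributes(TrainingSet);

// rankedAttributes() returns rows of {attribute index, merit}, best first.
double[][] ranked = selector.rankedAttributes();
for (double[] row : ranked) {
    System.out.println(TrainingSet.attribute((int) row[0]).name() + "\t" + row[1]);
}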

Example 3: classify

import weka.attributeSelection.InfoGainAttributeEval; // import the package/class this method depends on
public void classify(String trainingFile,String testingFile) {

        try {

           // initiateBagOfWords(trainingFile);
            initTrainingSet(trainingFile);

           // initiateBagOfWords(testingFile);
            initTestingSet(testingFile);

            // Restrict a StringToWordVector filter to a single string attribute and
            // turn its text into TF-transformed word counts, keeping only a few words.
            StringToWordVector filter = new StringToWordVector();
            int[] indices = new int[1];
            indices[0] = 6;
            filter.setAttributeIndicesArray(indices);
            filter.setInputFormat(TrainingSet);
            filter.setWordsToKeep(6);
            filter.setDoNotOperateOnPerClassBasis(false);
            filter.setTFTransform(true);
            filter.setOutputWordCounts(true);

            // Apply the same filter to both the training and the testing set.
            TrainingSet = Filter.useFilter(TrainingSet, filter);
            TestingSet = Filter.useFilter(TestingSet, filter);



            // Train a SimpleLogistic classifier and serialize both the model and the filtered test set.
            Classifier cModel = new SimpleLogistic();
            cModel.buildClassifier(TrainingSet);

            weka.core.SerializationHelper.write(System.getProperty("user.dir")+"/Classification/src/datafiles/cls.model", cModel);
            weka.core.SerializationHelper.write(System.getProperty("user.dir")+"/Classification/src/datafiles/testingSet.model", TestingSet);

            Evaluation eTest = new Evaluation(TrainingSet);
            eTest.evaluateModel(cModel, TestingSet);


            //print out the results
            System.out.println("=====================================================================");
            System.out.println("Results for "+this.getClass().getSimpleName());
            String strSummary = eTest.toSummaryString();
            System.out.println(strSummary);

            // Rank the attributes of the filtered training set by information gain.
            InfoGainAttributeEval infoGainAttributeEval = new InfoGainAttributeEval();
            infoGainAttributeEval.buildEvaluator(TrainingSet);

            // Print index, attribute name, and information-gain score for every attribute except the last.
            for (int i = 0; i < featureVectorAttributes.size() - 1; i++) {
                double v = infoGainAttributeEval.evaluateAttribute(i);
                System.out.print(i + " " + featureVectorAttributes.get(i).name() + "\t\t");
                System.out.println(v);
            }

            System.out.println("=====================================================================");

            System.out.println("recall : "+eTest.weightedRecall());
            System.out.println("precision : "+eTest.weightedPrecision());
            System.out.println("F-measure : "+eTest.weightedFMeasure());

            System.out.println("================= Rounded Values =========================");

            System.out.println("recall : "+Math.round(eTest.weightedRecall() * 100.0) / 100.0);
            System.out.println("precision : "+Math.round(eTest.weightedPrecision() * 100.0) / 100.0);
            System.out.println("F-measure : "+Math.round(eTest.weightedFMeasure() * 100.0) / 100.0);
            System.out.println("=====================================================================");

            printErrors(cModel);


        } catch (Exception e) {
            e.printStackTrace();
        }

    }
 
Developer ID: catchsudheera, Project: sanwada, Lines of code: 72, Source file: FeatureSetAll.java
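Example 3 serializes the trained classifier and the filtered test set with SerializationHelper.write; the sketch below shows how they would typically be loaded back and reused. The file paths mirror those in the example, but the surrounding code and variable names are hypothetical, not part of the sanwada project.

// Hypothetical counterpart to Example 3: reload the serialized model and test set.
String base = System.getProperty("user.dir") + "/Classification/src/datafiles/";
Classifier model = (Classifier) weka.core.SerializationHelper.read(base + "cls.model");
Instances testSet = (Instances) weka.core.SerializationHelper.read(base + "testingSet.model");
if (testSet.classIndex() < 0) {
    testSet.setClassIndex(testSet.numAttributes() - 1); // assumption: class is the last attribute
}

// Classify each reloaded instance and print the predicted class label.
for (int i = 0; i < testSet.numInstances(); i++) {
    double pred = model.classifyInstance(testSet.instance(i));
    System.out.println(i + "\t" + testSet.classAttribute().value((int) pred));
}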


Note: The weka.attributeSelection.InfoGainAttributeEval.buildEvaluator examples in this article were collected by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The code snippets are selected from open-source projects contributed by various developers; copyright remains with the original authors, and distribution or use should follow the corresponding project's license. Do not reproduce without permission.