

Java J48.buildClassifier Method Code Examples

This article collects typical usage examples of the Java method weka.classifiers.trees.J48.buildClassifier. If you are wondering how J48.buildClassifier is called in practice, what it is used for, or what working examples look like, the curated code samples below should help. You can also explore further usage examples of the enclosing class, weka.classifiers.trees.J48.


The following presents 15 code examples of J48.buildClassifier, ordered by popularity by default.
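Before the collected examples, a minimal, self-contained sketch of the typical call sequence may help: load an ARFF file, set the class attribute, call buildClassifier, and then use the trained tree. This sketch is not taken from any of the projects below; the file path data/weather.arff and the class name J48QuickStart are placeholder assumptions.

import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class J48QuickStart {
    public static void main(String[] args) throws Exception {
        // Load the dataset (placeholder path) and mark the last attribute as the class.
        Instances data = DataSource.read("data/weather.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Train a C4.5 decision tree on the full dataset.
        J48 tree = new J48();
        tree.buildClassifier(data);

        // Classify one instance (resubstitution, for illustration only).
        double label = tree.classifyInstance(data.instance(0));
        System.out.println("Predicted class: " + data.classAttribute().value((int) label));

        // Estimate performance with 10-fold cross-validation, as most examples below do.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(new J48(), data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}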

Example 1: performTraining

import weka.classifiers.trees.J48; // import the package/class the method depends on
private J48 performTraining() {
    J48 j48 = new J48();
    String[] options = {"-U"}; // -U: use an unpruned tree
    try {
        j48.setOptions(options);
        j48.buildClassifier(trainingData);
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return j48;
}
 
Developer ID: PacktPublishing, Project: Java-Data-Science-Made-Easy, Lines of code: 13, Source: BookDecisionTree.java
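As a usage note on the "-U" flag above: the same configuration can be expressed through J48's setter API instead of the options array (the setter also appears in Example 14 below). The following is a sketch of the equivalent call, not part of the original snippet; trainingData is assumed to be an Instances object with its class index already set.

// Equivalent to setOptions(new String[]{"-U"}): build an unpruned C4.5 tree via the setter API.
J48 unprunedTree = new J48();
unprunedTree.setUnpruned(true);
unprunedTree.buildClassifier(trainingData);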

Example 2: getEvalResultbySMOTE

import weka.classifiers.trees.J48; // import the package/class the method depends on
/**
 * <p>Runs 10-fold cross-validation on the single ARFF file at <b>path</b>.</p>
 * <p>Uses C4.5 together with <b>SMOTE</b> to classify the dataset.</p>
 * @param path dataset path
 * @param index row of the shared results array to fill
 * @throws Exception
 */
public static void getEvalResultbySMOTE(String path, int index) throws Exception {

    Instances ins = DataSource.read(path);
    int numAttr = ins.numAttributes();
    ins.setClassIndex(numAttr - 1);

    SMOTE smote = new SMOTE();
    smote.setInputFormat(ins);

    /* classifier settings */
    J48 j48 = new J48();
//  j48.setConfidenceFactor(0.4f);
    j48.buildClassifier(ins);

    // Wrap SMOTE in a FilteredClassifier so oversampling is applied inside each training fold.
    FilteredClassifier fc = new FilteredClassifier();
    fc.setClassifier(j48);
    fc.setFilter(smote);

    Evaluation eval = new Evaluation(ins);
    eval.crossValidateModel(fc, ins, 10, new Random(1));

//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//  System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
    results[index][0] = eval.precision(0);
    results[index][1] = eval.recall(0);
    results[index][2] = eval.fMeasure(0);
    results[index][3] = eval.precision(1);
    results[index][4] = eval.recall(1);
    results[index][5] = eval.fMeasure(1);
    results[index][6] = 1 - eval.errorRate();
}
 
Developer ID: Gu-Youngfeng, Project: CraTer, Lines of code: 40, Source: ImbalanceProcessingAve.java

Example 3: run

import weka.classifiers.trees.J48; // import the package/class the method depends on
public void run() throws Exception {
    // data
    ArffLoader arffLoader = new ArffLoader();
    arffLoader.setSource(new File("/home/bhill/apps/weka/weka-3-6-11/data/soybean.arff"));
    Instances data = arffLoader.getDataSet();
    data.setClassIndex(data.numAttributes()-1);

    J48 j48 = new J48();
    j48.buildClassifier(data);



    // j48.m_root.m_sons

    System.out.println();
    // J48.train()
    // check for nodes
}
 
Developer ID: williamClanton, Project: jbossBA, Lines of code: 19, Source: Main.java

Example 4: main

import weka.classifiers.trees.J48; // import the package/class the method depends on
public static void main(String[] args) throws Exception{
	
	String databasePath = "data/features.arff";
	
	// Load the data in arff format
	Instances data = new Instances(new BufferedReader(new FileReader(databasePath)));
	
	// Set the last attribute as the class attribute
	data.setClassIndex(data.numAttributes() - 1);

	// Build a basic decision tree model
	String[] options = new String[]{};
	J48 model = new J48();
	model.setOptions(options);
	model.buildClassifier(data);
	
	// Output decision tree
	System.out.println("Decision tree model:\n"+model);
	
	// Output source code implementing the decision tree
	System.out.println("Source code:\n"+model.toSource("ActivityRecognitionEngine"));
	
	// Check accuracy of model using 10-fold cross-validation
	Evaluation eval = new Evaluation(data);
	eval.crossValidateModel(model, data, 10, new Random(1), new String[] {});
	System.out.println("Model performance:\n"+eval.toSummaryString());
	
	String[] activities = new String[]{"Walk", "Walk", "Walk", "Run", "Walk", "Run", "Run", "Sit", "Sit", "Sit"};
	DiscreteLowPass dlpFilter = new DiscreteLowPass(3);
	for(String str : activities){
		System.out.println(str +" -> "+ dlpFilter.filter(str));
	}
	
}
 
Developer ID: PacktPublishing, Project: Machine-Learning-End-to-Endguide-for-Java-developers, Lines of code: 35, Source: ActivityRecognition.java

Example 5: getEvalResultbyNo

import weka.classifiers.trees.J48; // import the package/class the method depends on
/**
 * <p>Runs 10-fold cross-validation on the single ARFF file at <b>path</b>.</p>
 * <p>Uses plain C4.5 to classify the dataset.</p>
 * @param path dataset path
 * @param index row of the shared results array to fill
 * @throws Exception
 */
public static void getEvalResultbyNo(String path, int index) throws Exception {

    Instances ins = DataSource.read(path);
    int numAttr = ins.numAttributes();
    ins.setClassIndex(numAttr - 1);

    /* classifier settings */
    J48 j48 = new J48();
//  j48.setConfidenceFactor(0.4f);
    j48.buildClassifier(ins);

    Evaluation eval = new Evaluation(ins);
    eval.crossValidateModel(j48, ins, 10, new Random(1));

//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//  System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
    results[index][0] = eval.precision(0);
    results[index][1] = eval.recall(0);
    results[index][2] = eval.fMeasure(0);
    results[index][3] = eval.precision(1);
    results[index][4] = eval.recall(1);
    results[index][5] = eval.fMeasure(1);
    results[index][6] = 1 - eval.errorRate();
}
 
Developer ID: Gu-Youngfeng, Project: CraTer, Lines of code: 33, Source: ImbalanceProcessingAve.java

Example 6: getEvalResultbyResampling

import weka.classifiers.trees.J48; // import the package/class the method depends on
/**
 * <p>Runs 10-fold cross-validation on the single ARFF file at <b>path</b>.</p>
 * <p>Uses C4.5 together with <b>resampling</b> to classify the dataset.</p>
 * @param path dataset path
 * @param index row of the shared results array to fill
 * @throws Exception
 */
public static void getEvalResultbyResampling(String path, int index) throws Exception {

    Instances ins = DataSource.read(path);
    int numAttr = ins.numAttributes();
    ins.setClassIndex(numAttr - 1);

    Resample resample = new Resample();
    resample.setInputFormat(ins);

    /* classifier settings */
    J48 j48 = new J48();
//  j48.setConfidenceFactor(0.4f);
    j48.buildClassifier(ins);

    // Wrap the Resample filter in a FilteredClassifier so resampling is applied inside each training fold.
    FilteredClassifier fc = new FilteredClassifier();
    fc.setClassifier(j48);
    fc.setFilter(resample);

    Evaluation eval = new Evaluation(ins);
    eval.crossValidateModel(fc, ins, 10, new Random(1));

//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//  System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
    results[index][0] = eval.precision(0);
    results[index][1] = eval.recall(0);
    results[index][2] = eval.fMeasure(0);
    results[index][3] = eval.precision(1);
    results[index][4] = eval.recall(1);
    results[index][5] = eval.fMeasure(1);
    results[index][6] = 1 - eval.errorRate();
}
 
Developer ID: Gu-Youngfeng, Project: CraTer, Lines of code: 40, Source: ImbalanceProcessingAve.java

Example 7: getEvalResultbyCost

import weka.classifiers.trees.J48; // import the package/class the method depends on
/**
 * <p>Runs 10-fold cross-validation on the single ARFF file at <b>path</b>.</p>
 * <p>Uses C4.5 together with <b>cost-sensitive learning</b> to classify the dataset.</p>
 * @param path dataset path
 * @param index row of the shared results array to fill
 * @throws Exception
 */
public static void getEvalResultbyCost(String path, int index) throws Exception {

    Instances ins = DataSource.read(path);
    int numAttr = ins.numAttributes();
    ins.setClassIndex(numAttr - 1);

    /* classifier settings */
    J48 j48 = new J48();
//  j48.setConfidenceFactor(0.4f);
    j48.buildClassifier(ins);

    // Wrap J48 in a cost-sensitive meta-classifier; the cost matrix is read from a file.
    CostSensitiveClassifier csc = new CostSensitiveClassifier();
    csc.setClassifier(j48);
    csc.setCostMatrix(new CostMatrix(new BufferedReader(new FileReader("files/costm"))));

    Evaluation eval = new Evaluation(ins);
    eval.crossValidateModel(csc, ins, 10, new Random(1));

//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//  System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
    results[index][0] = eval.precision(0);
    results[index][1] = eval.recall(0);
    results[index][2] = eval.fMeasure(0);
    results[index][3] = eval.precision(1);
    results[index][4] = eval.recall(1);
    results[index][5] = eval.fMeasure(1);
    results[index][6] = 1 - eval.errorRate();
}
 
Developer ID: Gu-Youngfeng, Project: CraTer, Lines of code: 38, Source: ImbalanceProcessingAve.java

Example 8: getEvalResultbyDefault

import weka.classifiers.trees.J48; // import the package/class the method depends on
/**
 * <p>Runs 10-fold cross-validation on the single ARFF file at <b>path</b>.</p>
 * <p>Uses C4.5 together with <b>SMOTE</b> to classify the dataset.</p>
 * @param path dataset path
 * @param index row of the shared results array to fill
 * @throws Exception
 */
public static void getEvalResultbyDefault(String path, int index) throws Exception {

    Instances ins = DataSource.read(path);
    int numAttr = ins.numAttributes();
    ins.setClassIndex(numAttr - 1);

    SMOTE smote = new SMOTE();
    smote.setInputFormat(ins);

    /* classifier settings */
    J48 j48 = new J48();
//  j48.setConfidenceFactor(0.4f);
    j48.buildClassifier(ins);

    // Wrap SMOTE in a FilteredClassifier so oversampling is applied inside each training fold.
    FilteredClassifier fc = new FilteredClassifier();
    fc.setClassifier(j48);
    fc.setFilter(smote);

    Evaluation eval = new Evaluation(ins);
    eval.crossValidateModel(fc, ins, 10, new Random(1));

//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//  System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
    results[index][0] = eval.precision(0);
    results[index][1] = eval.recall(0);
    results[index][2] = eval.fMeasure(0);
    results[index][3] = eval.precision(1);
    results[index][4] = eval.recall(1);
    results[index][5] = eval.fMeasure(1);
    results[index][6] = 1 - eval.errorRate();
}
 
Developer ID: Gu-Youngfeng, Project: CraTer, Lines of code: 40, Source: FeatureSelectionAve.java

Example 9: trainJ48BySet

import weka.classifiers.trees.J48; // import the package/class the method depends on
public static void trainJ48BySet(final Instances trainingSet) throws Exception {
        // Create a classifier
        final J48 tree = new J48();
        tree.setMinNumObj(1);
        //tree.setConfidenceFactor(0.5f);
        tree.setReducedErrorPruning(true);
        //tree.setDebug(true);
        //
        tree.buildClassifier(trainingSet);
//        ClassifierToJs.saveModel(tree, GitHubPublisher.localPath + RetailSalePrediction.predict_retail_sales + File.separator + "prediction_set_script.model");

        // Test the model
        final Evaluation eval = new Evaluation(trainingSet);
        eval.crossValidateModel(tree, trainingSet, 10, new Random(1));
        //eval.evaluateModel(tree, trainingSet);

        // Print the result à la Weka explorer:
        logger.info(eval.toSummaryString());
//        FileUtils.writeStringToFile(new File(GitHubPublisher.localPath + RetailSalePrediction.predict_retail_sales + File.separator + "prediction_set_summary.txt"), eval.toSummaryString());

//        try {
//            final File file = new File(GitHubPublisher.localPath + RetailSalePrediction.predict_retail_sales + File.separator + "prediction_set_script.js");
//            FileUtils.writeStringToFile(file, ClassifierToJs.compress(ClassifierToJs.toSource(tree, "predictCommonBySet")), "UTF-8");
//        } catch (final Exception e) {
//            logger.error(e.getLocalizedMessage(), e);
//        }
    }
 
Developer ID: cobr123, Project: VirtaMarketAnalyzer, Lines of code: 28, Source: RetailSalePrediction.java

Example 10: trainJ48CrossValidation

import weka.classifiers.trees.J48; // import the package/class the method depends on
public static void trainJ48CrossValidation(final Instances trainingSet) throws Exception {
        // Create a classifier
        final J48 tree = new J48();
        tree.setMinNumObj(1);
        //tree.setConfidenceFactor(0.5f);
        tree.setReducedErrorPruning(true);
//        tree.setDebug(true);

        //evaluate j48 with cross validation
        final Evaluation eval = new Evaluation(trainingSet);

        //first supply the classifier
        //then the training data
        //number of folds
        //random seed
        eval.crossValidateModel(tree, trainingSet, 10, new Random(new Date().getTime()));
        logger.info(eval.toSummaryString());
        Utils.writeFile(GitHubPublisher.localPath + RetailSalePrediction.predict_retail_sales + File.separator + "prediction_cv_summary.txt", eval.toSummaryString());

        tree.buildClassifier(trainingSet);
//                logger.info(tree.graph());

//        try {
//            final File file = new File(GitHubPublisher.localPath + RetailSalePrediction.predict_retail_sales + File.separator + "prediction_cv_script.js");
//            FileUtils.writeStringToFile(file, ClassifierToJs.compress(ClassifierToJs.toSource(tree, "predictCommonByCV")), "UTF-8");
//        } catch (final Exception e) {
//            logger.error(e.getLocalizedMessage(), e);
//        }
    }
 
Developer ID: cobr123, Project: VirtaMarketAnalyzer, Lines of code: 30, Source: RetailSalePrediction.java

Example 11: train

import weka.classifiers.trees.J48; // import the package/class the method depends on
@Override
public Classifier train(Instances instances) {
    J48 j48 = new J48();
    j48.setReducedErrorPruning(reducedErrorPruning);
    try {
        j48.buildClassifier(instances);
    } catch (Exception e) {
        throw new ClassifierBuildingException("Exception occurred while building classifier: " + e.getMessage(), e);
    }
    return j48;
}
 
Developer ID: NLeSC, Project: eEcology-Classification, Lines of code: 12, Source: J48Trainer.java

Example 12: getDummyTreeLearner

import weka.classifiers.trees.J48; // import the package/class the method depends on
private J48 getDummyTreeLearner() {
    J48 j48 = new J48() {
        private static final long serialVersionUID = 1L;

        @Override
        public void buildClassifier(Instances data) throws Exception {
        }
    };
    try {
        j48.buildClassifier(null);
    } catch (Exception e) {
        throw new RuntimeException(e); // TODO
    }
    return j48;
}
 
Developer ID: NLeSC, Project: eEcology-Classification, Lines of code: 16, Source: ClassifierDescriptionSaverTest.java

Example 13: crossValidation

import weka.classifiers.trees.J48; // import the package/class the method depends on
public void crossValidation(String file, ArrayList<String> keys) {
	try {
		// load the file
		loadTrainFile(file);

		// build the classifier
		String[] options = { "-C", "0.1", "-M", "30" };

		J48 tree = new J48(); // new instance of tree
		tree.setOptions(options); // set the options
		tree.buildClassifier(train); // build classifier

		// evaluate the classifier
		Evaluation eval = new Evaluation(train);
		eval.crossValidateModel(tree, train, 10, new Random(1));

		// print the results
		Logger.log(LogLevel.Classification, eval.toSummaryString());
		double[][] matrix = eval.confusionMatrix();

		for (int i = 0; i < matrix.length; i++) {
			double[] line = matrix[i];
			for (int j = 0; j < line.length; j++) {
				System.out.print((int) line[j] + "\t");
			}
			System.out.println();
		}

		printLatex(keys, matrix);
	} catch (Exception e) {
		Logger.log(LogLevel.Error, e.toString());
	}
}
 
Developer ID: mbraeunlein, Project: ExtendedHodoku, Lines of code: 34, Source: Analyzer.java

Example 14: classify

import weka.classifiers.trees.J48; // import the package/class the method depends on
public void classify(String trainingFile,String testingFile) {

        try {
            initTrainingSet(trainingFile);
            initTestingSet(testingFile);

            J48 cModel = new J48();
            cModel.setUnpruned(true);
            cModel.buildClassifier(TrainingSet);

            Evaluation eTest = new Evaluation(TrainingSet);
            eTest.evaluateModel(cModel, TestingSet);


            //print out the results
            System.out.println("=====================================================================");
            System.out.println("Results for "+this.getClass().getSimpleName());
            String strSummary = eTest.toSummaryString();
            System.out.println(strSummary);

            System.out.println("F-measure : "+eTest.weightedFMeasure());
            System.out.println("precision : "+eTest.weightedPrecision());
            System.out.println("recall : "+eTest.weightedRecall());
            System.out.println("=====================================================================");


        } catch (Exception e) {
            e.printStackTrace();
        }

    }
 
Developer ID: catchsudheera, Project: sanwada, Lines of code: 32, Source: FeatureSet01.java

Example 15: getEvalResultbyChiSquare

import weka.classifiers.trees.J48; // import the package/class the method depends on
/**
 * <p>Runs 10-fold cross-validation on the single ARFF file at <b>path</b>.</p>
 * <p>Uses C4.5 and <b>SMOTE</b>, combined with <b>chi-square</b> attribute selection, to classify the dataset.</p>
 * @param path dataset path
 * @param index row of the shared results array to fill
 * @throws Exception
 */
public static void getEvalResultbyChiSquare(String path, int index) throws Exception {

    Instances ins = DataSource.read(path);
    int numAttr = ins.numAttributes();
    ins.setClassIndex(numAttr - 1);

    /* apply the chi-squared attribute-selection filter to the whole dataset first */
    ChiSquaredAttributeEval evall = new ChiSquaredAttributeEval();
    Ranker ranker = new Ranker();
    AttributeSelection selector = new AttributeSelection();

    selector.setEvaluator(evall);
    selector.setSearch(ranker);
    selector.setInputFormat(ins);
    ins = Filter.useFilter(ins, selector);

    SMOTE smote = new SMOTE();
    smote.setInputFormat(ins);

    /* classifier settings */
    J48 j48 = new J48();
//  j48.setConfidenceFactor(0.4f);
    j48.buildClassifier(ins);

    // Wrap SMOTE in a FilteredClassifier so oversampling is applied inside each training fold.
    FilteredClassifier fc = new FilteredClassifier();
    fc.setClassifier(j48);
    fc.setFilter(smote);

    Evaluation eval = new Evaluation(ins);
    eval.crossValidateModel(fc, ins, 10, new Random(1));

//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//  System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//  System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
    results[index][0] = eval.precision(0);
    results[index][1] = eval.recall(0);
    results[index][2] = eval.fMeasure(0);
    results[index][3] = eval.precision(1);
    results[index][4] = eval.recall(1);
    results[index][5] = eval.fMeasure(1);
    results[index][6] = 1 - eval.errorRate();
}
 
Developer ID: Gu-Youngfeng, Project: CraTer, Lines of code: 50, Source: FeatureSelectionAve.java


Note: The weka.classifiers.trees.J48.buildClassifier examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The code snippets were selected from open-source projects contributed by their respective developers; copyright of the source code remains with the original authors. For distribution and use, please refer to the license of the corresponding project. Do not reproduce without permission.