

Java RbfKernel Class Code Examples

This article collects typical usage examples of the Java class it.uniroma2.sag.kelp.kernel.standard.RbfKernel, compiled from open-source projects. If you are wondering what the RbfKernel class does, how to use it, or are looking for concrete examples, the curated code below should help.


The RbfKernel class belongs to the it.uniroma2.sag.kelp.kernel.standard package. Two code examples are shown below, ordered by popularity.
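As background for the examples below: an RBF kernel wrapped around a base kernel K computes exp(-gamma * (K(x,x) - 2*K(x,y) + K(y,y))), i.e. it replaces the usual squared Euclidean distance with the distance induced by the base kernel. Here is a minimal, library-free sketch of that computation in plain Java; it illustrates the math only, and KeLP's actual RbfKernel implementation may differ in details:

```java
public class RbfSketch {
    // Base kernel: a plain dot product, analogous to a linear kernel
    static double linear(double[] x, double[] y) {
        double s = 0;
        for (int i = 0; i < x.length; i++) s += x[i] * y[i];
        return s;
    }

    // RBF on top of the base kernel: exp(-gamma * ||x - y||^2), where the
    // squared distance is expanded via the base kernel
    static double rbf(double gamma, double[] x, double[] y) {
        double sqDist = linear(x, x) - 2 * linear(x, y) + linear(y, y);
        return Math.exp(-gamma * sqDist);
    }

    public static void main(String[] args) {
        double[] a = {1.0, 0.0};
        double[] b = {0.0, 1.0};
        System.out.println(rbf(1.0, a, a)); // identical vectors -> 1.0
        System.out.println(rbf(1.0, a, b)); // exp(-2), about 0.1353
    }
}
```

Note that identical inputs always yield 1.0, and the value decays toward 0 as the base-kernel-induced distance grows; gamma controls how fast.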

Example 1: main

import it.uniroma2.sag.kelp.kernel.standard.RbfKernel; // import the required class
public static void main(String[] args) throws Exception {
	// The epsilon in the loss function of the regressor
	float pReg = 0.1f;
	// The regularization parameter of the regressor
	float c = 2f;
	// The gamma parameter in the RBF kernel
	float gamma = 1f;

	// The label indicating the value considered by the regressor
	Label label = new StringLabel("r");

	// Load the dataset
	SimpleDataset dataset = new SimpleDataset();
	dataset.populate("src/main/resources/sv_regression_test/mg_scale.klp");
	// Split the dataset into train and test sets
	dataset.shuffleExamples(new Random(0));
	SimpleDataset[] split = dataset.split(0.7f);
	SimpleDataset trainDataset = split[0];
	SimpleDataset testDataset = split[1];

	// Kernel for the first representation (0-index)
	Kernel linear = new LinearKernel("0");
	// Applying the RBF kernel
	Kernel rbf = new RbfKernel(gamma, linear);
	// Applying a cache
	FixIndexKernelCache kernelCache = new FixIndexKernelCache(
			trainDataset.getNumberOfExamples());
	rbf.setKernelCache(kernelCache);

	// instantiate the regressor
	EpsilonSvmRegression regression = new EpsilonSvmRegression(rbf, label,
			c, pReg);

	// learn
	regression.learn(trainDataset);
	// get the prediction function
	RegressionFunction regressor = regression.getPredictionFunction();

	// initializing the performance evaluator
	RegressorEvaluator evaluator = new RegressorEvaluator(
			trainDataset.getRegressionProperties());

	// For each example from the test set
	for (Example e : testDataset.getExamples()) {
		// Predict the value
		Prediction prediction = regressor.predict(e);
		// Print the original and the predicted values
		System.out.println("real value: " + e.getRegressionValue(label)
				+ "\t-\tpredicted value: " + prediction.getScore(label));
		// Update the evaluator
		evaluator.addCount(e, prediction);
	}

	// Get the Mean Squared Error for the targeted label
	float meanSquaredError = evaluator.getMeanSquaredError(label);

	System.out.println("\nMean Squared Error:\t" + meanSquaredError);
}
 
Developer: SAG-KeLP-Legacy, Project: kelp-examples, Lines: 59, Source: EpsilonSVRegressionExample.java
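Example 1 reports the mean squared error via RegressorEvaluator. As a quick reference, MSE is simply the average of the squared differences between the real and predicted values. A stand-alone sketch of the computation in plain Java (this is the underlying formula, not KeLP's RegressorEvaluator):

```java
public class MseSketch {
    // Mean squared error: average of (real - predicted)^2 over all examples
    static double meanSquaredError(double[] real, double[] predicted) {
        double sum = 0;
        for (int i = 0; i < real.length; i++) {
            double diff = real[i] - predicted[i];
            sum += diff * diff;
        }
        return sum / real.length;
    }

    public static void main(String[] args) {
        double[] real = {1.0, 2.0, 3.0};
        double[] predicted = {1.5, 2.0, 2.0};
        // (0.25 + 0.0 + 1.0) / 3
        System.out.println(meanSquaredError(real, predicted));
    }
}
```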

Example 2: main

import it.uniroma2.sag.kelp.kernel.standard.RbfKernel; // import the required class
public static void main(String[] args) {
	try {
		// Read a dataset into a trainingSet variable
		SimpleDataset trainingSet = new SimpleDataset();
		trainingSet.populate("src/main/resources/multiplerepresentation/train.klp");
		// Read a dataset into a test variable
		SimpleDataset testSet = new SimpleDataset();
		testSet.populate("src/main/resources/multiplerepresentation/test.klp");

		List<Label> classes = trainingSet.getClassificationLabels();

		
		for (int i=0; i<classes.size(); ++i) {
			Label l = classes.get(i);
			System.out.println("Class: " + l.toString());
			System.out.println(trainingSet.getNumberOfPositiveExamples(l));
			System.out.println(testSet.getNumberOfPositiveExamples(l));
		}
		
		// instantiate a passive aggressive algorithm
		KernelizedPassiveAggressiveClassification kPA = new KernelizedPassiveAggressiveClassification();
		// set an aggressiveness parameter
		kPA.setC(2f);

		// Kernel for the first representation (0-index)
		Kernel linear = new LinearKernel("0");
		// Normalize the linear kernel
		NormalizationKernel normalizedKernel = new NormalizationKernel(
				linear);
		// Apply a 2-degree Polynomial kernel on the score (normalized) computed by
		// the linear kernel
		Kernel polyKernel = new PolynomialKernel(2f, normalizedKernel);

		// Kernel for the second representation (1-index)
		Kernel linear1 = new LinearKernel("1");
		// Normalize the linear kernel
		NormalizationKernel normalizedKernel1 = new NormalizationKernel(
				linear1);
		// Apply an RBF kernel on the (normalized) score computed by
		// the linear kernel
		Kernel rbfKernel = new RbfKernel(2f, normalizedKernel1);
		// Combine the polynomial and the RBF kernels into a single
		// weighted kernel
		LinearKernelCombination linearCombination = new LinearKernelCombination();
		linearCombination.addKernel(1f, polyKernel);
		linearCombination.addKernel(1f, rbfKernel);
		// normalize the weights such that their sum is 1
		linearCombination.normalizeWeights();
		
		// set the kernel for the PA algorithm
		kPA.setKernel(linearCombination);
		
		// Instantiate a OneVsAll learning algorithm
		// It is a so called meta learner, it receives in input a binary learning algorithm
		OneVsAllLearning metaOneVsAllLearner = new OneVsAllLearning();
		metaOneVsAllLearner.setBaseAlgorithm(kPA);
		metaOneVsAllLearner.setLabels(classes);

		long startLearningTime = System.currentTimeMillis();
		// learn and get the prediction function
		metaOneVsAllLearner.learn(trainingSet);
		OneVsAllClassifier f = metaOneVsAllLearner.getPredictionFunction();
		long endLearningTime = System.currentTimeMillis();

		// classify examples and compute some statistics
		MulticlassClassificationEvaluator ev = new MulticlassClassificationEvaluator(classes);
		for (Example e : testSet.getExamples()) {
			OneVsAllClassificationOutput prediction = f.predict(e);
			ev.addCount(e, prediction);
		}

		System.out.println("Accuracy: " + ev.getAccuracy());
		System.out.println("Learning time without cache: " + (endLearningTime-startLearningTime) + " ms");
	} catch (Exception e1) {
		e1.printStackTrace();
	}
}
 
Developer: SAG-KeLP-Legacy, Project: kelp-examples, Lines: 81, Source: OneVsAllPassiveAggressiveExample.java
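Example 2 normalizes each base kernel before combining them. Normalization rescales kernel values as K(x,y) / sqrt(K(x,x) * K(y,y)), so every example has unit self-similarity and the two representations contribute on a comparable scale; the linear combination then mixes the kernel scores with weights normalized to sum to 1. A library-free sketch of both steps in plain Java (illustrative of the idea, not KeLP's exact NormalizationKernel or LinearKernelCombination implementation):

```java
public class KernelCombinationSketch {
    // Base kernel: a plain dot product
    static double linear(double[] x, double[] y) {
        double s = 0;
        for (int i = 0; i < x.length; i++) s += x[i] * y[i];
        return s;
    }

    // Normalized kernel: K(x,y) / sqrt(K(x,x) * K(y,y)),
    // so that normalized(x, x) == 1 for any non-zero x
    static double normalized(double[] x, double[] y) {
        return linear(x, y) / Math.sqrt(linear(x, x) * linear(y, y));
    }

    // Weighted sum of two kernel scores, with the weights
    // renormalized so that they sum to 1
    static double combine(double w1, double k1, double w2, double k2) {
        double total = w1 + w2;
        return (w1 / total) * k1 + (w2 / total) * k2;
    }

    public static void main(String[] args) {
        double[] x = {3.0, 0.0};
        double[] y = {3.0, 4.0};
        double n = normalized(x, y);               // 9 / (3 * 5) = 0.6
        System.out.println(n);
        System.out.println(combine(1.0, n, 1.0, 1.0)); // equal weights -> 0.8
    }
}
```

With equal weights, each kernel contributes half of the final score; because both inputs are normalized, neither representation can dominate merely by producing larger raw values.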


Note: the it.uniroma2.sag.kelp.kernel.standard.RbfKernel examples in this article were collected from open-source projects hosted on GitHub and similar platforms. The code is copyrighted by its original authors; consult each project's license before redistributing or reusing it.