This article collects typical usage examples of the Java method org.apache.spark.SparkConf.setJars. If you are wondering what SparkConf.setJars does, how to call it, or what real-world uses look like, the curated method examples below may help. You can also explore further usage examples of its declaring class, org.apache.spark.SparkConf.
The following shows 2 code examples of the SparkConf.setJars method, sorted by popularity by default.
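Before the extracted examples, here is a minimal, self-contained sketch of how setJars is commonly called when building a SparkConf by hand. The master URL, application name, and jar path are placeholder values chosen for illustration, not taken from the examples below.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SetJarsSketch {
  public static void main(String[] args) {
    // Build a SparkConf and ship an application jar to the executors.
    // "spark://host:7077" and "/path/to/app.jar" are hypothetical placeholders.
    SparkConf conf = new SparkConf()
        .setMaster("spark://host:7077")
        .setAppName("setJars-demo")
        .setJars(new String[] {"/path/to/app.jar"});
    JavaSparkContext sc = new JavaSparkContext(conf);
    // ... submit jobs here ...
    sc.stop();
  }
}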
Example 1: jsc
import java.util.Map;
import org.apache.spark.SparkConf; // import the package/class this method depends on
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

@Memoized
JavaStreamingContext jsc() {
  SparkConf conf = new SparkConf(true)
      .setMaster(master())
      .setAppName(getClass().getName());
  // Ship the caller-supplied jars to the cluster only when there are any.
  if (!jars().isEmpty()) conf.setJars(jars().toArray(new String[0]));
  // Copy any extra Spark properties provided through conf().
  for (Map.Entry<String, String> entry : conf().entrySet()) {
    conf.set(entry.getKey(), entry.getValue());
  }
  return new JavaStreamingContext(conf, new Duration(batchDuration()));
}
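In this example, setJars is guarded so it is only called when the caller actually supplied jars, and any additional Spark properties from conf() are copied onto the SparkConf before the JavaStreamingContext is created.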
Example 2: createSparkContext
import org.apache.spark.SparkConf; // import the package/class this method depends on
import org.apache.spark.api.java.JavaSparkContext;

private static JavaSparkContext createSparkContext(SparkContextOptions contextOptions) {
  if (usesProvidedSparkContext) {
    LOG.info("Using a provided Spark Context");
    JavaSparkContext jsc = contextOptions.getProvidedSparkContext();
    if (jsc == null || jsc.sc().isStopped()) {
      LOG.error("The provided Spark context " + jsc + " was not created or was stopped");
      throw new RuntimeException("The provided Spark context was not created or was stopped");
    }
    return jsc;
  } else {
    LOG.info("Creating a brand new Spark Context.");
    SparkConf conf = new SparkConf();
    if (!conf.contains("spark.master")) {
      // Set the master only if it was not already set (e.g. via spark-submit).
      conf.setMaster(contextOptions.getSparkMaster());
    }
    // Stage the user's files as jars so executors can load them.
    if (contextOptions.getFilesToStage() != null && !contextOptions.getFilesToStage().isEmpty()) {
      conf.setJars(contextOptions.getFilesToStage().toArray(new String[0]));
    }
    conf.setAppName(contextOptions.getAppName());
    // register immutable collections serializers because the SDK uses them.
    conf.set("spark.kryo.registrator", BeamSparkRunnerRegistrator.class.getName());
    return new JavaSparkContext(conf);
  }
}
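This example either reuses a provided JavaSparkContext, failing fast if it is missing or already stopped, or builds a fresh one; setJars is used to stage the files returned by getFilesToStage() on the cluster, and a Kryo registrator is configured for the immutable collections the SDK uses.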