This page collects typical usage examples of the Java method org.apache.spark.SparkConf.setJars. If you are wondering what SparkConf.setJars does and how it is used in practice, the selected examples below may help. You can also read further about the enclosing class, org.apache.spark.SparkConf.
Two code examples of SparkConf.setJars are shown below.
Example 1: jsc
import org.apache.spark.SparkConf; // import for the method's package/class
@Memoized
JavaStreamingContext jsc() {
  SparkConf conf = new SparkConf(true)
      .setMaster(master())
      .setAppName(getClass().getName());
  // Ship application jars to the cluster only when some are configured.
  if (!jars().isEmpty()) conf.setJars(jars().toArray(new String[0]));
  // Apply any additional user-supplied configuration entries.
  for (Map.Entry<String, String> entry : conf().entrySet()) {
    conf.set(entry.getKey(), entry.getValue());
  }
  return new JavaStreamingContext(conf, new Duration(batchDuration()));
}
Example 2: createSparkContext
import org.apache.spark.SparkConf; // import for the method's package/class
private static JavaSparkContext createSparkContext(SparkContextOptions contextOptions) {
  if (usesProvidedSparkContext) {
    LOG.info("Using a provided Spark Context");
    JavaSparkContext jsc = contextOptions.getProvidedSparkContext();
    if (jsc == null || jsc.sc().isStopped()) {
      LOG.error("The provided Spark context " + jsc + " was not created or was stopped");
      throw new RuntimeException("The provided Spark context was not created or was stopped");
    }
    return jsc;
  } else {
    LOG.info("Creating a brand new Spark Context.");
    SparkConf conf = new SparkConf();
    if (!conf.contains("spark.master")) {
      // set master if not set.
      conf.setMaster(contextOptions.getSparkMaster());
    }
    if (contextOptions.getFilesToStage() != null && !contextOptions.getFilesToStage().isEmpty()) {
      conf.setJars(contextOptions.getFilesToStage().toArray(new String[0]));
    }
    conf.setAppName(contextOptions.getAppName());
    // register immutable collections serializers because the SDK uses them.
    conf.set("spark.kryo.registrator", BeamSparkRunnerRegistrator.class.getName());
    return new JavaSparkContext(conf);
  }
}
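Both examples use the same idiom before calling setJars: guard against an empty jar list, then convert the List<String> to the String[] that setJars expects via toArray(new String[0]). A minimal, Spark-free sketch of that conversion pattern (the jar paths and the JarListDemo helper are hypothetical, for illustration only):

```java
import java.util.Arrays;
import java.util.List;

public class JarListDemo {
    // Mirrors the guard seen in both examples: return null for an empty
    // list (the caller would then simply skip conf.setJars entirely),
    // otherwise convert List<String> to the String[] that setJars expects.
    static String[] toJarArray(List<String> jars) {
        return jars.isEmpty() ? null : jars.toArray(new String[0]);
    }

    public static void main(String[] args) {
        // Hypothetical jar paths, stand-ins for jars() or getFilesToStage().
        List<String> jars = Arrays.asList("app.jar", "deps.jar");
        System.out.println(Arrays.toString(toJarArray(jars)));
        System.out.println(toJarArray(Arrays.asList()) == null);
    }
}
```

Passing `new String[0]` to toArray lets the JDK allocate a correctly sized array; it is the idiomatic zero-length-array form used in both snippets above.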