

Java SparkConf.contains Method Code Examples

This article collects typical usage examples of the Java method org.apache.spark.SparkConf.contains. If you have been wondering what exactly SparkConf.contains does, how to use it, or where to find examples of it, the curated snippets below should help. You can also explore further examples of the enclosing class, org.apache.spark.SparkConf.


The sections below show 3 code examples of the SparkConf.contains method, ordered by popularity.
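
Before diving into the examples: SparkConf.contains(key) reports whether a key has been set on that configuration (explicitly or via loaded defaults), which is why all three examples use it to guard fallback values. A minimal, self-contained sketch (the class name ContainsDemo is illustrative, not from any of the projects below):

import org.apache.spark.SparkConf;

public class ContainsDemo {
    public static void main(String[] args) {
        // Passing false skips loading defaults from spark.* system properties,
        // so only keys set explicitly below are present.
        SparkConf conf = new SparkConf(false);
        conf.set("spark.app.name", "demo");

        System.out.println(conf.contains("spark.app.name")); // true
        System.out.println(conf.contains("spark.master"));   // false
    }
}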

Example 1: provide

import org.apache.spark.SparkConf; // import the package/class the method depends on
/**
 * Provide a {@link JavaSparkContext} based on default settings
 *
 * @return a {@link JavaSparkContext} based on default settings
 */
public static JavaSparkContext provide() {
    SparkConf config = new SparkConf()
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .registerKryoClasses(getSerializableClasses());

    if (!config.contains("spark.app.name")) {
        config.setAppName("RDF2X");
    }
    if (!config.contains("spark.master")) {
        config.setMaster("local");
    }

    // Require Kryo registration if you want to be sure all your classes are registered;
    // note that some Spark-internal classes would then need to be registered as well:
    // config.set("spark.kryo.registrationRequired", "true");

    log.info("Getting Spark Context for config: \n{}", config.toDebugString());
    return new JavaSparkContext(config);
}
 
Developer: Merck, Project: rdf2x, Lines: 26, Source: SparkContextProvider.java
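
A hypothetical caller of provide() might look like the following (the word-count body is illustrative and not taken from the rdf2x project). Because of the contains checks above, running this without spark-submit falls back to a local master and the "RDF2X" app name:

import java.util.Arrays;
import java.util.List;
import org.apache.spark.api.java.JavaSparkContext; // imports for the usage sketch

JavaSparkContext sc = SparkContextProvider.provide();
List<String> words = Arrays.asList("a", "b", "a");
// Kryo serialization is already configured by provide()
long distinctWords = sc.parallelize(words).distinct().count();
sc.stop();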

Example 2: updateLocalConfiguration

import org.apache.spark.SparkConf; // import the package/class the method depends on
/**
 * When using a persistent context, the running context's configuration will override a
 * passed-in configuration. Spark allows us to override these inherited properties via
 * SparkContext.setLocalProperty.
 */
private void updateLocalConfiguration(final JavaSparkContext sparkContext, final SparkConf sparkConfiguration) {
    /*
     * While we could enumerate over the entire SparkConfiguration and copy into the Thread
     * Local properties of the Spark Context this could cause adverse effects with future
     * versions of Spark. Since the API for setting multiple local properties at once is
     * restricted as private, we will only set those properties we know can affect
     * SparkGraphComputer execution rather than applying the entire configuration.
     */
    final String[] validPropertyNames = {
            "spark.job.description",
            "spark.jobGroup.id",
            "spark.job.interruptOnCancel",
            "spark.scheduler.pool"
    };

    for (String propertyName : validPropertyNames) {
        if (sparkConfiguration.contains(propertyName)) {
            String propertyValue = sparkConfiguration.get(propertyName);
            this.logger.info("Setting Thread Local SparkContext Property - "
                    + propertyName + " : " + propertyValue);

            sparkContext.setLocalProperty(propertyName, sparkConfiguration.get(propertyName));
        }
    }
}
 
Developer: PKUSilvester, Project: LiteGraph, Lines: 31, Source: SparkGraphComputer.java
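
The guard-then-copy pattern above can be exercised in isolation. A minimal sketch, assuming a local context (names like submitted and "analytics" are illustrative):

SparkConf submitted = new SparkConf(false)
        .set("spark.scheduler.pool", "analytics");
JavaSparkContext sc = new JavaSparkContext(
        new SparkConf().setMaster("local").setAppName("pool-demo"));

if (submitted.contains("spark.scheduler.pool")) {
    // Applies only to jobs submitted from the current thread,
    // leaving the shared context's own configuration untouched.
    sc.setLocalProperty("spark.scheduler.pool",
            submitted.get("spark.scheduler.pool"));
}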

Example 3: createSparkContext

import org.apache.spark.SparkConf; // import the package/class the method depends on
private static JavaSparkContext createSparkContext(SparkContextOptions contextOptions) {
  if (usesProvidedSparkContext) {
    LOG.info("Using a provided Spark Context");
    JavaSparkContext jsc = contextOptions.getProvidedSparkContext();
    if (jsc == null || jsc.sc().isStopped()) {
      LOG.error("The provided Spark context " + jsc + " was not created or was stopped");
      throw new RuntimeException("The provided Spark context was not created or was stopped");
    }
    return jsc;
  } else {
    LOG.info("Creating a brand new Spark Context.");
    SparkConf conf = new SparkConf();
    if (!conf.contains("spark.master")) {
      // set master if not set.
      conf.setMaster(contextOptions.getSparkMaster());
    }

    if (contextOptions.getFilesToStage() != null && !contextOptions.getFilesToStage().isEmpty()) {
      conf.setJars(contextOptions.getFilesToStage().toArray(new String[0]));
    }

    conf.setAppName(contextOptions.getAppName());
    // register immutable collections serializers because the SDK uses them.
    conf.set("spark.kryo.registrator", BeamSparkRunnerRegistrator.class.getName());
    return new JavaSparkContext(conf);
  }
}
 
Developer: apache, Project: beam, Lines: 28, Source: SparkContextFactory.java
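
The contains("spark.master") guard matters because spark-submit supplies spark.master as a system property, which new SparkConf() picks up, so the fallback must only fire when nothing was provided. A stripped-down sketch of the same logic outside Beam (defaultMaster is an illustrative name):

SparkConf conf = new SparkConf(); // loads any spark.* system properties
String defaultMaster = "local[2]";
if (!conf.contains("spark.master")) {
    // Nothing came from spark-submit or -Dspark.master, so fall back.
    conf.setMaster(defaultMaster);
}
conf.setAppName("fallback-demo");
JavaSparkContext jsc = new JavaSparkContext(conf);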


Note: the org.apache.spark.SparkConf.contains examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are excerpted from community open-source projects; copyright remains with the original authors, and distribution or use should follow each project's license. Please do not republish without permission.