

Java SparkConf.contains Method Code Examples

This article collects typical usage examples of the Java method org.apache.spark.SparkConf.contains. If you are wondering what SparkConf.contains does, or how and when to call it, the curated code examples below should help. You can also explore other usage examples of the enclosing org.apache.spark.SparkConf class.


Three code examples of the SparkConf.contains method are shown below, ordered by popularity.
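All three examples below share one idiom: probe the configuration with `contains` and supply a default only when the key is absent, so that caller- or environment-provided settings are never overwritten. As a dependency-free sketch of that idiom (using `java.util.Properties` as a stand-in for `SparkConf`, since Spark is not on every classpath; the class and method names here are hypothetical):

```java
import java.util.Properties;

// Hypothetical helper illustrating the contains-then-default idiom.
public class ConfDefaults {

    // Set `value` for `key` only when the configuration does not already
    // define it, mirroring `if (!conf.contains(key)) conf.set(key, value)`
    // on a SparkConf.
    public static void applyDefault(Properties conf, String key, String value) {
        if (!conf.containsKey(key)) {
            conf.setProperty(key, value);
        }
    }

    public static void main(String[] args) {
        Properties conf = new Properties();
        conf.setProperty("spark.app.name", "MyApp"); // caller-supplied value

        applyDefault(conf, "spark.app.name", "RDF2X"); // already set: kept
        applyDefault(conf, "spark.master", "local");   // absent: default used

        System.out.println(conf.getProperty("spark.app.name")); // prints "MyApp"
        System.out.println(conf.getProperty("spark.master"));   // prints "local"
    }
}
```

On a real `SparkConf` the same guard reads `if (!config.contains("spark.master")) { config.setMaster("local"); }`, exactly as the examples below do.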

Example 1: provide

import org.apache.spark.SparkConf; // import the package/class this method depends on
/**
 * Provide a {@link JavaSparkContext} based on default settings
 *
 * @return a {@link JavaSparkContext} based on default settings
 */
public static JavaSparkContext provide() {
    SparkConf config = new SparkConf()
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .registerKryoClasses(getSerializableClasses());

    if (!config.contains("spark.app.name")) {
        config.setAppName("RDF2X");
    }
    if (!config.contains("spark.master")) {
        config.setMaster("local");
    }

    // set serialization registration required if you want to make sure you registered all your classes
    // some spark internal classes will need to be registered as well
    // config.set("spark.kryo.registrationRequired", "true");


    log.info("Getting Spark Context for config: \n{}", config.toDebugString());
    return new JavaSparkContext(config);
}
 
Developer: Merck, Project: rdf2x, Lines of code: 26, Source file: SparkContextProvider.java

Example 2: updateLocalConfiguration

import org.apache.spark.SparkConf; // import the package/class this method depends on
/**
 * When using a persistent context, the running Context's configuration will override a
 * passed-in configuration. Spark allows us to override these inherited properties via
 * SparkContext.setLocalProperty
 */
private void updateLocalConfiguration(final JavaSparkContext sparkContext, final SparkConf sparkConfiguration) {
    /*
     * While we could enumerate over the entire SparkConfiguration and copy into the Thread
     * Local properties of the Spark Context this could cause adverse effects with future
     * versions of Spark. Since the api for setting multiple local properties at once is
     * restricted as private, we will only set those properties we know can affect SparkGraphComputer
     * Execution rather than applying the entire configuration.
     */
    final String[] validPropertyNames = {
            "spark.job.description",
            "spark.jobGroup.id",
            "spark.job.interruptOnCancel",
            "spark.scheduler.pool"
    };

    for (String propertyName : validPropertyNames) {
        if (sparkConfiguration.contains(propertyName)) {
            String propertyValue = sparkConfiguration.get(propertyName);
            this.logger.info("Setting Thread Local SparkContext Property - "
                    + propertyName + " : " + propertyValue);

            sparkContext.setLocalProperty(propertyName, sparkConfiguration.get(propertyName));
        }
    }
}
 
Developer: PKUSilvester, Project: LiteGraph, Lines of code: 31, Source file: SparkGraphComputer.java

Example 3: createSparkContext

import org.apache.spark.SparkConf; // import the package/class this method depends on
private static JavaSparkContext createSparkContext(SparkContextOptions contextOptions) {
  if (usesProvidedSparkContext) {
    LOG.info("Using a provided Spark Context");
    JavaSparkContext jsc = contextOptions.getProvidedSparkContext();
    if (jsc == null || jsc.sc().isStopped()){
      LOG.error("The provided Spark context " + jsc + " was not created or was stopped");
      throw new RuntimeException("The provided Spark context was not created or was stopped");
    }
    return jsc;
  } else {
    LOG.info("Creating a brand new Spark Context.");
    SparkConf conf = new SparkConf();
    if (!conf.contains("spark.master")) {
      // set master if not set.
      conf.setMaster(contextOptions.getSparkMaster());
    }

    if (contextOptions.getFilesToStage() != null && !contextOptions.getFilesToStage().isEmpty()) {
      conf.setJars(contextOptions.getFilesToStage().toArray(new String[0]));
    }

    conf.setAppName(contextOptions.getAppName());
    // register immutable collections serializers because the SDK uses them.
    conf.set("spark.kryo.registrator", BeamSparkRunnerRegistrator.class.getName());
    return new JavaSparkContext(conf);
  }
}
 
Developer: apache, Project: beam, Lines of code: 28, Source file: SparkContextFactory.java


Note: the org.apache.spark.SparkConf.contains examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers; copyright remains with the original authors. Please consult each project's license before redistributing or using the code, and do not reproduce this article without permission.