This article collects typical usage examples of the Java method org.apache.spark.SparkConf.get. If you have been wondering how SparkConf.get is used in practice, the curated code samples below may help. You can also learn more about the enclosing class, org.apache.spark.SparkConf.
Two code examples of SparkConf.get are shown below, sorted by popularity by default. You can upvote the examples you find useful; your feedback helps the system recommend better Java code samples.
Example 1: updateLocalConfiguration
import org.apache.spark.SparkConf; // import the package/class this method depends on
/**
 * When using a persistent context, the running context's configuration will override a
 * passed-in configuration. Spark allows us to override these inherited properties via
 * SparkContext.setLocalProperty.
 */
private void updateLocalConfiguration(final JavaSparkContext sparkContext, final SparkConf sparkConfiguration) {
    /*
     * While we could enumerate the entire SparkConf and copy it into the thread-local
     * properties of the SparkContext, this could cause adverse effects with future
     * versions of Spark. Since the API for setting multiple local properties at once is
     * private, we only set the properties known to affect SparkGraphComputer execution
     * rather than applying the entire configuration.
     */
    final String[] validPropertyNames = {
            "spark.job.description",
            "spark.jobGroup.id",
            "spark.job.interruptOnCancel",
            "spark.scheduler.pool"
    };
    for (String propertyName : validPropertyNames) {
        if (sparkConfiguration.contains(propertyName)) {
            String propertyValue = sparkConfiguration.get(propertyName);
            this.logger.info("Setting Thread Local SparkContext Property - "
                    + propertyName + " : " + propertyValue);
            sparkContext.setLocalProperty(propertyName, propertyValue);
        }
    }
}
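The whitelist-and-copy pattern above can be sketched without a Spark installation. In this minimal sketch, a plain HashMap stands in for SparkConf (the class name `LocalPropertyFilter` and the use of a Map instead of a real SparkConf/SparkContext are assumptions for illustration); it shows only the filtering logic, not the actual setLocalProperty call:

```java
import java.util.HashMap;
import java.util.Map;

public class LocalPropertyFilter {
    // Same whitelist as in the example above
    private static final String[] VALID_PROPERTY_NAMES = {
            "spark.job.description",
            "spark.jobGroup.id",
            "spark.job.interruptOnCancel",
            "spark.scheduler.pool"
    };

    // Copies only the whitelisted keys from the source configuration
    public static Map<String, String> filterLocalProperties(Map<String, String> conf) {
        Map<String, String> result = new HashMap<>();
        for (String name : VALID_PROPERTY_NAMES) {
            if (conf.containsKey(name)) {
                result.put(name, conf.get(name));
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.scheduler.pool", "analytics");
        conf.put("spark.executor.memory", "4g"); // not whitelisted, should be dropped
        Map<String, String> local = filterLocalProperties(conf);
        System.out.println(local.size());                      // 1
        System.out.println(local.get("spark.scheduler.pool")); // analytics
    }
}
```

The point of the whitelist is that only these four keys are documented to influence job-level behavior through thread-local properties; copying arbitrary keys could silently change behavior in future Spark versions.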
Example 2: ElasticsearchListener
import org.apache.spark.SparkConf; // import the package/class this method depends on
/**
 * Listener constructor.
 *
 * Attempts to establish the default mappings in the Elasticsearch index.
 *
 * @param conf the Spark configuration to read the Elasticsearch settings from
 */
public ElasticsearchListener(SparkConf conf) {
    esHost = conf.get("spark.elasticsearch.host", "localhost");
    esPort = conf.get("spark.elasticsearch.port", "9200");
    esIndex = conf.get("spark.elasticsearch.index", "spark");
    appId = conf.getAppId();
    appName = conf.get("spark.app.name", "");
    esConnector = new ElasticsearchConnector(esHost, esPort, esIndex);
    indexInitialized = esConnector.addDefaultMappings();
    if (!indexInitialized) {
        LOGGER.warn("Failed to initialize Elasticsearch index '" + esIndex + "' on " + esHost + ":" + esPort);
    }
}
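The constructor relies on the two-argument overload SparkConf.get(key, defaultValue), which returns the configured value when the key is set and the supplied default otherwise. A minimal stand-in (hypothetical; a HashMap replaces the real SparkConf so the sketch runs without Spark on the classpath) illustrates those lookups:

```java
import java.util.HashMap;
import java.util.Map;

public class ConfDefaults {
    // Mirrors the behavior of SparkConf.get(key, defaultValue):
    // return the stored value if present, otherwise the default
    public static String get(Map<String, String> conf, String key, String defaultValue) {
        String value = conf.get(key);
        return value != null ? value : defaultValue;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.elasticsearch.host", "es.internal");
        // Explicitly configured key: the stored value wins
        System.out.println(get(conf, "spark.elasticsearch.host", "localhost")); // es.internal
        // Unset key: the default is returned
        System.out.println(get(conf, "spark.elasticsearch.port", "9200"));     // 9200
    }
}
```

Supplying defaults this way keeps the listener usable out of the box (localhost:9200, index "spark") while still letting each setting be overridden through the job's Spark configuration.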