This article collects typical usage examples of the SparkContext.setSystemProperty method from Python's pyspark.context module. If you have been wondering what SparkContext.setSystemProperty does and how to call it, the curated examples below should help; you can also read further about its enclosing class, pyspark.context.SparkContext.
Two code examples of SparkContext.setSystemProperty are shown below, ordered by popularity.
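Before the examples, a minimal self-contained sketch may help make the call pattern clear. The property name and values here are illustrative, not taken from the examples below; the key point is that setSystemProperty is a class method that writes a JVM system property, so it only has an effect if it runs before the SparkContext is created.

from pyspark import SparkConf
from pyspark.context import SparkContext

# setSystemProperty must be called *before* a SparkContext is
# instantiated; later calls have no effect on a running context.
SparkContext.setSystemProperty("spark.executor.memory", "2g")  # illustrative value

sc = SparkContext(conf=SparkConf().setMaster("local[*]").setAppName("demo"))
# SparkConf loads spark.* system properties as defaults, so the value is visible here
print(sc.getConf().get("spark.executor.memory"))  # -> 2g
sc.stop()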
Example 1:
# Required import: from pyspark.context import SparkContext
# (the example also uses os, socket, and SparkConf)
import os
import socket

from pyspark import SparkConf
from pyspark.context import SparkContext

def ui_get_available_port(port=4040):
    # The availability check was truncated in the source; this socket
    # probe is a reconstruction of the same idea.
    while True:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            # try again if the port is unavailable (already bound)
            if s.connect_ex(("localhost", port)) == 0:
                port += 1
            else:
                # return the first available port
                return port

# ADD_FILES is the deprecated equivalent of ADD_JARS
add_files = None
if os.environ.get("ADD_FILES") is not None:
    add_files = os.environ.get("ADD_FILES").split(',')

# must be set before the SparkContext is created
if os.environ.get("SPARK_EXECUTOR_URI"):
    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])

# set up the Mesos-based connection
conf = (SparkConf()
        .setMaster(os.environ["SPARK_MASTER"]))

# set the UI port
conf.set("spark.ui.port", str(ui_get_available_port()))

# configure Docker containers as executors
conf.setSparkHome(os.environ.get("SPARK_HOME"))
conf.set("spark.mesos.executor.docker.image", "lab41/spark-mesos-dockerworker-ipython")
conf.set("spark.mesos.executor.home", "/usr/local/spark-1.4.1-bin-hadoop2.4")
conf.set("spark.executorEnv.MESOS_NATIVE_LIBRARY", "/usr/local/lib/libmesos.so")
conf.set("spark.network.timeout", "100")
Example 2:
# Required import: from pyspark.context import SparkContext
import math
import os
import sys
from datetime import datetime

import config  # the project's own settings module (not shown in the source)
from pyspark.context import SparkContext
from pyspark.sql import SparkSession

# flag confirming that the forecasted value should be written to the db
real_flag = config.real_flag
total_t1 = datetime.now()

## Logging ##
# must run before any SparkSession/SparkContext exists
SparkContext.setSystemProperty('spark.executor.cores', '16')
full_t1 = datetime.now()

# initialise the SparkSession; an equivalent SparkConf-based setup was
# left commented out in the source:
# conf1 = pyspark.SparkConf().setAll([('spark.executor.memory', '24g'), ('spark.executor.cores', 8),
#                                     ('spark.cores.max', 8), ('spark.driver.memory', '24g')])
# spark2 = SparkSession.builder.config(conf=conf1).getOrCreate()
spark1 = (SparkSession.builder
          .master(config.sp_master)
          .appName(config.sp_appname)
          .config('spark.executor.memory', config.sp_memory)
          .config('spark.cores.max', config.sp_cores)
          .config('spark.executor.cores', config.sp_cores)
          .getOrCreate())
sc = spark1.sparkContext
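A quick way to check what actually took effect (a sketch assuming the spark1/sc objects from the example above): reading the setting back from the running context. An explicit .config() value should override the JVM system property set earlier via setSystemProperty, since system properties only supply defaults when the SparkConf is constructed.

# Expected to print config.sp_cores rather than '16': builder .config()
# values take precedence over the spark.executor.cores system-property default.
print(sc.getConf().get('spark.executor.cores'))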