

Python SparkContext.setSystemProperty Method Code Examples

This article collects typical usage examples of the Python method pyspark.context.SparkContext.setSystemProperty. If you are wondering what the method does or how to call it, the curated examples below should help. For broader context, you can also look at other usage examples of pyspark.context.SparkContext.


Two code examples of SparkContext.setSystemProperty are shown below, ordered by popularity.

Example 1:

# Required import: from pyspark.context import SparkContext [as alias]
# Or: from pyspark.context.SparkContext import setSystemProperty [as alias]
        # try again if port unavailable
        if check == notfound:
            port += 1

    # return the first available port
    return port


# this is the deprecated equivalent of ADD_JARS
add_files = os.environ.get("ADD_FILES")
if add_files is not None:
    add_files = add_files.split(',')

if os.environ.get("SPARK_EXECUTOR_URI"):
    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])

# setup mesos-based connection
conf = (SparkConf()
         .setMaster(os.environ["SPARK_MASTER"]))

# set the UI port
conf.set("spark.ui.port", ui_get_available_port())

# configure docker containers as executors
conf.setSparkHome(os.environ.get("SPARK_HOME"))
conf.set("spark.mesos.executor.docker.image", "lab41/spark-mesos-dockerworker-ipython")
conf.set("spark.mesos.executor.home", "/usr/local/spark-1.4.1-bin-hadoop2.4")
conf.set("spark.executorEnv.MESOS_NATIVE_LIBRARY", "/usr/local/lib/libmesos.so")
conf.set("spark.network.timeout", "100")
Author: codeaudit; Project: ipython-spark-docker; Lines: 31; Source: shell.py
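Example 1 excerpts a `ui_get_available_port` helper that scans upward from a starting port until it finds a free one. Since the probe logic is cut off in the excerpt, here is a self-contained sketch of the same idea using only the standard `socket` module; the bind-based probe and the port range are assumptions, not the original project's code:

```python
import socket

def ui_get_available_port(start=4040, limit=4140):
    """Return the first port in [start, limit) that accepts a local bind."""
    for port in range(start, limit):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
                return port        # bind succeeded: the port is available
            except OSError:
                continue           # port already in use: try the next one
    raise RuntimeError("no free port found in range")

port = ui_get_available_port()
```

The returned value can then be passed to `conf.set("spark.ui.port", ...)` exactly as in the example above. Binding and immediately closing the socket is a common, though slightly racy, way to test availability: another process could grab the port between the check and Spark's own bind.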

Example 2:

# Required import: from pyspark.context import SparkContext [as alias]
# Or: from pyspark.context.SparkContext import setSystemProperty [as alias]
import math
import os
import sys
from datetime import datetime  # needed by datetime.now() below

from pyspark.context import SparkContext
from pyspark.sql import SparkSession

# flag to confirm the writing of the forecasted value to the db
# (`config` is the project's own settings module, imported elsewhere)
real_flag = config.real_flag
total_t1 = datetime.now()
## Logging ##

# must be called before the SparkContext/SparkSession is created
SparkContext.setSystemProperty('spark.executor.cores', '16')
full_t1 = datetime.now()
# initialise sparkContext

#conf1 = pyspark.SparkConf().setAll([('spark.executor.memory', '24g'), ('spark.executor.cores', 8), ('spark.cores.max', 8), ('spark.driver.memory','24g')])
#spark2 = SparkSession.builder.config(conf=conf1).getOrCreate()

spark1 = SparkSession.builder \
    .master(config.sp_master) \
    .appName(config.sp_appname) \
    .config('spark.executor.memory', config.sp_memory) \
    .config("spark.cores.max", config.sp_cores) \
    .config('spark.executor.cores', config.sp_cores) \
    .getOrCreate()

sc = spark1.sparkContext
Author: abhoopathi; Project: friendly-lamp; Lines: 33; Source: p8_final.py
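Example 2 chains one `.config()` call per setting. When the settings already live in a dict (as the `config.sp_*` values effectively do), they can be folded into the builder in a loop instead. The sketch below shows that chaining pattern with a minimal stand-in builder class, so it runs without pyspark; the `Builder` class and its `settings` attribute are assumptions for illustration, not pyspark API:

```python
class Builder:
    """Tiny stand-in for SparkSession.builder: collects key/value settings."""
    def __init__(self):
        self.settings = {}

    def config(self, key, value):
        self.settings[key] = value
        return self   # returning self is what makes chaining work, as in pyspark

settings = {
    "spark.executor.memory": "24g",
    "spark.cores.max": "8",
    "spark.executor.cores": "8",
}

builder = Builder()
for key, value in settings.items():
    builder = builder.config(key, value)
```

With the real `SparkSession.builder`, the same loop ends with `.getOrCreate()`; keeping the settings in one dict makes it easier to load them from a config file or environment.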


Note: the pyspark.context.SparkContext.setSystemProperty examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from open-source projects contributed by their respective authors; copyright remains with the original authors, and redistribution and use are subject to each project's license. Please do not reproduce without permission.