

Scala Time Class Code Examples

This article collects typical usage examples of the org.apache.spark.streaming.Time class in Scala. If you are wondering what the Time class does, how to use it, or what real-world code with it looks like, the curated examples below may help.


Two code examples of the Time class are shown below, sorted by popularity by default. You can upvote the examples you find useful; your feedback helps the site recommend better Scala code examples.
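
Before the full examples, here is a minimal sketch of the Time API itself (not taken from the projects below): Time wraps a timestamp in milliseconds and supports arithmetic with Duration. The values used are arbitrary and only for illustration.

import org.apache.spark.streaming.{Duration, Seconds, Time}

object TimeBasics extends App {
  val t = Time(10000)                         // a batch time, 10 000 ms
  val next = t + Seconds(5)                   // Time + Duration => Time (15000 ms)
  val gap: Duration = next - t                // Time - Time => Duration (5000 ms)
  val aligned = Time(12345).floor(Seconds(5)) // round down to the nearest 5-second boundary

  println(s"t=${t.milliseconds} ms, next=$next, gap=$gap, aligned=$aligned")
  println(s"t is a multiple of 5 s: ${t.isMultipleOf(Seconds(5))}")
}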

Example 1: TemperatureStreaming

// Package declaration and imported dependencies
import com.typesafe.config.ConfigFactory
import org.apache.spark.SparkConf
import org.apache.spark.streaming.mqtt.MQTTUtils
import org.apache.spark.streaming.{Seconds, StreamingContext, Time}

import scala.util.Try


object TemperatureStreaming extends App {

  // Local streaming context with a 5-second batch interval
  val sparkConf = new SparkConf().setAppName("RaspberryPi-Temperature").setMaster("local[*]")
  val ssc = new StreamingContext(sparkConf, Seconds(5))

  // MQTT broker URL from application.conf, falling back to a local broker
  val brokerUrl = Try(ConfigFactory.load().getString("brokerUrl")).getOrElse("tcp://localhost:1883")
  val TOPIC = "TemperatureEvent"
  val temperatureStream = MQTTUtils.createStream(ssc, brokerUrl, TOPIC)

  // compute(Time) returns the Option[RDD] generated for a given batch time; the time must be
  // a multiple of the 5000 ms batch interval, so Time(5000) rather than Time(5). Spark normally
  // calls compute internally, so this line is purely illustrative of the Time-based API.
  temperatureStream.compute(Time(5000)).foreach(rdd => println(s"RDD for batch: $rdd"))

  temperatureStream.foreachRDD(s => s.foreach(d => println(s"Message from Pi: $d")))
  ssc.start()
  ssc.awaitTermination()
}
Developer: shiv4nsh, Project: raspi-spark-streaming-mqtt, Source: TemperatureStreaming.scala
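
A note on example 1: user code normally does not call compute directly; the batch Time is handed to user code by the two-argument foreachRDD overload. A minimal sketch, reusing the temperatureStream from example 1:

// Each batch's RDD arrives together with its batch Time
temperatureStream.foreachRDD { (rdd, time: Time) =>
  println(s"Batch at ${time.milliseconds} ms contains ${rdd.count()} messages")
}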

Example 2: OutputManager

// Package declaration and imported dependencies
package iomanager

import java.util

import com.typesafe.config.Config
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}
import org.apache.spark.streaming.Time

import scala.collection.JavaConversions._
import scala.collection.parallel.mutable.ParArray

object OutputManager {

  var producer: KafkaProducer[String, String] = null
  var predictionWindow = 0

  // Creates the Kafka producer from the "output.*" section of the configuration
  def prepareOutputStream(config: Config) = {

    // The prediction window is configured in seconds and stored in milliseconds
    predictionWindow = config.getInt("output.predictionWindow") * 1000

    // Comma-separated broker list, e.g. "host1:9092,host2:9092"
    val brokers = config.getStringList("output.kafka.brokers").reduce(_ + "," + _)

    val props = new util.HashMap[String, Object]()
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers)
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
      "org.apache.kafka.common.serialization.StringSerializer")

    producer = new KafkaProducer[String, String](props)
  }

  // Builds two JSON payloads (simple and advanced) for the given batch `time` and publishes
  // them to the "simple-predictions" and "advanced-predictions" Kafka topics. The batch time
  // marks the start of the prediction interval; the end is time + predictionWindow.
  def sendPredictions(predictions: (
    ParArray[(String, Double, String, String)],
      ParArray[(String, Double, String, String)]), time: Time) = {
    val simplePredictions =
      "{\"predictionStart\":" + time.milliseconds +
        ",\"predictionEnd\":" + (time.milliseconds + predictionWindow) +
        ",\"positive\":[" + predictions._1.map(_._3).mkString(",") +
        "],\"negative\":[" + predictions._2.map(_._3).mkString(",") + "]}"
    val advancedPredictions =
      "{\"predictionStart\":" + time.milliseconds +
        ",\"predictionEnd\":" + (time.milliseconds + predictionWindow) +
        ",\"positive\":[" + predictions._1.map(_._4).mkString(",") +
        "],\"negative\":[" + predictions._2.map(_._4).mkString(",") + "]}"

    val simpleMess =
      new ProducerRecord[String, String]("simple-predictions", simplePredictions)
    val advancedMess =
      new ProducerRecord[String, String]("advanced-predictions", advancedPredictions)

    producer.send(simpleMess)
    producer.send(advancedMess)
  }

}
Developer: jandion, Project: SparkOFP, Source: OutputManager.scala
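
For context, a hedged sketch of how OutputManager might be wired from a driver program. The original project's driver code is not shown on this page, so the object name and sample data below are hypothetical; the sketch only assumes that "output.predictionWindow" and "output.kafka.brokers" exist in application.conf, as required by prepareOutputStream above.

import com.typesafe.config.ConfigFactory
import org.apache.spark.streaming.Time

import scala.collection.parallel.mutable.ParArray

import iomanager.OutputManager

object OutputManagerExample extends App {
  // Create the Kafka producer once at startup
  OutputManager.prepareOutputStream(ConfigFactory.load())

  // (id, score, simpleJson, advancedJson) tuples, matching sendPredictions' parameter type
  val positive = ParArray(("evt-1", 0.9, "\"e1\"", "{\"id\":\"e1\",\"score\":0.9}"))
  val negative = ParArray(("evt-2", 0.2, "\"e2\"", "{\"id\":\"e2\",\"score\":0.2}"))

  // In a streaming job this would run inside foreachRDD with the batch Time;
  // a wall-clock Time is used here only to keep the sketch self-contained.
  OutputManager.sendPredictions((positive, negative), Time(System.currentTimeMillis()))
}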


Note: The org.apache.spark.streaming.Time examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are taken from open-source projects contributed by their respective developers; copyright remains with the original authors, and distribution and use are subject to each project's license. Do not reproduce without permission.