Java KafkaProducer.close Method Code Examples

This article collects typical usage examples of the Java method org.apache.kafka.clients.producer.KafkaProducer.close. If you have been wondering what KafkaProducer.close does, how to call it, or what real-world uses of it look like, the hand-picked code samples below may help. You can also browse further usage examples of the enclosing class, org.apache.kafka.clients.producer.KafkaProducer.


The following presents 15 code examples of the KafkaProducer.close method, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
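
Before diving into the examples, here is a minimal sketch of the two most common close patterns: an explicit close() after the last send, and try-with-resources, which works because KafkaProducer implements Closeable and blocks on close until buffered records are delivered. This sketch is not taken from any of the projects below; the broker address and topic name are placeholders, and the bounded-wait overload close(Duration) assumes Kafka clients 2.0 or newer (older clients expose close(long, TimeUnit) instead).

import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerCloseSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Pattern 1: try-with-resources closes the producer automatically,
        // blocking until previously sent records have been delivered.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("example-topic", "key", "value"));
        }

        // Pattern 2: explicit close with a bounded wait for in-flight sends
        // (Kafka clients 2.0+; older clients use close(long, TimeUnit)).
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        producer.send(new ProducerRecord<>("example-topic", "key", "value"));
        producer.close(Duration.ofSeconds(10));
    }
}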

Example 1: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void main(String[] args) throws InterruptedException, IOException {
    UncaughtExceptionHandling.setup();

    KafkaProducer<ByteBuffer, ByteBuffer> msgProducer = KAFKA_CLIENTS
            .createProducer(ByteBufferSerializer.class, ByteBufferSerializer.class);

    LOG.info("Sending ...");

    for(int i = 0; i < TOTAL_MSGS; i++) {
        ByteBuffer data = ByteBuffer.allocate(4).putInt(i);
        msgProducer.send(new ProducerRecord<>(KMQ_CONFIG.getMsgTopic(), data));
        try { Thread.sleep(100L); } catch (InterruptedException e) { throw new RuntimeException(e); }
        LOG.info(String.format("Sent message %d", i));
    }

    msgProducer.close();

    LOG.info("Sent");
}
 
Developer: softwaremill, Project: kmq, Lines of code: 20, Source: StandaloneSender.java

Example 2: createProducer

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public void createProducer(String bootstrapServer) {
  long numberOfEvents = 5;

  Properties props = new Properties();
  props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
  props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
  props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

  KafkaProducer<String, String> producer = new KafkaProducer<>(
      props);

  for (int i = 0; i < numberOfEvents; i++) {
    String key = "testContainers";
    String value = "AreAwesome";
    ProducerRecord<String, String> record = new ProducerRecord<>(
        "hello_world_topic", key, value);
    try {
      producer.send(record).get();
    } catch (InterruptedException | ExecutionException e) {
      e.printStackTrace();
    }
    System.out.printf("key = %s, value = %s\n", key, value);
  }

  producer.close();
}
 
Developer: gAmUssA, Project: testcontainers-java-module-confluent-platform, Lines of code: 27, Source: HelloProducer.java

Example 3: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void main(final String[] args) {
    Properties producerProps = new Properties();
    producerProps.put("bootstrap.servers", "localhost:9092");
    producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    producerProps.put("acks", "all");
    producerProps.put("retries", 1);
    producerProps.put("batch.size", 20000);
    producerProps.put("linger.ms", 1);
    producerProps.put("buffer.memory", 24568545);
    KafkaProducer<String, String> producer = new KafkaProducer<String, String>(producerProps);

    for (int i = 0; i < 2000; i++) {
        ProducerRecord<String, String> data = new ProducerRecord<>("test1", "Hello this is record " + i);
        Future<RecordMetadata> recordMetadata = producer.send(data);
    }
    producer.close();
}
 
Developer: PacktPublishing, Project: Building-Data-Streaming-Applications-with-Apache-Kafka, Lines of code: 19, Source: DemoProducer.java

Example 4: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void main(String[] args) {

		if (args.length < 1) {
			System.out.println("Provide number of vehicle");
			System.exit(1);
		}

		// Number of vehicles for which data needs to be generated.
		int numberOfvehicle = Integer.parseInt(args[0]);

		// Get producer to push data into Kafka
		KafkaProducer<Integer, String> producer = configureKafka();

		// Get vehicle start point.
		Map<String, Location> vehicleStartPoint = getVehicleStartPoints(numberOfvehicle);
		
		// Push data into Kafka
		pushVehicleStartPointToKafka(vehicleStartPoint, producer);

		producer.close();
	}
 
Developer: PacktPublishing, Project: Practical-Real-time-Processing-and-Analytics, Lines of code: 22, Source: VehicleStartPointGenerator.java

Example 5: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void main(String[] args) {

		if (args.length < 2) {
			System.out.println("Provide total number of records and range of distance from start point");
			System.exit(1);
		}

		//Total number of records this simulator will generate
		int totalNumberOfRecords = Integer.parseInt(args[0]);
		
		//Distance in meters as Radius
		int distanceFromVehicleStartPoint = Integer.parseInt(args[1]);

		// Get Kafka producer
		KafkaProducer<Integer, String> producer = configureKafka();

		// Get Vehicle Start Points
		Map<String, Location> vehicleStartPoint = getVehicleStartPoints();
		
		// Generate data within distance and push to Kafka
		generateDataAndPushToKafka(producer, vehicleStartPoint.size(), totalNumberOfRecords,
				distanceFromVehicleStartPoint, vehicleStartPoint);
		
		producer.close();
	}
 
Developer: PacktPublishing, Project: Practical-Real-time-Processing-and-Analytics, Lines of code: 26, Source: VehicleDataGeneration.java

Example 6: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
/**
	 * @param args
	 */
	public static void main(String[] args) {
		
		Properties props=new Properties();
		props.put("bootstrap.servers", "localhost:9092,localhost:9093");
		props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
		props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
		
		KafkaProducer<String,String> sampleProducer= new KafkaProducer<String,String>(props);
		
//		ProducerRecord<String, String> record = new ProducerRecord<String, String>(topicName, value);		
//		sampleProducer.send(record);
		for (int i = 0; i < 10; i++)
			sampleProducer.send(new ProducerRecord<String, String>("demo-topic1","Data:"+ Integer.toString(i)));
		sampleProducer.close();
	}
 
Developer: sarojrout, Project: spring-tutorial, Lines of code: 19, Source: SampleProducer.java

Example 7: publishDummyDataNumbers

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public void publishDummyDataNumbers() {
    final String topic = "NumbersTopic";

    // Create publisher
    final Map<String, Object> config = new HashMap<>();
    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

    final KafkaProducer<Integer, Integer> producer = new KafkaProducer<>(config);
    for (int value = 0; value < 10000; value++) {
        producer.send(new ProducerRecord<>(topic, value, value));
    }
    producer.flush();
    producer.close();
}
 
Developer: SourceLabOrg, Project: kafka-webview, Lines of code: 17, Source: WebKafkaConsumerTest.java

Example 8: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.LongSerializer");
    props.put("linger.ms", 0);

    KafkaProducer<String, Long> producer = new KafkaProducer<>(props);

    for(int i=0; i < 10000; i++){
        String ip = "127.0.0." + i % 10;
        System.out.println(ip);
        producer.send(new ProducerRecord<>("visits", ip, System.currentTimeMillis() + i));
    }

    producer.close();

}
 
Developer: ftrossbach, Project: kiqr, Lines of code: 19, Source: TestDriver.java

Example 9: sendMessages

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
private static void sendMessages(KafkaClients clients, KmqConfig kmqConfig) {
    KafkaProducer<ByteBuffer, ByteBuffer> msgProducer = clients.createProducer(ByteBufferSerializer.class, ByteBufferSerializer.class);

    LOG.info("Sending ...");

    for(int i = 0; i < TOTAL_MSGS; i++) {
        ByteBuffer data = ByteBuffer.allocate(4).putInt(i);
        msgProducer.send(new ProducerRecord<>(kmqConfig.getMsgTopic(), data));
        try { Thread.sleep(100L); } catch (InterruptedException e) { throw new RuntimeException(e); }
    }

    msgProducer.close();

    LOG.info("Sent");
}
 
Developer: softwaremill, Project: kmq, Lines of code: 16, Source: EmbeddedExample.java

Example 10: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void main(String args[]) {
	Properties properties = new Properties();
	 
	properties.put("bootstrap.servers", "localhost:9092");
	properties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
	properties.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
	properties.put("acks", "1");
	 
	KafkaProducer<Integer, String> producer = new KafkaProducer<Integer, String>(properties);
	int counter =0;
	int nbrOfEventsRequired = Integer.parseInt(args[0]);
	while (counter<nbrOfEventsRequired) {
		StringBuffer stream = new StringBuffer();
		
		long phoneNumber = ThreadLocalRandom.current().nextLong(9999999950L,
				9999999999L);
		int bin = ThreadLocalRandom.current().nextInt(100000, 9999999);
		int bout = ThreadLocalRandom.current().nextInt(100000, 9999999);
		
		stream.append(phoneNumber);
		stream.append(",");
		stream.append(bin);
		stream.append(",");
		stream.append(bout);
		stream.append(",");
		stream.append(System.currentTimeMillis());

		System.out.println(stream.toString());
		ProducerRecord<Integer, String> data = new ProducerRecord<Integer, String>(
				"device-data", stream.toString());
		producer.send(data);
		counter++;
	}
	
	producer.close();
}
 
Developer: PacktPublishing, Project: Practical-Real-time-Processing-and-Analytics, Lines of code: 37, Source: DataGenerator.java

Example 11: main

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void main(String args[]) {
	Properties properties = new Properties();
	 
	properties.put("bootstrap.servers", "localhost:9092");
	properties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
	properties.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
	properties.put("acks", "1");
	 
	KafkaProducer<Integer, String> producer = new KafkaProducer<Integer, String>(properties);
	int counter =0;
	int nbrOfEventsRequired = Integer.parseInt(args[0]);
	while (counter<nbrOfEventsRequired) {
		StringBuffer stream = new StringBuffer();
		
		long phoneNumber = ThreadLocalRandom.current().nextLong(9999999950L,
				9999999960L);
		int bin = ThreadLocalRandom.current().nextInt(1000, 9999);
		int bout = ThreadLocalRandom.current().nextInt(1000, 9999);
		
		stream.append(phoneNumber);
		stream.append(",");
		stream.append(bin);
		stream.append(",");
		stream.append(bout);
		stream.append(",");
		stream.append(new Date(ThreadLocalRandom.current().nextLong()));

		System.out.println(stream.toString());
		ProducerRecord<Integer, String> data = new ProducerRecord<Integer, String>(
				"storm-trident-diy", stream.toString());
		producer.send(data);
		counter++;
	}
	
	producer.close();
}
 
Developer: PacktPublishing, Project: Practical-Real-time-Processing-and-Analytics, Lines of code: 37, Source: DataGenerator.java

Example 12: run

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public void run(String brokers, long amount, long delay) throws IOException {
    // set up the producer
    KafkaProducer<String, String> producer;
    try (InputStream props = Resources.getResource("producer.props").openStream()) {
        Properties properties = new Properties();
        properties.load(props);

        if (brokers != null && !brokers.isEmpty()) {
            properties.put("bootstrap.servers", brokers);
        }

        producer = new KafkaProducer<>(properties);
    }

    try {
        for (int i = 0; i < amount; i++) {
            // send lots of messages
            Date t = new Date();
            producer.send(new ProducerRecord<String, String>(
                    "fast-messages", String.valueOf(i),
                    String.format("{\"type\":\"test\", \"t\":%d, \"k\":%d}", t.getTime(), i)));
            System.out.println("Sent msg number " + i);
            if (delay > 0) {
                Thread.sleep(delay);
            }
        }
    } catch (Throwable throwable) {
        throwable.printStackTrace();
    } finally {
        producer.close();
    }
}
 
Developer: javaronok, Project: kafka-mgd-sample, Lines of code: 33, Source: Producer.java

Example 13: generate

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
static void generate(final String kafka) throws Exception {

        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                isRunning = false;
            }
        });

        final Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.CLIENT_ID_CONFIG, "SmokeTest");
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafka);
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
        producerProps.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);

        final KafkaProducer<String, Integer> producer = new KafkaProducer<>(producerProps);

        final Random rand = new Random(System.currentTimeMillis());

        int numRecordsProduced = 0;
        while (isRunning) {
            final String key = "" + rand.nextInt(MAX_NUMBER_OF_KEYS);
            final int value = rand.nextInt(10000);

            final ProducerRecord<String, Integer> record = new ProducerRecord<>("data", key, value);

            producer.send(record, new Callback() {
                @Override
                public void onCompletion(final RecordMetadata metadata, final Exception exception) {
                    if (exception != null) {
                        exception.printStackTrace();
                        Exit.exit(1);
                    }
                }
            });

            numRecordsProduced++;
            if (numRecordsProduced % 1000 == 0) {
                System.out.println(numRecordsProduced + " records produced");
            }
            Utils.sleep(rand.nextInt(50));
        }
        producer.close();
        System.out.println(numRecordsProduced + " records produced");
    }
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines of code: 47, Source: EosTestDriver.java

Example 14: sendMessage

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public void sendMessage(String msg) {
    KafkaProducer<String,String> producer = new KafkaProducer<String, String>(properties);
    ProducerRecord<String,String> record = new ProducerRecord<String, String>(properties.getProperty("topic"),msg);
    producer.send(record);
    producer.close();
}
 
Developer: wanghan0501, Project: WiFiProbeAnalysis, Lines of code: 7, Source: KafkaProducerForHive.java

Example 15: step

import org.apache.kafka.clients.producer.KafkaProducer; // import the package/class this method depends on
public static void step( int start
                       , int end
                       , DemoConfig config
                       , KafkaProducer<String, String> producer
                       , Long timeOffset
                       , Integer stepTimeMs
                       , Set<String> hostnameFilter
                       ) throws IOException, InterruptedException {
  List<SourceState> states = new ArrayList<>();
  for(DemoConfig.DataSource ds : config.getSources()) {
    BufferedReader reader = null;
    if(ds.getInputFile().endsWith(".gz")) {
      reader = new BufferedReader(new InputStreamReader(new GZIPInputStream(new FileInputStream(ds.getInputFile())), Charset.defaultCharset()));
    }
    else {
      reader = Files.newBufferedReader(new File(ds.getInputFile()).toPath(), Charset.defaultCharset());
    }
    states.add(new SourceState(ds, reader));
  }
  LocalImporter.Progress progress = new LocalImporter.Progress();
  try {
    for (int index = start; index <= end; ++index) {
      progress.update();
      boolean allDone = true;
      long sTime = System.currentTimeMillis();
      int numMessagesWritten = 0;
      for (SourceState s : states) {
        String fileName = toFileName(s.getSource().getInputFile());
        List<Map<String, Object>> messages = s.read(index);
        for (Map<String, Object> message : messages) {
          int timestamp = Integer.parseInt("" + message.get("timestamp"));
          message.put("time_offset", timestamp);
          message.put("source_file", fileName);
          message.put("timestamp", 1000L*timestamp + timeOffset);
          if(matchesFilter(s.getSource(), hostnameFilter, message)) {
            String jsonMap = JSONUtils.INSTANCE.toJSON(message, false);
            if(producer != null) {
              numMessagesWritten++;
              producer.send(new ProducerRecord<String, String>(s.getSource().getOutputTopic(), jsonMap));
            }
            else {
              System.out.println(jsonMap);
            }
          }
        }
        allDone &= s.allDone;
      }
      if(allDone) {
        break;
      }
      else if(numMessagesWritten > 0 && producer != null){
        long eTime = System.currentTimeMillis();
        long durationMs = eTime - sTime;
        if(durationMs < stepTimeMs) {
          long msSleeping = stepTimeMs - durationMs;
          Thread.sleep(msSleeping);
        }
      }
    }
  }
  finally {
    if(producer != null) {
      producer.close();
    }
  }
}
 
Developer: simonellistonball, Project: metron-field-demos, Lines of code: 67, Source: DemoLoader.java


Note: The org.apache.kafka.clients.producer.KafkaProducer.close method examples in this article were compiled by 纯净天空 from GitHub, MSDocs, and other open-source code and documentation platforms. The code snippets are selected from open-source projects contributed by many developers; copyright of the source code remains with the original authors, so please consult the corresponding project's License before distributing or using it. Do not reproduce this article without permission.