

Java KafkaAvroSerializer Class Code Examples

This article collects typical usage examples of the Java class io.confluent.kafka.serializers.KafkaAvroSerializer. If you are wondering what the KafkaAvroSerializer class does, how to use it, or want to see it in context, the curated examples below should help.


The KafkaAvroSerializer class belongs to the io.confluent.kafka.serializers package. The 15 code examples below demonstrate its use, sorted by popularity by default.
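Across the examples below, the recurring pattern is registering KafkaAvroSerializer as the producer's key and/or value serializer and pointing it at a Schema Registry via the `schema.registry.url` property. A minimal sketch of that configuration is shown here; the broker and registry URLs are placeholder assumptions, and the serializer class name is kept as a string so the snippet compiles without the Confluent jars on the classpath:

```java
import java.util.Properties;

public class AvroProducerConfig {
    // Fully qualified name of the Confluent serializer, as used in the examples below.
    static final String KAFKA_AVRO_SERIALIZER =
            "io.confluent.kafka.serializers.KafkaAvroSerializer";

    // Build producer properties wiring KafkaAvroSerializer for both key and value.
    static Properties buildProducerProps(String brokers, String schemaRegistryUrl) {
        Properties props = new Properties();
        props.put("bootstrap.servers", brokers);
        props.put("key.serializer", KAFKA_AVRO_SERIALIZER);
        props.put("value.serializer", KAFKA_AVRO_SERIALIZER);
        // KafkaAvroSerializer reads this property to locate the Schema Registry.
        props.put("schema.registry.url", schemaRegistryUrl);
        return props;
    }

    public static void main(String[] args) {
        // Placeholder endpoints; substitute your own cluster and registry.
        Properties p = buildProducerProps("localhost:9092", "http://localhost:8081");
        System.out.println(p.getProperty("value.serializer"));
    }
}
```

These properties would then be passed to `new KafkaProducer<>(props)`, as several of the examples below do.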

Example 1: KsqlGenericRowAvroSerializer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
public KsqlGenericRowAvroSerializer(org.apache.kafka.connect.data.Schema schema,
                                    SchemaRegistryClient schemaRegistryClient,
                                    KsqlConfig ksqlConfig) {
  String avroSchemaStr = SchemaUtil.buildAvroSchema(schema, "avro_schema");
  Schema.Parser parser = new Schema.Parser();
  avroSchema = parser.parse(avroSchemaStr);
  fields = avroSchema.getFields();

  Map<String, Object> map = new HashMap<>();

  // Automatically register the schema in the Schema Registry if it has not been registered.
  map.put(AbstractKafkaAvroSerDeConfig.AUTO_REGISTER_SCHEMAS, true);
  map.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, ksqlConfig.getString(KsqlConfig.SCHEMA_REGISTRY_URL_PROPERTY));
  kafkaAvroSerializer = new KafkaAvroSerializer(schemaRegistryClient, map);

}
 
Author: confluentinc, Project: ksql, Lines: 17, Source: KsqlGenericRowAvroSerializer.java

Example 2: getSerializedRow

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
private byte[] getSerializedRow(String topicName, SchemaRegistryClient schemaRegistryClient,
                                Schema rowAvroSchema, GenericRow genericRow) {
  Map<String, Object> map = new HashMap<>();
  // Automatically register the schema in the Schema Registry if it has not been registered.
  map.put(AbstractKafkaAvroSerDeConfig.AUTO_REGISTER_SCHEMAS, true);
  map.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "");
  KafkaAvroSerializer kafkaAvroSerializer = new KafkaAvroSerializer(schemaRegistryClient, map);
  GenericRecord avroRecord = new GenericData.Record(rowAvroSchema);
  List<Schema.Field> fields = rowAvroSchema.getFields();
  for (int i = 0; i < genericRow.getColumns().size(); i++) {
    if (fields.get(i).schema().getType() == Schema.Type.ARRAY) {
      avroRecord.put(fields.get(i).name(), Arrays.asList((Object[]) genericRow.getColumns().get(i)));
    } else {
      avroRecord.put(fields.get(i).name(), genericRow.getColumns().get(i));
    }
  }

  return kafkaAvroSerializer.serialize(topicName, avroRecord);
}
 
Author: confluentinc, Project: ksql, Lines: 22, Source: KsqlGenericRowAvroDeserializerTest.java

Example 3: configureKafkaProducer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
@Override public KafkaProducer configureKafkaProducer(Properties producerProps) {
  String schemaRegistryUrl;
  Integer identityMapCapacity;

  if (producerProps.containsKey(KAFKA_SCHEMA_REGISTRY_URL_FIELD)) {
    schemaRegistryUrl = (String) producerProps.get(KAFKA_SCHEMA_REGISTRY_URL_FIELD);
  }
  else {
    throw new IllegalArgumentException("Field " + KAFKA_SCHEMA_REGISTRY_URL_FIELD + " required.");
  }

  if (producerProps.containsKey(KAFKA_SCHEMA_REGISTRY_IDENTITY_MAP_CAPACITY_FIELD)) {
    identityMapCapacity = (Integer) producerProps.get(KAFKA_SCHEMA_REGISTRY_IDENTITY_MAP_CAPACITY_FIELD);
  }
  else {
    identityMapCapacity = 100;
  }

  CachedSchemaRegistryClient client = new CachedSchemaRegistryClient(schemaRegistryUrl, identityMapCapacity);
  KafkaAvroSerializer serializer = new KafkaAvroSerializer(client);
  return new KafkaProducer<Object, Object>(producerProps, serializer, serializer);
}
 
Author: verisign, Project: storm-graphite, Lines: 23, Source: SchemaRegistryKafkaReporter.java

Example 4: activateOptions

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
@Override
public void activateOptions() {
    if (this.brokerList == null)
        throw new ConfigException("The bootstrap servers property is required");

    if (this.schemaRegistryUrl == null)
        throw new ConfigException("The schema registry url is required");

    if (this.topic == null)
        throw new ConfigException("Topic is required");

    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerList);
    props.put("schema.registry.url", this.schemaRegistryUrl);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);

    this.producer = new KafkaProducer<String, IndexedRecord>(props);
}
 
Author: elodina, Project: java-kafka, Lines: 20, Source: KafkaLogAppender.java

Example 5: testConfluentSerDes

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
@Test
public void testConfluentSerDes() throws Exception {

    org.apache.avro.Schema schema = new org.apache.avro.Schema.Parser().parse(GENERIC_TEST_RECORD_SCHEMA);
    GenericRecord record = new GenericRecordBuilder(schema).set("field1", "some value").set("field2", "some other value").build();

    Map<String, Object> config = new HashMap<>();
    config.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, rootTarget.getUri().toString());

    KafkaAvroSerializer kafkaAvroSerializer = new KafkaAvroSerializer();
    kafkaAvroSerializer.configure(config, false);
    byte[] bytes = kafkaAvroSerializer.serialize("topic", record);

    KafkaAvroDeserializer kafkaAvroDeserializer = new KafkaAvroDeserializer();
    kafkaAvroDeserializer.configure(config, false);

    GenericRecord result = (GenericRecord) kafkaAvroDeserializer.deserialize("topic", bytes);
    LOG.info(result.toString());
}
 
Author: hortonworks, Project: registry, Lines: 20, Source: ConfluentRegistryCompatibleResourceTest.java

Example 6: getAvroProducer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
private static Producer<Object, Object> getAvroProducer(String brokers, String schemaregistry) {
  System.out.println("Starting [AvroProducer] with brokers=[" + brokers + "] and schema-registry=[" + schemaregistry + "]");
  Properties producerProps = new Properties();
  producerProps.put("bootstrap.servers", brokers);
  producerProps.put("acks", "all");
  producerProps.put("key.serializer", KafkaAvroSerializer.class.getName());
  producerProps.put("value.serializer", KafkaAvroSerializer.class.getName());
  producerProps.put("linger.ms", "10"); // batch records for up to 10 ms before sending
  producerProps.put("schema.registry.url", schemaregistry);
  return new KafkaProducer<>(producerProps);
}
 
Author: Landoop, Project: landoop-avro-generator, Lines: 12, Source: SimpleAvroProducer.java

Example 7: getStringAvroProducer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
private static Producer<String, Object> getStringAvroProducer(String brokers, String schemaregistry) {
  System.out.println("Starting [AvroProducer] with brokers=[" + brokers + "] and schema-registry=[" + schemaregistry + "]");
  Properties producerProps = new Properties();
  producerProps.put("bootstrap.servers", brokers);
  producerProps.put("acks", "all");
  producerProps.put("key.serializer", StringSerializer.class.getName());
  producerProps.put("value.serializer", KafkaAvroSerializer.class.getName());
  producerProps.put("linger.ms", "10"); // batch records for up to 10 ms before sending
  producerProps.put("schema.registry.url", schemaregistry);
  return new KafkaProducer<>(producerProps);
}
 
Author: Landoop, Project: landoop-avro-generator, Lines: 12, Source: SimpleTextAvroProducer.java

Example 8: build

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
public KafkaReporter build() {
    Properties props = producerProperties == null ? new Properties() : producerProperties;
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokerList);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    props.put("schema.registry.url", schemaRegistryUrl);

    return new KafkaReporter(registry, name, filter, rateUnit, durationUnit, kafkaTopic, props);
}
 
Author: elodina, Project: java-kafka, Lines: 10, Source: KafkaReporter.java

Example 9: KafkaReporter

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
public KafkaReporter(MetricsRegistry metricsRegistry, Properties producerProperties, String topic) {
    super(metricsRegistry, "kafka-topic-reporter");

    producerProperties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    producerProperties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);

    this.producer = new KafkaProducer<String, IndexedRecord>(producerProperties);
    this.converter = new JsonToAvroConverter();
    this.topic = topic;

}
 
Author: elodina, Project: java-kafka, Lines: 12, Source: KafkaReporter.java

Example 10: testConfluentAvroDeserializer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
@Test
public void testConfluentAvroDeserializer() throws IOException, RestClientException {
  WorkUnitState mockWorkUnitState = getMockWorkUnitState();
  mockWorkUnitState.setProp("schema.registry.url", TEST_URL);

  Schema schema = SchemaBuilder.record(TEST_RECORD_NAME)
      .namespace(TEST_NAMESPACE).fields()
      .name(TEST_FIELD_NAME).type().stringType().noDefault()
      .endRecord();

  GenericRecord testGenericRecord = new GenericRecordBuilder(schema).set(TEST_FIELD_NAME, "testValue").build();

  SchemaRegistryClient mockSchemaRegistryClient = mock(SchemaRegistryClient.class);
  when(mockSchemaRegistryClient.getByID(any(Integer.class))).thenReturn(schema);

  Serializer<Object> kafkaEncoder = new KafkaAvroSerializer(mockSchemaRegistryClient);
  Deserializer<Object> kafkaDecoder = new KafkaAvroDeserializer(mockSchemaRegistryClient);

  ByteBuffer testGenericRecordByteBuffer =
      ByteBuffer.wrap(kafkaEncoder.serialize(TEST_TOPIC_NAME, testGenericRecord));

  KafkaSchemaRegistry<Integer, Schema> mockKafkaSchemaRegistry = mock(KafkaSchemaRegistry.class);
  KafkaDeserializerExtractor kafkaDecoderExtractor =
      new KafkaDeserializerExtractor(mockWorkUnitState,
          Optional.fromNullable(Deserializers.CONFLUENT_AVRO), kafkaDecoder, mockKafkaSchemaRegistry);

  ByteArrayBasedKafkaRecord mockMessageAndOffset = getMockMessageAndOffset(testGenericRecordByteBuffer);

  Assert.assertEquals(kafkaDecoderExtractor.decodeRecord(mockMessageAndOffset), testGenericRecord);
}
 
Author: apache, Project: incubator-gobblin, Lines: 31, Source: KafkaDeserializerExtractorTest.java

Example 11: testConfluentAvroDeserializerForSchemaEvolution

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
@Test
public void testConfluentAvroDeserializerForSchemaEvolution() throws IOException, RestClientException, SchemaRegistryException {
  WorkUnitState mockWorkUnitState = getMockWorkUnitState();
  mockWorkUnitState.setProp("schema.registry.url", TEST_URL);

  Schema schemaV1 = SchemaBuilder.record(TEST_RECORD_NAME)
      .namespace(TEST_NAMESPACE).fields()
      .name(TEST_FIELD_NAME).type().stringType().noDefault()
      .endRecord();

  Schema schemaV2 = SchemaBuilder.record(TEST_RECORD_NAME)
      .namespace(TEST_NAMESPACE).fields()
      .name(TEST_FIELD_NAME).type().stringType().noDefault()
      .optionalString(TEST_FIELD_NAME2).endRecord();

  GenericRecord testGenericRecord = new GenericRecordBuilder(schemaV1).set(TEST_FIELD_NAME, "testValue").build();

  SchemaRegistryClient mockSchemaRegistryClient = mock(SchemaRegistryClient.class);
  when(mockSchemaRegistryClient.getByID(any(Integer.class))).thenReturn(schemaV1);

  Serializer<Object> kafkaEncoder = new KafkaAvroSerializer(mockSchemaRegistryClient);
  Deserializer<Object> kafkaDecoder = new KafkaAvroDeserializer(mockSchemaRegistryClient);

  ByteBuffer testGenericRecordByteBuffer =
      ByteBuffer.wrap(kafkaEncoder.serialize(TEST_TOPIC_NAME, testGenericRecord));

  KafkaSchemaRegistry<Integer, Schema> mockKafkaSchemaRegistry = mock(KafkaSchemaRegistry.class);
  when(mockKafkaSchemaRegistry.getLatestSchemaByTopic(TEST_TOPIC_NAME)).thenReturn(schemaV2);

  KafkaDeserializerExtractor kafkaDecoderExtractor = new KafkaDeserializerExtractor(mockWorkUnitState,
      Optional.fromNullable(Deserializers.CONFLUENT_AVRO), kafkaDecoder, mockKafkaSchemaRegistry);
  when(kafkaDecoderExtractor.getSchema()).thenReturn(schemaV2);

  ByteArrayBasedKafkaRecord mockMessageAndOffset = getMockMessageAndOffset(testGenericRecordByteBuffer);

  GenericRecord received = (GenericRecord) kafkaDecoderExtractor.decodeRecord(mockMessageAndOffset);
  Assert.assertEquals(received.toString(), "{\"testField\": \"testValue\", \"testField2\": null}");

}
 
Author: apache, Project: incubator-gobblin, Lines: 40, Source: KafkaDeserializerExtractorTest.java

Example 12: initKafka

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
private void initKafka() {
    schemaRegistryClient = new MockSchemaRegistryClient();
    kafkaAvroDecoder = new KafkaAvroDecoder(schemaRegistryClient);
    Properties defaultConfig = new Properties();
    defaultConfig.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "bogus");
    avroSerializer = new KafkaAvroSerializer(schemaRegistryClient);
}
 
Author: pinterest, Project: secor, Lines: 8, Source: SecorSchemaRegistryClientTest.java

Example 13: SpecificAvroSerializer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
/**
 * Constructor used by Kafka Streams.
 */
public SpecificAvroSerializer() {
    inner = new KafkaAvroSerializer();
}
 
Author: jeqo, Project: talk-kafka-messaging-logs, Lines: 7, Source: SpecificAvroSerializer.java

Example 14: GenericAvroSerializer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
/**
 * Constructor used by Kafka Streams.
 */
public GenericAvroSerializer() {
    inner = new KafkaAvroSerializer();
}
 
Author: jeqo, Project: talk-kafka-messaging-logs, Lines: 7, Source: GenericAvroSerializer.java

Example 15: GenericAvroSerializer

import io.confluent.kafka.serializers.KafkaAvroSerializer; // import the required class
public GenericAvroSerializer(SchemaRegistryClient client) {
    inner = new KafkaAvroSerializer(client);
}
 
Author: confluentinc, Project: strata-tutorials, Lines: 4, Source: GenericAvroSerializer.java


Note: The io.confluent.kafka.serializers.KafkaAvroSerializer examples in this article were collected from open-source projects on GitHub and other code-hosting platforms. Copyright of each snippet remains with its original authors; consult the corresponding project's License before redistributing or reusing the code.