

Java GenericDatumReader.read Method Code Examples

This article collects typical usage examples of the Java method org.apache.avro.generic.GenericDatumReader.read. If you are wondering how GenericDatumReader.read is used in practice, or are looking for concrete examples of calling it, the curated samples below may help. You can also browse further usage examples of the enclosing class, org.apache.avro.generic.GenericDatumReader.


Ten code examples of GenericDatumReader.read are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code samples.
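
Before the project examples, here is a minimal, self-contained round-trip sketch of the basic pattern: serialize one GenericRecord with GenericDatumWriter, then read it back with GenericDatumReader.read. The schema, class name, and field names in this sketch are illustrative only and are not taken from any of the projects below.

import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class GenericDatumReaderReadSketch {
  public static void main(String[] args) throws Exception {
    Schema schema = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
            + "{\"name\":\"name\",\"type\":\"string\"},"
            + "{\"name\":\"age\",\"type\":\"int\"}]}");

    // Encode a record as raw Avro binary (no embedded schema).
    GenericRecord user = new GenericData.Record(schema);
    user.put("name", "alice");
    user.put("age", 30);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
    new GenericDatumWriter<GenericRecord>(schema).write(user, encoder);
    encoder.flush();

    // Decode it back; read(null, decoder) allocates a fresh record.
    GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
    Decoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
    GenericRecord decoded = reader.read(null, decoder);
    System.out.println(decoded); // prints {"name": "alice", "age": 30}
  }
}

Passing null as the first argument makes read allocate a new record; passing a previously decoded record instead lets Avro reuse it, as Example 2 below does.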

Example 1: processAvroMessage

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
/**
 * Processes an Avro Blob containing a single message, with no embedded
 * schema. This is the pattern when Avro objects are passed over messaging
 * infrastructure such as Apache Kafka.
 * 
 * @param avroMessage
 *            The Blob that holds the single Avro message object
 * @param avroKey
 *            The Blob that holds the single Avro key object (if passed)
 * @param outStream
 *            The stream to which the JSON string must be submitted
 * @param outTuple
 *            The tuple holding the JSON string
 * @param messageSchema
 *            The schema of the Avro message object
 * @param keySchema
 *            The schema of the Avro key object
 * @throws Exception
 */
private void processAvroMessage(Blob avroMessage, Blob avroKey, StreamingOutput<OutputTuple> outStream,
		OutputTuple outTuple, Schema messageSchema, Schema keySchema) throws Exception {
	// Deserialize message
	GenericDatumReader<GenericRecord> consumer = new GenericDatumReader<GenericRecord>(messageSchema);
	ByteArrayInputStream consumedByteArray = new ByteArrayInputStream(avroMessage.getData());
	Decoder consumedDecoder = DecoderFactory.get().binaryDecoder(consumedByteArray, null);
	GenericRecord consumedDatum = consumer.read(null, consumedDecoder);
	if (LOGGER.isTraceEnabled())
		LOGGER.log(TraceLevel.TRACE, "JSON representation of Avro message: " + consumedDatum.toString());
	outTuple.setString(outputJsonMessage, consumedDatum.toString());
	// Deserialize key (if specified)
	if (avroKey != null) {
		consumer = new GenericDatumReader<GenericRecord>(keySchema);
		consumedByteArray = new ByteArrayInputStream(avroKey.getData());
		consumedDecoder = DecoderFactory.get().binaryDecoder(consumedByteArray, null);
		consumedDatum = consumer.read(null, consumedDecoder);
		if (LOGGER.isTraceEnabled())
			LOGGER.log(TraceLevel.TRACE, "JSON representation of Avro key: " + consumedDatum.toString());
		if (outputJsonKey != null)
			outTuple.setString(outputJsonKey, consumedDatum.toString());
	}
	// Submit new tuple to output port 0
	outStream.submit(outTuple);
}
 
Developer ID: IBMStreams, Project: streamsx.avro, Lines of code: 44, Source file: AvroToJSON.java

Example 2: fromBytes

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
public static MapOutputValue fromBytes(byte[] bytes,
    Map<String, Schema> schemaMap) throws IOException {
  DataInputStream dataInputStream = new DataInputStream(
      new ByteArrayInputStream(bytes));
  // Payload layout: schema-name length, schema-name bytes, record length, record bytes
  int length = dataInputStream.readInt();
  byte[] sourceNameBytes = new byte[length];
  dataInputStream.readFully(sourceNameBytes);
  String schemaName = new String(sourceNameBytes);

  int recordDataLength = dataInputStream.readInt();

  byte[] recordBytes = new byte[recordDataLength];
  dataInputStream.readFully(recordBytes);
  Schema schema = schemaMap.get(schemaName);
  GenericRecord record = new GenericData.Record(schema);
  // binaryDecoder is a reusable field of this class; passing it back in avoids reallocation
  binaryDecoder = DecoderFactory.get().binaryDecoder(recordBytes,
      binaryDecoder);
  GenericDatumReader<GenericRecord> gdr = new GenericDatumReader<GenericRecord>(
      schema);
  gdr.read(record, binaryDecoder);
  return new MapOutputValue(schemaName, record);
}
 
Developer ID: Hanmourang, Project: Pinot, Lines of code: 23, Source file: MapOutputValue.java

Example 3: testEnums

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
@Test
public void testEnums() throws Exception {
  Schema schema = Enums.SCHEMA$;

  String avroJson = "{\"enum1\": \"X\", \"enum2\": {\"test.Enum2\": \"A\"}, \"enum3\": {\"null\": null}, \"enum4\": [{\"test.Enum4\": \"SAT\"}, {\"test.Enum4\": \"SUN\"}]}";
  Decoder decoder = DecoderFactory.get().jsonDecoder(schema, avroJson);
  GenericDatumReader<Record> reader = new GenericDatumReader<Record>(schema);
  Record record1 = reader.read(null, decoder);

  String mongoJson = "{\"enum1\": \"X\", \"enum2\": \"A\", \"enum3\": null, \"enum4\": [\"SAT\", \"SUN\"]}";
  BSONObject object = (BSONObject) JSON.parse(mongoJson);
  Record record2 = RecordConverter.toRecord(schema, object, getClass().getClassLoader());

  assertThat(record2, is(record1));
  assertThat(AvroHelper.toSimpleJson(schema, record2), is(AvroHelper.toSimpleJson(schema, record1)));
}
 
Developer ID: tfeng, Project: toolbox, Lines of code: 17, Source file: TestDocumentDecoder.java

Example 4: testUnions

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
@Test
public void testUnions() throws Exception {
  Schema schema = Unions.SCHEMA$;

  String avroJson = "{\"union1\": {\"int\": 1}, \"union2\": {\"test.Union2\": {\"union21\": {\"long\": 2}}}, \"union3\": {\"array\": [{\"boolean\": true}, {\"boolean\": false}, {\"null\": null}]}, \"union4\": {\"map\": {\"a\": {\"string\": \"A\"}, \"b\": {\"string\": \"B\"}, \"c\": {\"string\": \"C\"}}}, \"union5\": {\"null\": null}, \"union6\": {\"null\": null}}";
  Decoder decoder = DecoderFactory.get().jsonDecoder(schema, avroJson);
  GenericDatumReader<Record> reader = new GenericDatumReader<Record>(schema);
  Record record1 = reader.read(null, decoder);

  String mongoJson = "{\"union1\": 1, \"union2\": {\"union21\": 2}, \"union3\": [true, false, null], \"union4\": {\"a\": \"A\", \"b\": \"B\", \"c\": \"C\"}, \"union5\": null, \"union6\": null}";
  DBObject object = (DBObject) JSON.parse(mongoJson);
  Record record2 = RecordConverter.toRecord(schema, object, getClass().getClassLoader());

  assertThat(record2, is(record1));
  assertThat(AvroHelper.toSimpleJson(schema, record2), is(AvroHelper.toSimpleJson(schema, record1)));
}
 
Developer ID: tfeng, Project: toolbox, Lines of code: 17, Source file: TestDocumentDecoder.java

Example 5: toRecords

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
private static List<GenericRecord> toRecords(String inputJson,
		Schema writerSchema, Schema readerSchema) throws IOException {
	HackedJsonDecoder jsonDecoder = new HackedJsonDecoder(writerSchema,
			inputJson);
	GenericDatumReader<GenericRecord> reader = 
			new GenericDatumReader<GenericRecord>(
					writerSchema, readerSchema);
	List<GenericRecord> records = new ArrayList<GenericRecord>();
	while (true) {
		try {
			GenericRecord record = reader.read(null, jsonDecoder);
			records.add(record);
		} catch (EOFException e) {
			break;
		}
	}
	return records;
}
 
Developer ID: openaire, Project: iis, Lines of code: 19, Source file: JsonCodersTest.java

Example 6: toObject

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
public Object toObject(byte[] bytes) {
    Object object = null;
    GenericDatumReader<Object> reader = null;
    try {
        Decoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
        reader = new GenericDatumReader<Object>(schema);
        object = reader.read(null, decoder);
    } catch (Exception e) {
        throw new SerializationException("An exception was thrown during Avro generic deserialization", e);
    }
    return object;
}
 
Developer ID: keedio, Project: flume-enrichment-interceptor-skeleton, Lines of code: 13, Source file: AVROGenericSerializer.java

Example 7: process

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
/**
   * Process messages from Kafka.
   *
   * @param msg The received message as a byte array.
   */
  public void process(byte[] msg) {
    //each message must start with one byte identifying the Avro schema to decode it
    byte schemaId = msg[0];
    //using the schema id we get the actual schema to decode with
    Schema schema = repo.schemaFor(schemaId);
    GenericDatumReader<GenericRecord> reader = repo.readerFor(schemaId);
    //TODO: test for handling of invalid id values (not in repo)
    //each message is assumed to have a header and body with specific fields
    Schema.Field headerField = schema.getField("header");
    Schema headerSchema = headerField.schema();
    List<Schema.Field> headerFields = headerSchema.getFields();
    Schema.Field bodyField = schema.getField("body");
    Schema bodySchema = bodyField.schema();
    List<Schema.Field> bodyFields = bodySchema.getFields();

    Decoder d = DecoderFactory.get().binaryDecoder(msg, 1, msg.length - 1, null);
    try {
      GenericRecord record = reader.read(null, d);
      GenericRecord header = (GenericRecord)record.get("header");
      GenericRecord body = (GenericRecord)record.get("body");

      multiPoint(header, body, headerFields, bodyFields);
//      log.trace("Stored msg:"+record);
    } catch (Exception e) {
      log.error("Error while processing received Kafka msg. Skipping this msg:"+ Arrays.toString(msg), e);
    }
  }
 
Developer ID: mukatee, Project: kafka-consumer, Lines of code: 33, Source file: InFluxAvroConsumer.java

Example 8: process

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
/**
   * Process a binary message from Kafka.
   *
   * @param msg The Kafka message.
   */
  public void process(byte[] msg) {
    //always starts with the Avro schema id for decoding
    byte schemaId = msg[0];
    //get the matching schema to decode with
    Schema schema = repo.schemaFor(schemaId);
    GenericDatumReader<GenericRecord> reader = repo.readerFor(schemaId);
    //TODO: test for handling of invalid id values (not in repo)
    Schema.Field headerField = schema.getField("header");
    Schema headerSchema = headerField.schema();
    List<Schema.Field> headerFields = headerSchema.getFields();
    Schema.Field bodyField = schema.getField("body");
    Schema bodySchema = bodyField.schema();
    List<Schema.Field> bodyFields = bodySchema.getFields();

    Decoder d = DecoderFactory.get().binaryDecoder(msg, 1, msg.length - 1, null);
    try {
      GenericRecord record = reader.read(null, d);
      GenericRecord header = (GenericRecord) record.get("header");
      GenericRecord body = (GenericRecord) record.get("body");

      //store the decoded message into Cassandra
      store(schemaId, header, body, headerFields, bodyFields);
//      log.trace("Stored msg:"+record);
    } catch (IOException e) {
      log.error("Error while processing received Kafka msg. Skipping this msg:" + Arrays.toString(msg), e);
    }
  }
 
Developer ID: mukatee, Project: kafka-consumer, Lines of code: 33, Source file: CassandaAvroConsumer.java

Example 9: genericRecordFromBytes

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
private static <T> T genericRecordFromBytes(byte[] bytes, Schema schema) throws IOException {
  BinaryDecoder binDecoder = DecoderFactory.defaultFactory().createBinaryDecoder(bytes, null);
  GenericDatumReader<T> reader = new GenericDatumReader<>(schema);
  return reader.read(null, binDecoder);
}
 
Developer ID: srinipunuru, Project: samza-sql-tools, Lines of code: 6, Source file: AvroSerDeFactory.java

Example 10: toRecord

import org.apache.avro.generic.GenericDatumReader; // import the package/class this method depends on
public static Record toRecord(Schema schema, BSONObject object, ClassLoader classLoader) throws IOException {
  GenericDatumReader<Record> reader = new GenericDatumReader<>(schema);
  return reader.read(null, new DocumentDecoder(schema, object, classLoader));
}
 
Developer ID: tfeng, Project: toolbox, Lines of code: 5, Source file: RecordConverter.java


Note: The org.apache.avro.generic.GenericDatumReader.read examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective authors, who retain copyright of the source code; consult each project's license before distributing or using the code. Do not reproduce this article without permission.