

Java GenericData Class Code Examples

This article collects typical usage examples of the Java class org.apache.avro.generic.GenericData. If you have been wondering what the GenericData class is for, or how to use it, the curated class examples below should help.


The GenericData class lives in the org.apache.avro.generic package. Fifteen code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the site recommend better Java code examples.
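Before the collected examples, here is a minimal, self-contained sketch of what GenericData.Record is for: a schema-driven record whose fields are set and read by name, with no generated classes required. The `Member` schema and its field names below are hypothetical, invented purely for illustration.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class GenericDataDemo {
    // Hypothetical inline schema, used only for this sketch.
    static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"Member\",\"fields\":["
      + "{\"name\":\"userName\",\"type\":\"string\"},"
      + "{\"name\":\"age\",\"type\":\"int\"}]}";

    // Builds a GenericData.Record against the parsed schema.
    static GenericRecord buildMember() {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        GenericRecord record = new GenericData.Record(schema);
        record.put("userName", "rita");
        record.put("age", 30);
        return record;
    }

    public static void main(String[] args) {
        GenericRecord record = buildMember();
        // Fields are read back by name.
        System.out.println(record.get("userName"));
        // validate(...) checks a datum against its schema.
        System.out.println(GenericData.get().validate(record.getSchema(), record));
    }
}
```

Most of the examples below follow this same pattern: parse or obtain a Schema, wrap it in a GenericData.Record, then put/get fields by name.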

Example 1: readAvroFile

import org.apache.avro.generic.GenericData; // import the required package/class
/**
 * Reads in binary Avro-encoded entities using the schema stored in the file
 * and prints them out.
 */
public static void readAvroFile(File file) throws IOException {
	GenericDatumReader<GenericData.Record> datum = new GenericDatumReader<>();
	DataFileReader<GenericData.Record> reader = new DataFileReader<>(file, datum);

	GenericData.Record record = new GenericData.Record(reader.getSchema());
	while (reader.hasNext()) {
		reader.next(record);
		System.out.println("Name " + record.get("name") + " on "
				+ record.get("Meetup_date") + " attending "
				+ record.get("going") + " organized by "
				+ record.get("organizer") + " on " + record.get("topics"));
	}

	reader.close();
}
 
Author: airisdata | Project: avroparquet | Lines: 20 | Source: StorageFormatUtils.java
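Example 1 relies on the schema being embedded in the file header, which is how Avro container files work. For context, the sketch below shows the writing side that would produce such a file; the `Meetup` schema here is hypothetical, reconstructed from the field names the example reads.

```java
import org.apache.avro.Schema;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import java.io.File;
import java.io.IOException;
import java.util.Arrays;

public class MeetupFileWriter {
    // Hypothetical schema matching the fields read in Example 1.
    static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"Meetup\",\"fields\":["
      + "{\"name\":\"name\",\"type\":\"string\"},"
      + "{\"name\":\"Meetup_date\",\"type\":\"string\"},"
      + "{\"name\":\"going\",\"type\":\"int\"},"
      + "{\"name\":\"organizer\",\"type\":\"string\"},"
      + "{\"name\":\"topics\",\"type\":{\"type\":\"array\",\"items\":\"string\"}}]}";

    public static File writeSample() throws IOException {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        File file = File.createTempFile("meetup-", ".avro");
        try (DataFileWriter<GenericRecord> writer =
                 new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
            writer.create(schema, file); // embeds the schema in the file header
            GenericRecord rec = new GenericData.Record(schema);
            rec.put("name", "rita");
            rec.put("Meetup_date", "2016-01-01");
            rec.put("going", 25);
            rec.put("organizer", "bob");
            rec.put("topics", Arrays.asList("avro", "parquet"));
            writer.append(rec);
        }
        return file;
    }

    public static void main(String[] args) throws IOException {
        File f = writeSample();
        // A reader like Example 1's readAvroFile can now consume this file.
        try (DataFileReader<GenericRecord> reader =
                 new DataFileReader<>(f, new GenericDatumReader<GenericRecord>())) {
            System.out.println(reader.next().get("name"));
        }
    }
}
```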

Example 2: write

import org.apache.avro.generic.GenericData; // import the required package/class
@Override
public Object write( final Object obj ) throws IOException{
  GenericRecord record = new GenericData.Record( avroSchema );
  if( ! ( obj instanceof Map ) ){
    return record;
  }

  Map<Object,Object> mapObj = (Map<Object,Object>)obj;

  for( KeyAndFormatter childFormatter : childContainer ){
    childFormatter.clear();
    record.put( childFormatter.getName() , childFormatter.get( mapObj ) );
  }

  return record;
}
 
Author: yahoojapan | Project: dataplatform-schema-lib | Lines: 17 | Source: AvroRecordFormatter.java

Example 3: updateTopicProcessor

import org.apache.avro.generic.GenericData; // import the required package/class
/**
 * Update the processor class, method, and DB writer for each topic.
 */
public void updateTopicProcessor() {
  for (String topic : _topics.keySet()) {
    try {
      // get the processor class and method
      final Class<?> processorClass = Class.forName(_topics.get(topic).processor);
      _topicProcessorClass.put(topic, processorClass.newInstance());

      final Method method = processorClass.getDeclaredMethod("process", GenericData.Record.class, String.class);
      _topicProcessorMethod.put(topic, method);

      // get the database writer
      final DatabaseWriter dw = new DatabaseWriter(JdbcUtil.wherehowsJdbcTemplate, _topics.get(topic).dbTable);
      _topicDbWriter.put(topic, dw);
    } catch (Exception e) {
      Logger.error("Failed to create processor for topic: " + topic, e);
      _topicProcessorClass.remove(topic);
      _topicProcessorMethod.remove(topic);
      _topicDbWriter.remove(topic);
    }
  }
}
 
Author: thomas-young-2013 | Project: wherehowsX | Lines: 25 | Source: KafkaConfig.java

Example 4: process

import org.apache.avro.generic.GenericData; // import the required package/class
/**
 * Process a Gobblin tracking event audit record
 * @param record
 * @param topic
 * @return null
 * @throws Exception
 */
public Record process(GenericData.Record record, String topic)
    throws Exception {

  if (record != null && record.get("name") != null) {
    final String name = record.get("name").toString();
    // only handle "DaliLimitedRetentionAuditor","DaliAutoPurgeAuditor" and "DsIgnoreIDPCAuditor"
    if (name.equals(DALI_LIMITED_RETENTION_AUDITOR)
        || name.equals(DALI_AUTOPURGED_AUDITOR)
        || name.equals(DS_IGNORE_IDPC_AUDITOR)) {
      Long timestamp = (Long) record.get("timestamp");
      Map<String, String> metadata = StringUtil.convertObjectMapToStringMap(record.get("metadata"));

      String hasError = metadata.get("HasError");
      // equalsIgnoreCase on the constant avoids an NPE when "HasError" is absent
      if (!"true".equalsIgnoreCase(hasError)) {
        String datasetPath = metadata.get("DatasetPath");
        String datasetUrn = DATASET_URN_PREFIX + (datasetPath.startsWith("/") ? "" : "/") + datasetPath;
        String ownerUrns = metadata.get("OwnerURNs");
        DatasetInfoDao.updateKafkaDatasetOwner(datasetUrn, ownerUrns, DATASET_OWNER_SOURCE, timestamp);
      }
    }
  }
  return null;
}
 
Author: thomas-young-2013 | Project: wherehowsX | Lines: 31 | Source: GobblinTrackingAuditProcessor.java

Example 5: createDataFile

import org.apache.avro.generic.GenericData; // import the required package/class
private static Path createDataFile() throws IOException {
    File avroFile = File.createTempFile("test-", "." + FILE_EXTENSION);
    DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
    try (DataFileWriter<GenericRecord> dataFileWriter = new DataFileWriter<>(writer)) {
        dataFileWriter.setFlushOnEveryBlock(true);
        dataFileWriter.setSyncInterval(32);
        dataFileWriter.create(schema, avroFile);

        IntStream.range(0, NUM_RECORDS).forEach(index -> {
            GenericRecord datum = new GenericData.Record(schema);
            datum.put(FIELD_INDEX, index);
            datum.put(FIELD_NAME, String.format("%d_name_%s", index, UUID.randomUUID()));
            datum.put(FIELD_SURNAME, String.format("%d_surname_%s", index, UUID.randomUUID()));
            try {
                OFFSETS_BY_INDEX.put(index, dataFileWriter.sync() - 16L);
                dataFileWriter.append(datum);
            } catch (IOException ioe) {
                throw new RuntimeException(ioe);
            }
        });
    }
    Path path = new Path(new Path(fsUri), avroFile.getName());
    fs.moveFromLocalFile(new Path(avroFile.getAbsolutePath()), path);
    return path;
}
 
Author: mmolimar | Project: kafka-connect-fs | Lines: 26 | Source: AvroFileReaderTest.java

Example 6: testIncompatibleSchemas

import org.apache.avro.generic.GenericData; // import the required package/class
@Test
public void testIncompatibleSchemas() throws EventDeliveryException {
  final DatasetSink sink = sink(in, config);

  GenericRecordBuilder builder = new GenericRecordBuilder(
      INCOMPATIBLE_SCHEMA);
  GenericData.Record rec = builder.set("username", "koala").build();
  putToChannel(in, event(rec, INCOMPATIBLE_SCHEMA, null, false));

  // run the sink
  sink.start();
  assertThrows("Should fail", EventDeliveryException.class,
      new Callable() {
        @Override
        public Object call() throws EventDeliveryException {
          sink.process();
          return null;
        }
      });
  sink.stop();

  Assert.assertEquals("Should have rolled back",
      expected.size() + 1, remaining(in));
}
 
Author: moueimei | Project: flume-release-1.7.0 | Lines: 25 | Source: TestDatasetSink.java

Example 7: createParquetFile

import org.apache.avro.generic.GenericData; // import the required package/class
/**
 * Create a data file that gets exported to the db.
 * @param fileNum the number of the file (for multi-file export)
 * @param numRecords how many records to write to the file.
 */
protected void createParquetFile(int fileNum, int numRecords,
    ColumnGenerator... extraCols) throws IOException {

  String uri = "dataset:file:" + getTablePath();
  Schema schema = buildSchema(extraCols);
  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
    .schema(schema)
    .format(Formats.PARQUET)
    .build();
  Dataset dataset = Datasets.create(uri, descriptor);
  DatasetWriter writer = dataset.newWriter();
  try {
    for (int i = 0; i < numRecords; i++) {
      GenericRecord record = new GenericData.Record(schema);
      record.put("id", i);
      record.put("msg", getMsgPrefix() + i);
      addExtraColumns(record, i, extraCols);
      writer.write(record);
    }
  } finally {
    writer.close();
  }
}
 
Author: aliyun | Project: aliyun-maxcompute-data-collectors | Lines: 29 | Source: TestParquetExport.java

Example 8: testParquetRecordsNotSupported

import org.apache.avro.generic.GenericData; // import the required package/class
public void testParquetRecordsNotSupported() throws IOException, SQLException {
  String[] argv = {};
  final int TOTAL_RECORDS = 1;

  Schema schema =  Schema.createRecord("nestedrecord", null, null, false);
  schema.setFields(Lists.newArrayList(buildField("myint",
      Schema.Type.INT)));
  GenericRecord record = new GenericData.Record(schema);
  record.put("myint", 100);
  // DB type is not used so can be anything:
  ColumnGenerator gen = colGenerator(record, schema, null, "VARCHAR(64)");
  createParquetFile(0, TOTAL_RECORDS,  gen);
  createTable(gen);
  try {
    runExport(getArgv(true, 10, 10, newStrArray(argv, "-m", "" + 1)));
    fail("Parquet records can not be exported.");
  } catch (Exception e) {
    // expected
    assertTrue(true);
  }
}
 
Author: aliyun | Project: aliyun-maxcompute-data-collectors | Lines: 22 | Source: TestParquetExport.java

Example 9: testAvroRecordsNotSupported

import org.apache.avro.generic.GenericData; // import the required package/class
public void testAvroRecordsNotSupported() throws IOException, SQLException {
  String[] argv = {};
  final int TOTAL_RECORDS = 1;

  Schema schema =  Schema.createRecord("nestedrecord", null, null, false);
  schema.setFields(Lists.newArrayList(buildAvroField("myint",
      Schema.Type.INT)));
  GenericRecord record = new GenericData.Record(schema);
  record.put("myint", 100);
  // DB type is not used so can be anything:
  ColumnGenerator gen = colGenerator(record, schema, null, "VARCHAR(64)");
  createAvroFile(0, TOTAL_RECORDS,  gen);
  createTable(gen);
  try {
    runExport(getArgv(true, 10, 10, newStrArray(argv, "-m", "" + 1)));
    fail("Avro records can not be exported.");
  } catch (Exception e) {
    // expected
    assertTrue(true);
  }
}
 
Author: aliyun | Project: aliyun-maxcompute-data-collectors | Lines: 22 | Source: TestAvroExport.java

Example 10: convertToAvroRecord

import org.apache.avro.generic.GenericData; // import the required package/class
private GenericRecord convertToAvroRecord(Schema avroRecordSchema, Object[] values) {
  // TODO: could be improved to create the record once and reuse it
  GenericRecord avroRec = new GenericData.Record(avroRecordSchema);
  List<ColumnConverterDescriptor> columnConverters = converterDescriptor.getColumnConverters();
  if (values.length != columnConverters.size()) {
    // mismatch schema
    // TODO better exception
    throw new RuntimeException("Expecting " + columnConverters.size() + " fields, received "
        + values.length + " values");
  }
  for (int i = 0; i < values.length; i++) {
    Object value = values[i];
    ColumnConverterDescriptor columnConverterDescriptor = columnConverters.get(i);
    Object valueToWrite = columnConverterDescriptor.getWritable(value);
    avroRec.put(columnConverterDescriptor.getColumnName(), valueToWrite);
  }
  return avroRec;
}
 
Author: ampool | Project: monarch | Lines: 19 | Source: ParquetWriterWrapper.java

Example 11: bussinessDeal

import org.apache.avro.generic.GenericData; // import the required package/class
/**
 * Perform the required business logic.
 * 
 * @param transceiver
 * @throws IOException
 */
private void bussinessDeal(Transceiver transceiver) throws IOException {
    // 2. Load the protocol
    Protocol protocol = Protocol.parse(this.getClass().getResourceAsStream("/Members.avpr"));
    // 3. Build the requestor from the protocol and the transceiver
    GenericRequestor requestor = new GenericRequestor(protocol, transceiver);
    // 4. Get the request schema of the "login" message from the protocol
    GenericRecord loginGr = new GenericData.Record(protocol.getMessages().get("login").getRequest());
    // 5. Get the "Members" record schema from the protocol
    GenericRecord mGr = new GenericData.Record(protocol.getType("Members"));
    // 6. Populate the request data
    mGr.put("userName", "rita");
    mGr.put("userPwd", "123456");
    // 7. Attach the nested record to the top-level message record
    loginGr.put("m", mGr);
    // 8. Send the request and receive the response
    Object retObj = requestor.request("login", loginGr);
    // 9. Parse the response
    GenericRecord upGr = (GenericRecord) retObj;
    System.out.println(upGr.get("msg"));
}
 
Author: lrtdc | Project: book_ldrtc | Lines: 27 | Source: MemberServerConsumer.java

Example 12: MemberInfoDynSer

import org.apache.avro.generic.GenericData; // import the required package/class
/**
 * Dynamic serialization: resolve the schema file at runtime, populate
 * records against it, and serialize them.
 * 
 * @throws IOException
 */
public void MemberInfoDynSer() throws IOException {
    // 1. Parse the schema file
    Parser parser = new Parser();
    Schema mSchema = parser.parse(this.getClass().getResourceAsStream("/Members.avsc"));
    // 2. Build the datum writer (GenericDatumWriter, since we write GenericRecords)
    DatumWriter<GenericRecord> mGr = new GenericDatumWriter<GenericRecord>(mSchema);
    DataFileWriter<GenericRecord> mDfw = new DataFileWriter<GenericRecord>(mGr);
    // 3. Create the serialization file
    mDfw.create(mSchema, new File("/Users/a/Desktop/tmp/members.avro"));
    // 4. Append the serialized records
    for (int i = 0; i < 20; i++) {
        GenericRecord gr = new GenericData.Record(mSchema);
        int r = i * new Random().nextInt(50);
        gr.put("userName", "light-" + r);
        gr.put("userPwd", "2016-" + r);
        gr.put("realName", "滔滔" + r + "號");
        mDfw.append(gr);
    }
    // 5. Close the data file writer
    mDfw.close();
    System.out.println("Dyn Builder Ser Start Complete.");
}
 
Author: lrtdc | Project: book_ldrtc | Lines: 28 | Source: MemberServerProvider.java

Example 13: serialize

import org.apache.avro.generic.GenericData; // import the required package/class
/**
 * Serializes a tuple of alternating field-name/value strings into Avro binary.
 * 
 * @param t tuple of alternating field names and values
 * @return the Avro-encoded bytes
 */
@Override
public byte[] serialize(Tuple4<String, String, String, String> t) {

  if (!initialized) {
    parser = new Schema.Parser();
    schema = parser.parse(schemaJson);
    recordInjection = GenericAvroCodecs.toBinary(schema);
    initialized = true;
  }
  GenericData.Record avroRecord = new GenericData.Record(schema);
  for (int i = 0; i < t.getArity() - 1; i += 2) {
    avroRecord.put(t.getField(i).toString(), t.getField(i + 1).toString());
  }

  byte[] bytes = recordInjection.apply(avroRecord);
  return bytes;
}
 
Author: hopshadoop | Project: hops-util | Lines: 23 | Source: AvroDeserializer.java

Example 14: get

import org.apache.avro.generic.GenericData; // import the required package/class
public static Object get(String fieldName, GenericData.Record record, Object defaultValue)
{
   Schema decodedWithSchema = record.getSchema();

   Optional<Schema.Field> field = decodedWithSchema.getFields().stream()
      .filter(i -> i.name().equals(fieldName) || i.aliases().contains(fieldName))
      .findFirst();

   if(field.isPresent())
   {
      return record.get(field.get().pos());
   }
   else
   {
      return defaultValue;
   }
}
 
Author: developerSid | Project: AwesomeJavaLibraryExamples | Lines: 18 | Source: AvroUtils.java

Example 15: serialize

import org.apache.avro.generic.GenericData; // import the required package/class
public Record serialize(AdvancedEmployee employee)
{
   Record record = new Record(schema);

   AvroUtils.put("name", employee.getName(), record);
   AvroUtils.put("age", employee.getAge(), record);
   AvroUtils.put("gender", employee.getGender(), record);

   int numberOfEmails = (employee.getMails() != null) ? employee.getMails().size() : 0;
   GenericData.Array<Utf8> emails = new GenericData.Array<>(numberOfEmails, schema.getField("emails").schema());

   for(int i = 0; i < numberOfEmails; ++i)
   {
      emails.add(new Utf8(employee.getMails().get(i)));
   }

   record.put("emails", emails);

   return record;
}
 
Author: developerSid | Project: AwesomeJavaLibraryExamples | Lines: 21 | Source: AvroWriteSerializer.java


Note: the org.apache.avro.generic.GenericData class examples in this article were compiled by 純淨天空 from GitHub, MSDocs, and other open-source code and documentation platforms. The snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors. Refer to each project's license before distributing or using the code; do not reproduce without permission.