

Java GenericData.Record Code Examples

This article collects typical usage examples of org.apache.avro.generic.GenericData.Record in Java (a nested class of GenericData, used mainly through its constructor). If you are wondering what GenericData.Record does, how it is used, or what real-world usage looks like, the curated code examples below should help. You can also explore further usage examples of the enclosing class, org.apache.avro.generic.GenericData.


The following section presents 15 code examples of GenericData.Record, sorted by popularity by default.
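Before the examples, here is a minimal, self-contained sketch of the basic pattern they all share: parse a schema, build a GenericData.Record against it, set fields by name, and read them back by name or by position. The schema and field names in this sketch are illustrative only and do not come from the examples below.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class GenericRecordSketch {

  // Hypothetical record schema, used only for this illustration.
  private static final String USER_SCHEMA =
      "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"name\",\"type\":\"string\"},"
          + "{\"name\":\"favoriteNumber\",\"type\":\"int\"}]}";

  public static void main(String[] args) {
    Schema schema = new Schema.Parser().parse(USER_SCHEMA);

    // GenericData.Record implements GenericRecord: fields are set by name...
    GenericRecord record = new GenericData.Record(schema);
    record.put("name", "joe");
    record.put("favoriteNumber", 42);

    // ...and can be read back either by name or by field position.
    System.out.println(record.get("name") + " / " + record.get(0));
  }
}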

Example 1: testParquetRecordsNotSupported

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
public void testParquetRecordsNotSupported() throws IOException, SQLException {
  String[] argv = {};
  final int TOTAL_RECORDS = 1;

  Schema schema = Schema.createRecord("nestedrecord", null, null, false);
  schema.setFields(Lists.newArrayList(buildField("myint",
      Schema.Type.INT)));
  GenericRecord record = new GenericData.Record(schema);
  record.put("myint", 100);
  // DB type is not used so can be anything:
  ColumnGenerator gen = colGenerator(record, schema, null, "VARCHAR(64)");
  createParquetFile(0, TOTAL_RECORDS, gen);
  createTable(gen);
  try {
    runExport(getArgv(true, 10, 10, newStrArray(argv, "-m", "" + 1)));
    fail("Parquet records can not be exported.");
  } catch (Exception e) {
    // expected
    assertTrue(true);
  }
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 22, Source: TestParquetExport.java

Example 2: testIncompatibleSchemas

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
@Test
public void testIncompatibleSchemas() throws EventDeliveryException {
  final DatasetSink sink = sink(in, config);

  GenericRecordBuilder builder = new GenericRecordBuilder(
      INCOMPATIBLE_SCHEMA);
  GenericData.Record rec = builder.set("username", "koala").build();
  putToChannel(in, event(rec, INCOMPATIBLE_SCHEMA, null, false));

  // run the sink
  sink.start();
  assertThrows("Should fail", EventDeliveryException.class,
      new Callable<Object>() {
        @Override
        public Object call() throws EventDeliveryException {
          sink.process();
          return null;
        }
      });
  sink.stop();

  Assert.assertEquals("Should have rolled back",
      expected.size() + 1, remaining(in));
}
 
Developer: moueimei, Project: flume-release-1.7.0, Lines: 25, Source: TestDatasetSink.java

Example 3: createParquetFile

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
/**
 * Create a data file that gets exported to the db.
 * @param fileNum the number of the file (for multi-file export)
 * @param numRecords how many records to write to the file.
 * @param extraCols generators for any extra columns to include in each record.
 */
protected void createParquetFile(int fileNum, int numRecords,
    ColumnGenerator... extraCols) throws IOException {

  String uri = "dataset:file:" + getTablePath();
  Schema schema = buildSchema(extraCols);
  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
    .schema(schema)
    .format(Formats.PARQUET)
    .build();
  Dataset dataset = Datasets.create(uri, descriptor);
  DatasetWriter writer = dataset.newWriter();
  try {
    for (int i = 0; i < numRecords; i++) {
      GenericRecord record = new GenericData.Record(schema);
      record.put("id", i);
      record.put("msg", getMsgPrefix() + i);
      addExtraColumns(record, i, extraCols);
      writer.write(record);
    }
  } finally {
    writer.close();
  }
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 29, Source: TestParquetExport.java

Example 4: serialize

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
@Override
public byte[] serialize(final String topic, final GenericRow genericRow) {
  if (genericRow == null) {
    return null;
  }
  try {
    GenericRecord avroRecord = new GenericData.Record(avroSchema);
    for (int i = 0; i < genericRow.getColumns().size(); i++) {
      if (fields.get(i).schema().getType() == Schema.Type.ARRAY) {
        avroRecord.put(fields.get(i).name(), Arrays.asList((Object[]) genericRow.getColumns().get(i)));
      } else {
        avroRecord.put(fields.get(i).name(), genericRow.getColumns().get(i));
      }
    }
    return kafkaAvroSerializer.serialize(topic, avroRecord);
  } catch (Exception e) {
    throw new SerializationException(e);
  }
}
 
Developer: confluentinc, Project: ksql, Lines: 20, Source: KsqlGenericRowAvroSerializer.java

Example 5: readWithDifferentSchema

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
/**
 * Reads in binary Avro-encoded entities using a schema that is different
 * from the writer's schema.
 * 
 */
public static void readWithDifferentSchema(File file, Schema newSchema)
		throws IOException {
	GenericDatumReader<GenericData.Record> datum = new GenericDatumReader<>(newSchema);
	DataFileReader<GenericData.Record> reader = new DataFileReader<>(file, datum);

	GenericData.Record record = new GenericData.Record(newSchema);
	while (reader.hasNext()) {
		reader.next(record);
		System.out.println("Name " + record.get("name") + " on "
				+ record.get("Meetup_date") + " attending "
				+ record.get("attendance") + " organized by  "
				+ record.get("organizer") 
				+ " at  " + record.get("location"));
	}

	reader.close();
}
 
Developer: airisdata, Project: avroparquet, Lines: 23, Source: StorageFormatUtils.java

Example 6: getSerializedRow

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
private byte[] getSerializedRow(String topicName, SchemaRegistryClient schemaRegistryClient,
                                Schema rowAvroSchema,
                                GenericRow
    genericRow) {
  Map<String, Object> map = new HashMap<>();
  // Automatically register the schema in the Schema Registry if it has not been registered.
  map.put(AbstractKafkaAvroSerDeConfig.AUTO_REGISTER_SCHEMAS, true);
  map.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "");
  KafkaAvroSerializer kafkaAvroSerializer = new KafkaAvroSerializer(schemaRegistryClient, map);
  GenericRecord avroRecord = new GenericData.Record(rowAvroSchema);
  List<Schema.Field> fields = rowAvroSchema.getFields();
  for (int i = 0; i < genericRow.getColumns().size(); i++) {
    if (fields.get(i).schema().getType() == Schema.Type.ARRAY) {
      avroRecord.put(fields.get(i).name(), Arrays.asList((Object[]) genericRow.getColumns().get(i)));
    } else {
      avroRecord.put(fields.get(i).name(), genericRow.getColumns().get(i));
    }
  }

  return kafkaAvroSerializer.serialize(topicName, avroRecord);
}
 
Developer: confluentinc, Project: ksql, Lines: 22, Source: KsqlGenericRowAvroDeserializerTest.java

Example 7: serialize

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
/**
 * Serializes the tuple as a binary Avro record, interpreting its fields as
 * alternating Avro field-name/value pairs.
 *
 * @param t the tuple whose fields are written into the record
 * @return the Avro-encoded record as a byte array
 */
@Override
public byte[] serialize(Tuple4<String, String, String, String> t) {

  if (!initialized) {
    parser = new Schema.Parser();
    schema = parser.parse(schemaJson);
    recordInjection = GenericAvroCodecs.toBinary(schema);
    initialized = true;
  }
  GenericData.Record avroRecord = new GenericData.Record(schema);
  for (int i = 0; i < t.getArity() - 1; i += 2) {
    avroRecord.put(t.getField(i).toString(), t.getField(i + 1).toString());
  }

  byte[] bytes = recordInjection.apply(avroRecord);
  return bytes;
}
 
Developer: hopshadoop, Project: hops-util, Lines: 23, Source: AvroDeserializer.java

Example 8: initializeRecord

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
@Before
public void initializeRecord() {
  // Create a record with a given JSON schema.
  GenericRecord record = new GenericData.Record(new Schema.Parser().parse(BILLING_EVENT_SCHEMA));
  record.put("id", "1");
  record.put("billingTime", 1508835963000000L);
  record.put("eventTime", 1484870383000000L);
  record.put("registrarId", "myRegistrar");
  record.put("billingId", "12345-CRRHELLO");
  record.put("tld", "test");
  record.put("action", "RENEW");
  record.put("domain", "example.test");
  record.put("repositoryId", "123456");
  record.put("years", 5);
  record.put("currency", "USD");
  record.put("amount", 20.5);
  record.put("flags", "AUTO_RENEW SYNTHETIC");
  schemaAndRecord = new SchemaAndRecord(record, null);
}
 
Developer: google, Project: nomulus, Lines: 20, Source: BillingEventTest.java

Example 9: get

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
public static Object get(String fieldName, GenericData.Record record, Object defaultValue)
{
   Schema decodedWithSchema = record.getSchema();

   Optional<Schema.Field> field = decodedWithSchema.getFields().stream()
      .filter(i -> i.name().equals(fieldName) || i.aliases().contains(fieldName))
      .findFirst();

   if(field.isPresent())
   {
      return record.get(field.get().pos());
   }
   else
   {
      return defaultValue;
   }
}
 
Developer: developerSid, Project: AwesomeJavaLibraryExamples, Lines: 18, Source: AvroUtils.java

Example 10: genericEncoderV1GenericDecoderV2

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
@Test
public void genericEncoderV1GenericDecoderV2() throws Exception{
	Schema reader = load("users_v2.schema");
	Schema writer = load("users_v1.schema");
	SchemaRegistryClient client = mock(SchemaRegistryClient.class);
	AvroCodec codec = new AvroCodec();
	codec.setReaderSchema(reader);
	codec.setSchemaRegistryClient(client);
	when(client.register(any())).thenReturn(2);
	when(client.fetch(eq(2))).thenReturn(writer);
	GenericRecord record = new GenericData.Record(writer);
	record.put("name","joe");
	record.put("favoriteNumber",42);
	record.put("favoriteColor","blue");
	byte[] results = codec.encode(record);
	GenericRecord decoded = codec.decode(results,GenericRecord.class);
	Assert.assertEquals(record.get("name").toString(),decoded.get("name").toString());
	Assert.assertEquals("NYC",decoded.get("favoritePlace").toString());
}
 
Developer: viniciusccarvalho, Project: schema-evolution-samples, Lines: 20, Source: AvroCodecTests.java

Example 11: AvroKeyValueWriter

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
AvroKeyValueWriter(Schema keySchema, Schema valueSchema,
		CodecFactory compressionCodec, OutputStream outputStream,
		int syncInterval) throws IOException {
	// Create the generic record schema for the key/value pair.
	mKeyValuePairSchema = AvroKeyValue
			.getSchema(keySchema, valueSchema);

	// Create an Avro container file and a writer to it.
	DatumWriter<GenericRecord> genericDatumWriter = new GenericDatumWriter<GenericRecord>(
			mKeyValuePairSchema);
	mAvroFileWriter = new DataFileWriter<GenericRecord>(
			genericDatumWriter);
	mAvroFileWriter.setCodec(compressionCodec);
	mAvroFileWriter.setSyncInterval(syncInterval);
	mAvroFileWriter.create(mKeyValuePairSchema, outputStream);

	// Create a reusable output record.
	mOutputRecord = new AvroKeyValue<Object, Object>(
			new GenericData.Record(mKeyValuePairSchema));
}
 
Developer: axbaretto, Project: flink, Lines: 21, Source: AvroKeyValueSinkWriter.java

Example 12: genericEncoderV1GenericDecoderV1

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
@Test
public void genericEncoderV1GenericDecoderV1() throws Exception{
	Schema schema = load("users_v1.schema");
	SchemaRegistryClient client = mock(SchemaRegistryClient.class);
	AvroCodec codec = new AvroCodec();
	codec.setSchemaRegistryClient(client);
	when(client.register(any())).thenReturn(1);
	when(client.fetch(eq(1))).thenReturn(schema);
	GenericRecord record = new GenericData.Record(schema);
	record.put("name","joe");
	record.put("favoriteNumber",42);
	record.put("favoriteColor","blue");
	byte[] results = codec.encode(record);
	GenericRecord decoded = codec.decode(results,GenericRecord.class);
	Assert.assertEquals(record.get("name").toString(),decoded.get("name").toString());
}
 
Developer: viniciusccarvalho, Project: schema-evolution-samples, Lines: 17, Source: AvroCodecTests.java

Example 13: genericEncoderV1SpecificDecoderV1

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
@Test
public void genericEncoderV1SpecificDecoderV1() throws Exception{
	Schema schema = load("users_v1.schema");
	SchemaRegistryClient client = mock(SchemaRegistryClient.class);
	AvroCodec codec = new AvroCodec();
	codec.setSchemaRegistryClient(client);
	when(client.register(any())).thenReturn(1);
	when(client.fetch(eq(1))).thenReturn(schema);
	GenericRecord record = new GenericData.Record(schema);
	record.put("name","joe");
	record.put("favoriteNumber",42);
	record.put("favoriteColor","blue");
	byte[] results = codec.encode(record);
	User decoded = codec.decode(results,User.class);
	Assert.assertEquals(record.get("name").toString(),decoded.getName().toString());

}
 
Developer: viniciusccarvalho, Project: schema-evolution-samples, Lines: 18, Source: AvroCodecTests.java

Example 14: testDeleteProject

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
/**
 * Checks that {@link JiraDeleteWriter#write(Object)} deletes a project from the Jira server.
 *
 * @throws IOException
 */
public void testDeleteProject() throws IOException {
    IndexedRecord deleteProjectRecord = new GenericData.Record(DELETE_SCHEMA);
    deleteProjectRecord.put(0, "ITP");
    
    JiraWriter deleteProjectWriter = JiraTestsHelper.createWriter(HOST_PORT, USER, PASS, Resource.PROJECT, Action.DELETE);
    
    deleteProjectWriter.open("delProj");
    try {
        deleteProjectWriter.write(deleteProjectRecord);
    } catch (DataRejectException e) {
        String rejectError = e.getRejectInfo().get("error").toString();
        LOG.error(rejectError);
        collector.addError(new Throwable(rejectError));
    }
}
 
Developer: Talend, Project: components, Lines: 21, Source: JiraWritersTestIT.java

Example 15: testSyncLeadREST

import org.apache.avro.generic.GenericData; //import the package/class that the method depends on
@Test
public void testSyncLeadREST() throws Exception {
    props = getRESTProperties();
    props.operationType.setValue(OperationType.createOrUpdate);
    props.lookupField.setValue(RESTLookupFields.email);
    props.deDupeEnabled.setValue(false);
    props.batchSize.setValue(1);
    props.connection.timeout.setValue(10000);
    // test attributes
    List<Field> fields = new ArrayList<>();
    Field field = new Schema.Field("accountType", Schema.create(Schema.Type.STRING), null, (Object) null);
    fields.add(field);
    Schema s = MarketoUtils.newSchema(props.schemaInput.schema.getValue(), "leadAttribute", fields);
    props.schemaInput.schema.setValue(s);
    props.updateOutputSchemas();
    //
    IndexedRecord record = new GenericData.Record(s);
    record.put(0, null);
    record.put(1, "[email protected]");
    record.put(2, "Foreig+nPerson_Sys)Id  FIRSTN1");
    record.put(3, "SFDC41 LAST0");// CUSTOM, SFDC, NETSUITE;
    record.put(4, "Anti conservative0");
    //
    testSyncLead(record);
}
 
Developer: Talend, Project: components, Lines: 26, Source: MarketoOutputWriterTestIT.java


Note: The org.apache.avro.generic.GenericData.Record examples in this article were compiled by 純淨天空 from GitHub, MSDocs, and other open-source code and documentation platforms. The code snippets are taken from open-source projects contributed by their respective authors, and the copyright of the source code remains with those authors. Please consult the license of the corresponding project before distributing or using the code; do not reproduce this article without permission.