

Java Serializer Class Code Examples

This article collects typical usage examples of the Java class org.apache.hadoop.hive.serde2.Serializer. If you are wondering what the Serializer class does, how to use it, or what real-world code that uses it looks like, the curated examples below should help.


The Serializer class belongs to the org.apache.hadoop.hive.serde2 package. Four code examples of the class are shown below, ordered by popularity.
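Before the examples, it helps to know the shape of the (deprecated) Serializer contract: initialize(conf, tableProperties) is called once with the table's configuration, and serialize(row, objectInspector) is then called per row, returning a Hadoop Writable. The toy stand-in below is not the Hive interface itself; it is a simplified sketch that swaps Hadoop's Configuration/ObjectInspector/Writable types for plain Java ones to illustrate the lifecycle:

```java
import java.util.Properties;

// Toy stand-in mirroring the shape of org.apache.hadoop.hive.serde2.Serializer:
// configure once from table properties, then turn each in-memory row into a
// serialized record. Illustration only; the real interface uses Hadoop's
// Configuration, ObjectInspector, and Writable types.
interface RowSerializer {
    void initialize(Properties tableProperties);

    String serialize(Object[] row);
}

// A delimited-text implementation. The "field.delim" property name mirrors
// Hive's field-delimiter table property but is used here purely for the sketch.
class DelimitedRowSerializer implements RowSerializer {
    private String delimiter = ",";

    @Override
    public void initialize(Properties tableProperties) {
        delimiter = tableProperties.getProperty("field.delim", delimiter);
    }

    @Override
    public String serialize(Object[] row) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < row.length; i++) {
            if (i > 0) {
                sb.append(delimiter);
            }
            sb.append(row[i]);
        }
        return sb.toString();
    }
}
```

The examples below follow exactly this rhythm with the real interface: build or obtain a Serializer, configure it, then call serialize once per row inside a write loop.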

Example 1: createMultiStripeFile

import org.apache.hadoop.hive.serde2.Serializer; // import the class under discussion
private static void createMultiStripeFile(File file)
        throws IOException, ReflectiveOperationException, SerDeException
{
    FileSinkOperator.RecordWriter writer = createOrcRecordWriter(file, ORC_12, OrcTester.Compression.NONE, javaLongObjectInspector);

    @SuppressWarnings("deprecation") Serializer serde = new OrcSerde();
    SettableStructObjectInspector objectInspector = createSettableStructObjectInspector("test", javaLongObjectInspector);
    Object row = objectInspector.create();
    StructField field = objectInspector.getAllStructFieldRefs().get(0);

    for (int i = 0; i < 300; i += 3) {
        if ((i > 0) && (i % 60 == 0)) {
            flushWriter(writer);
        }

        objectInspector.setStructFieldData(row, field, (long) i);
        Writable record = serde.serialize(row, objectInspector);
        writer.write(record);
    }

    writer.close(false);
}
 
Developer: y-lan · Project: presto · Source: TestOrcReaderPositions.java

Example 2: createSequentialFile

import org.apache.hadoop.hive.serde2.Serializer; // import the class under discussion
private static void createSequentialFile(File file, int count)
        throws IOException, ReflectiveOperationException, SerDeException
{
    FileSinkOperator.RecordWriter writer = createOrcRecordWriter(file, ORC_12, OrcTester.Compression.NONE, javaLongObjectInspector);

    @SuppressWarnings("deprecation") Serializer serde = new OrcSerde();
    SettableStructObjectInspector objectInspector = createSettableStructObjectInspector("test", javaLongObjectInspector);
    Object row = objectInspector.create();
    StructField field = objectInspector.getAllStructFieldRefs().get(0);

    for (int i = 0; i < count; i++) {
        objectInspector.setStructFieldData(row, field, (long) i);
        Writable record = serde.serialize(row, objectInspector);
        writer.write(record);
    }

    writer.close(false);
}
 
Developer: y-lan · Project: presto · Source: TestOrcReaderPositions.java

Example 3: initializeSerializer

import org.apache.hadoop.hive.serde2.Serializer; // import the class under discussion
@SuppressWarnings("deprecation")
private static Serializer initializeSerializer(Configuration conf, Properties properties, String serializerName)
{
    try {
        Serializer result = (Serializer) Class.forName(serializerName).getConstructor().newInstance();
        result.initialize(conf, properties);
        return result;
    }
    catch (SerDeException | ReflectiveOperationException e) {
        throw Throwables.propagate(e);
    }
}
 
Developer: y-lan · Project: presto · Source: HivePageSink.java
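Example 3 resolves the serializer by its fully qualified class name using plain java.lang.reflect: Class.forName, then the no-arg constructor, then a cast. The same pattern can be sketched self-containedly below, using a JDK class name in place of a Hive serde (the Hive jars are not assumed here), and rethrowing checked reflection failures unchecked, much as the original does with Throwables.propagate:

```java
import java.util.List;

// Instantiate a class by its fully qualified name via its no-arg constructor,
// mirroring the lookup in initializeSerializer() above.
class ReflectiveFactory {
    static <T> T newInstance(String className, Class<T> expectedType) {
        try {
            Object instance = Class.forName(className).getConstructor().newInstance();
            // Class.cast gives a type-safe cast without an unchecked warning.
            return expectedType.cast(instance);
        }
        catch (ReflectiveOperationException e) {
            // Checked reflection failures are rethrown unchecked, analogous to
            // Throwables.propagate(e) in the original snippet.
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // "java.util.ArrayList" stands in for a serde class name that would
        // normally come from table metadata.
        List<?> list = newInstance("java.util.ArrayList", List.class);
        System.out.println(list.size()); // prints 0
    }
}
```

Note that Throwables.propagate is deprecated in recent Guava releases; in new code the same effect is usually written out explicitly, as in the catch block above.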

Example 4: writeLineItems

import org.apache.hadoop.hive.serde2.Serializer; // import the class under discussion
public static DataSize writeLineItems(
        File outputFile,
        HiveOutputFormat<?, ?> outputFormat,
        @SuppressWarnings("deprecation") Serializer serializer,
        CompressionType compressionType,
        List<? extends TpchColumn<?>> columns)
        throws Exception
{
    RecordWriter recordWriter = createRecordWriter(columns, outputFile, outputFormat, compressionType);

    SettableStructObjectInspector objectInspector = getStandardStructObjectInspector(transform(columns, input -> input.getColumnName()), transform(columns, input -> getObjectInspector(input)));

    Object row = objectInspector.create();

    List<StructField> fields = ImmutableList.copyOf(objectInspector.getAllStructFieldRefs());

    for (LineItem lineItem : new LineItemGenerator(1, 1, 1)) {
        objectInspector.setStructFieldData(row, fields.get(0), lineItem.getOrderKey());
        objectInspector.setStructFieldData(row, fields.get(1), lineItem.getPartKey());
        objectInspector.setStructFieldData(row, fields.get(2), lineItem.getSupplierKey());
        objectInspector.setStructFieldData(row, fields.get(3), lineItem.getLineNumber());
        objectInspector.setStructFieldData(row, fields.get(4), lineItem.getQuantity());
        objectInspector.setStructFieldData(row, fields.get(5), lineItem.getExtendedPrice());
        objectInspector.setStructFieldData(row, fields.get(6), lineItem.getDiscount());
        objectInspector.setStructFieldData(row, fields.get(7), lineItem.getTax());
        objectInspector.setStructFieldData(row, fields.get(8), lineItem.getReturnFlag());
        objectInspector.setStructFieldData(row, fields.get(9), lineItem.getStatus());
        objectInspector.setStructFieldData(row, fields.get(10), lineItem.getShipDate());
        objectInspector.setStructFieldData(row, fields.get(11), lineItem.getCommitDate());
        objectInspector.setStructFieldData(row, fields.get(12), lineItem.getReceiptDate());
        objectInspector.setStructFieldData(row, fields.get(13), lineItem.getShipInstructions());
        objectInspector.setStructFieldData(row, fields.get(14), lineItem.getShipMode());
        objectInspector.setStructFieldData(row, fields.get(15), lineItem.getComment());

        Writable record = serializer.serialize(row, objectInspector);
        recordWriter.write(record);
    }

    recordWriter.close(false);
    return getFileSize(outputFile);
}
 
Developer: y-lan · Project: presto · Source: BenchmarkHiveFileFormats.java


Note: the org.apache.hadoop.hive.serde2.Serializer examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from community open-source projects, and copyright remains with the original authors; when redistributing or using the code, please follow the corresponding project's license. Do not reproduce this article without permission.