

Java SerializationSchema Class Code Examples

This article collects typical usage examples of the Java class org.apache.flink.streaming.util.serialization.SerializationSchema. If you are wondering what the SerializationSchema class is for, how to use it, or what real-world usage looks like, the curated examples below may help.


The SerializationSchema class belongs to the org.apache.flink.streaming.util.serialization package. A total of 15 code examples of the class are shown below, ordered by popularity.
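Before the examples, here is a minimal sketch of what implementing the interface looks like: SerializationSchema<T> exposes a single serialize method that turns an element into a byte[]. The class name MyStringSchema and the UTF-8 encoding are illustrative assumptions, not taken from any of the examples below.

import java.nio.charset.StandardCharsets;
import org.apache.flink.streaming.util.serialization.SerializationSchema;

// Hypothetical schema that serializes String records as UTF-8 bytes.
public class MyStringSchema implements SerializationSchema<String> {
    @Override
    public byte[] serialize(String element) {
        return element.getBytes(StandardCharsets.UTF_8);
    }
}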

Example 1: FlinkKafkaProducer

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
@Deprecated
public FlinkKafkaProducer(String brokerList, String topicId, SerializationSchema<IN> serializationSchema) {
	super(topicId, new KeyedSerializationSchemaWrapper<>(serializationSchema), getPropertiesFromBrokerList(brokerList), null);
}
 
Developer: axbaretto, Project: flink, Lines: 5, Source: FlinkKafkaProducer.java
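For context, a sketch of how this deprecated constructor could be wired into a streaming job; the broker address, topic name, and the use of SimpleStringSchema are assumptions for illustration, not taken from the project above.

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

// Hypothetical usage inside a job's main() method; broker and topic are placeholders.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<String> stream = env.fromElements("a", "b", "c");
stream.addSink(new FlinkKafkaProducer<>(
        "localhost:9092",            // brokerList (assumed)
        "my-topic",                  // topicId (assumed)
        new SimpleStringSchema()));  // SerializationSchema<String>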

Example 2: AMQSinkConfig

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
public AMQSinkConfig(ActiveMQConnectionFactory connectionFactory, String queueName,
                    SerializationSchema<IN> serializationSchema, boolean persistentDelivery,
                    DestinationType destinationType) {
    this.connectionFactory = Preconditions.checkNotNull(connectionFactory, "connectionFactory not set");
    this.queueName = Preconditions.checkNotNull(queueName, "destinationName not set");
    this.serializationSchema = Preconditions.checkNotNull(serializationSchema, "serializationSchema not set");
    this.persistentDelivery = persistentDelivery;
    this.destinationType = Preconditions.checkNotNull(destinationType, "destinationType");
}
 
Developer: apache, Project: bahir-flink, Lines: 10, Source: AMQSinkConfig.java
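A sketch of building this config directly; the broker URL, queue name, SimpleStringSchema, and the DestinationType.QUEUE constant are assumptions for illustration.

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

// Hypothetical usage; broker URL and queue name are placeholders,
// and DestinationType.QUEUE is assumed to be a valid constant of the enum.
ActiveMQConnectionFactory factory =
        new ActiveMQConnectionFactory("tcp://localhost:61616");
AMQSinkConfig<String> config = new AMQSinkConfig<>(
        factory,                   // connectionFactory
        "flink-output-queue",      // queueName
        new SimpleStringSchema(),  // serializationSchema
        true,                      // persistentDelivery
        DestinationType.QUEUE);    // destinationType (assumed constant)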

Example 3: AisMessagesToFileSinkWriter

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
public AisMessagesToFileSinkWriter(String filePath,
    SerializationSchema<AisMessage> aisMessageSchema) {
  this.serializationSchema = aisMessageSchema;
  this.filePath = filePath;
}
 
Developer: ehabqadah, Project: in-situ-processing-datAcron, Lines: 6, Source: AisMessagesToFileSinkWriter.java

Example 4: FlinkProducer

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
public FlinkProducer(String topic,
        SerializationSchema serializationSchema,
        Properties props) {
  super(topic, new KeyedSerializationSchemaWrapper<>(serializationSchema),
          props);

}
 
Developer: hopshadoop, Project: hops-util, Lines: 8, Source: FlinkProducer.java

Example 5: getSerializationSchema

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
public SerializationSchema<IN> getSerializationSchema() {
    return serializationSchema;
}
 
Developer: apache, Project: bahir-flink, Lines: 4, Source: AMQSinkConfig.java

Example 6: setSerializationSchema

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
public AMQSinkConfigBuilder<IN> setSerializationSchema(SerializationSchema<IN> serializationSchema) {
    this.serializationSchema = Preconditions.checkNotNull(serializationSchema);
    return this;
}
 
Developer: apache, Project: bahir-flink, Lines: 5, Source: AMQSinkConfig.java

Example 7: FlumeSink

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
public FlumeSink(String host, int port, SerializationSchema<IN> schema) {
    this.host = host;
    this.port = port;
    this.schema = schema;
}
 
Developer: apache, Project: bahir-flink, Lines: 6, Source: FlumeSink.java

Example 8: createKafkaProducer

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
@Override
protected FlinkKafkaProducerBase<Tuple2<Boolean, Row>> createKafkaProducer(String topic, Properties properties, SerializationSchema<Tuple2<Boolean, Row>> serializationSchema, FlinkKafkaPartitioner<Tuple2<Boolean, Row>> partitioner) {
	return new FlinkKafkaProducer09<>(topic, serializationSchema, properties, partitioner);
}
 
Developer: datafibers-community, Project: df_data_service, Lines: 5, Source: Kafka09AvroTableSink.java

Example 9: createSerializationSchema

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
@Override
protected SerializationSchema<Tuple2<Boolean, Row>> createSerializationSchema(Properties properties) {
	return new AvroRowSerializationSchema(properties);
}
 
Developer: datafibers-community, Project: df_data_service, Lines: 6, Source: Kafka09AvroTableSink.java

Example 10: createSerializationSchema

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
protected SerializationSchema<Tuple2<Boolean, Row>> createSerializationSchema(Properties properties) {
	return new AvroRowSerializationSchema(properties);
}
 
Developer: datafibers-community, Project: df_data_service, Lines: 4, Source: KafkaAvroTableSink.java

Example 11: FlumeSink

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
public FlumeSink(String host, int port, SerializationSchema<IN> schema) {
	this.host = host;
	this.port = port;
	this.schema = schema;
}
 
Developer: axbaretto, Project: flink, Lines: 6, Source: FlumeSink.java

Example 12: createKafkaProducer

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
@Override
protected FlinkKafkaProducerBase<Row> createKafkaProducer(String topic, Properties properties, SerializationSchema<Row> serializationSchema, KafkaPartitioner<Row> partitioner) {
	return new FlinkKafkaProducer09<>(topic, serializationSchema, properties, partitioner);
}
 
Developer: axbaretto, Project: flink, Lines: 5, Source: Kafka09JsonTableSink.java

Example 13: getSerializationSchema

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
@Override
@SuppressWarnings("unchecked")
protected SerializationSchema<Row> getSerializationSchema() {
	return new JsonRowSerializationSchema(FIELD_NAMES);
}
 
Developer: axbaretto, Project: flink, Lines: 6, Source: Kafka09JsonTableSinkTest.java

Example 14: createKafkaProducer

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
@Override
protected FlinkKafkaProducerBase<Row> createKafkaProducer(String topic, Properties properties, SerializationSchema<Row> serializationSchema, KafkaPartitioner<Row> partitioner) {
	return new FlinkKafkaProducer08<>(topic, serializationSchema, properties, partitioner);
}
 
Developer: axbaretto, Project: flink, Lines: 5, Source: Kafka08JsonTableSink.java

Example 15: createSerializationSchema

import org.apache.flink.streaming.util.serialization.SerializationSchema; // import the required package/class
@Override
protected SerializationSchema<Row> createSerializationSchema(String[] fieldNames) {
	return new JsonRowSerializationSchema(fieldNames);
}
 
Developer: axbaretto, Project: flink, Lines: 5, Source: KafkaJsonTableSink.java


Note: The org.apache.flink.streaming.util.serialization.SerializationSchema class examples in this article were compiled by 纯净天空 from GitHub, MSDocs, and other open-source code and documentation platforms. The code snippets are taken from open-source projects contributed by their respective developers; copyright of the source code remains with the original authors, and you should consult each project's License before redistributing or using the code. Do not reproduce this article without permission.