

Java TypeInformation Class Code Examples

This article collects typical usage examples of the Java class org.apache.flink.api.common.typeinfo.TypeInformation. If you are wondering what the TypeInformation class is for and how to use it in practice, the curated code examples below should help.


The TypeInformation class belongs to the org.apache.flink.api.common.typeinfo package. Fifteen code examples of the class are shown below, sorted by popularity by default.

Example 1: open

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@Override
public void open(Configuration config) {
  ValueStateDescriptor<AbstractStatisticsWrapper<AisMessage>> descriptor =
      new ValueStateDescriptor<AbstractStatisticsWrapper<AisMessage>>("trajectoryStatistics",
          TypeInformation.of(new TypeHint<AbstractStatisticsWrapper<AisMessage>>() {}));

  statisticsOfTrajectory = getRuntimeContext().getState(descriptor);

}
 
Developer ID: ehabqadah, Project: in-situ-processing-datAcron, Lines: 10, Source: AisStreamEnricher.java
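The TypeInformation.of(new TypeHint<...>() {}) call above is the standard way to capture a generic type despite Java's type erasure. A minimal standalone sketch of the two common variants (illustrative, not taken from the original project):

import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;

public static void typeInformationVariants() {
    // For a non-generic type, the class alone is sufficient.
    TypeInformation<String> simple = TypeInformation.of(String.class);

    // For a generic type, an anonymous TypeHint subclass preserves the type
    // parameters that erasure would otherwise discard.
    TypeInformation<Tuple2<String, Integer>> generic =
            TypeInformation.of(new TypeHint<Tuple2<String, Integer>>() {});
}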

Example 2: createFlatAvroSchema

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
/**
 * Creates a flat Avro Schema for testing.
 */
public static Schema createFlatAvroSchema(String[] fieldNames, TypeInformation[] fieldTypes) {
	final SchemaBuilder.FieldAssembler<Schema> fieldAssembler = SchemaBuilder
		.record("BasicAvroRecord")
		.namespace(NAMESPACE)
		.fields();

	final Schema nullSchema = Schema.create(Schema.Type.NULL);

	for (int i = 0; i < fieldNames.length; i++) {
		Schema schema = ReflectData.get().getSchema(fieldTypes[i].getTypeClass());
		Schema unionSchema = Schema.createUnion(Arrays.asList(nullSchema, schema));
		fieldAssembler.name(fieldNames[i]).type(unionSchema).noDefault();
	}

	return fieldAssembler.endRecord();
}
 
Developer ID: axbaretto, Project: flink, Lines: 20, Source: AvroTestUtils.java
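A hypothetical invocation of the helper above (field names and types chosen purely for illustration; BasicTypeInfo provides Flink's built-in primitive type descriptors):

import org.apache.avro.Schema;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;

Schema schema = createFlatAvroSchema(
    new String[] {"id", "name"},
    new TypeInformation[] {BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO});
// Each field becomes a union of NULL and its Avro type, i.e. nullable.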

Example 3: getTypeInformation

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@SuppressWarnings("unchecked")
private <OUT> TypeInformation<OUT> getTypeInformation(EsperSelectFunction<OUT> esperSelectFunction) {
    try {
        TypeExtractionUtils.LambdaExecutable lambdaExecutable = TypeExtractionUtils.checkAndExtractLambda(esperSelectFunction);
        if (esperSelectFunction instanceof ResultTypeQueryable) {
            return ((ResultTypeQueryable<OUT>) esperSelectFunction).getProducedType();
        }
        if (lambdaExecutable != null) {
            Type type = lambdaExecutable.getReturnType();
            return (TypeInformation<OUT>) TypeExtractor.createTypeInfo(type);
        }
        else {
            return TypeExtractor.createTypeInfo(esperSelectFunction, EsperSelectFunction.class, esperSelectFunction.getClass(), 0);
        }
    } catch (TypeExtractionException e) {
        throw new InvalidTypesException("Could not extract types.", e);
    }
}
 
Developer ID: phil3k3, Project: flink-esper, Lines: 19, Source: EsperStream.java
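The ResultTypeQueryable branch above lets a function declare its produced type explicitly instead of relying on reflective extraction. A hedged sketch of such a function, shown here with a plain MapFunction (ParseToTuple is a hypothetical name, not part of flink-esper):

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.ResultTypeQueryable;

public class ParseToTuple implements MapFunction<String, Tuple2<String, Integer>>,
        ResultTypeQueryable<Tuple2<String, Integer>> {

    @Override
    public Tuple2<String, Integer> map(String value) {
        return Tuple2.of(value, value.length());
    }

    // Declaring the produced type explicitly means extraction logic like the
    // method above can use it directly instead of inspecting the lambda.
    @Override
    public TypeInformation<Tuple2<String, Integer>> getProducedType() {
        return TypeInformation.of(new TypeHint<Tuple2<String, Integer>>() {});
    }
}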

Example 4: testCompile

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@Test
public void testCompile() throws IOException {
  RowTypeInfo schema = new RowTypeInfo(new TypeInformation[]{BasicTypeInfo.INT_TYPE_INFO}, new String[] {"id"});
  MockExternalCatalogTable inputTable = new MockExternalCatalogTable(schema, Collections.singletonList(Row.of(1)));
  MockExternalCatalogTable outputTable = new MockExternalCatalogTable(schema, new ArrayList<>());
  SingleLevelMemoryCatalog input = new SingleLevelMemoryCatalog("input",
      Collections.singletonMap("foo", inputTable));
  SingleLevelMemoryCatalog output = new SingleLevelMemoryCatalog("output",
      Collections.singletonMap("bar", outputTable));
  JobDescriptor job = new JobDescriptor(
      Collections.singletonMap("input", input),
      Collections.emptyMap(),
      output,
      1,
      "SELECT * FROM input.foo");
  CompilationResult res = new ContainedExecutor().run(job);
  assertNull(res.remoteThrowable());
  assertNotNull(res.jobGraph());
}
 
Developer ID: uber, Project: AthenaX, Lines: 20, Source: ProcessExecutorTest.java
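The RowTypeInfo used by this test pairs an array of field types with an array of field names. A minimal standalone sketch of the same construction (field names chosen for illustration):

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.RowTypeInfo;

RowTypeInfo schema = new RowTypeInfo(
    new TypeInformation[] {BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO},
    new String[] {"id", "name"});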

Example 5: testInvalidSql

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@Test
public void testInvalidSql() throws IOException {
  RowTypeInfo schema = new RowTypeInfo(new TypeInformation[]{BasicTypeInfo.INT_TYPE_INFO}, new String[] {"id"});
  MockExternalCatalogTable inputTable = new MockExternalCatalogTable(schema, Collections.singletonList(Row.of(1)));
  MockExternalCatalogTable outputTable = new MockExternalCatalogTable(schema, new ArrayList<>());
  SingleLevelMemoryCatalog input = new SingleLevelMemoryCatalog("input",
      Collections.singletonMap("foo", inputTable));
  SingleLevelMemoryCatalog output = new SingleLevelMemoryCatalog("output",
      Collections.singletonMap("bar", outputTable));
  JobDescriptor job = new JobDescriptor(
      Collections.singletonMap("input", input),
      Collections.emptyMap(),
      output,
      1,
      "SELECT2 * FROM input.foo");
  CompilationResult res = new ContainedExecutor().run(job);
  assertNull(res.jobGraph());
  assertTrue(res.remoteThrowable() instanceof SqlParserException);
}
 
Developer ID: uber, Project: AthenaX, Lines: 20, Source: ProcessExecutorTest.java

Example 6: PravegaDeserializationSchema

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
/**
 * Creates a new PravegaDeserializationSchema using the given Pravega serializer, and the
 * type described by the type class.
 *
 * <p>Use this constructor if the produced type is not generic and can be fully described by
 * a class. If the type is generic, use the {@link #PravegaDeserializationSchema(TypeHint, Serializer)}
 * constructor instead.
 * 
 * @param typeClass  The class describing the deserialized type.
 * @param serializer The serializer to deserialize the byte messages.
 */
public PravegaDeserializationSchema(Class<T> typeClass, Serializer<T> serializer) {
    checkNotNull(typeClass);
    checkSerializer(serializer);

    this.serializer = serializer;

    try {
        this.typeInfo = TypeInformation.of(typeClass);
    } catch (InvalidTypesException e) {
        throw new IllegalArgumentException(
                "Due to Java's type erasure, the generic type information cannot be properly inferred. " + 
                "Please pass a 'TypeHint' instead of a class to describe the type. " +
                "For example, to describe 'Tuple2<String, String>' as a generic type, use " +
                "'new PravegaDeserializationSchema<>(new TypeHint<Tuple2<String, String>>(){}, serializer);'"
        );
    }
}
 
Developer ID: pravega, Project: flink-connectors, Lines: 29, Source: PravegaDeserializationSchema.java
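Hypothetical usage of the two constructors described in the javadoc above (MyEvent and the two Serializer instances are assumed placeholders, not part of the connector):

// Non-generic type: passing the class is enough.
PravegaDeserializationSchema<MyEvent> simple =
    new PravegaDeserializationSchema<>(MyEvent.class, myEventSerializer);

// Generic type: use the TypeHint constructor instead, as the error message advises.
PravegaDeserializationSchema<Tuple2<String, String>> generic =
    new PravegaDeserializationSchema<>(
        new TypeHint<Tuple2<String, String>>() {}, tupleSerializer);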

Example 7: configure

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@Override
public FlinkPravegaTableSink configure(String[] fieldNames, TypeInformation<?>[] fieldTypes) {

    // called to configure the sink with a specific subset of fields

    FlinkPravegaTableSink copy = createCopy();
    copy.fieldNames = checkNotNull(fieldNames, "fieldNames");
    copy.fieldTypes = checkNotNull(fieldTypes, "fieldTypes");
    Preconditions.checkArgument(fieldNames.length == fieldTypes.length,
            "Number of provided field names and types does not match.");

    copy.serializationSchema = serializationSchemaFactory.apply(fieldNames);
    copy.eventRouter = new RowBasedRouter(routingKeyFieldName, fieldNames, fieldTypes);

    return copy;
}
 
Developer ID: pravega, Project: flink-connectors, Lines: 17, Source: FlinkPravegaTableSink.java

Example 8: pruneOutput

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
private <T> void pruneOutput(
    DataSet<WindowedValue<RawUnionValue>> taggedDataSet,
    FlinkBatchTranslationContext context,
    int integerTag,
    PCollection<T> collection) {
  TypeInformation<WindowedValue<T>> outputType = context.getTypeInfo(collection);

  FlinkMultiOutputPruningFunction<T> pruningFunction =
      new FlinkMultiOutputPruningFunction<>(integerTag);

  FlatMapOperator<WindowedValue<RawUnionValue>, WindowedValue<T>> pruningOperator =
      new FlatMapOperator<>(
          taggedDataSet,
          outputType,
          pruningFunction,
          collection.getName());

  context.setOutputDataSet(collection, pruningOperator);
}
 
Developer ID: apache, Project: beam, Lines: 20, Source: FlinkBatchTransformTranslators.java

Example 9: fromCollection

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
/**
 * Creates a data stream from the given non-empty collection.
 *
 * <p>Note that this operation will result in a non-parallel data stream source,
 * i.e., a data stream source with parallelism one.
 *
 * @param data
 * 		The collection of elements to create the data stream from
 * @param typeInfo
 * 		The TypeInformation for the produced data stream
 * @param <OUT>
 * 		The type of the returned data stream
 * @return The data stream representing the given collection
 */
public <OUT> DataStreamSource<OUT> fromCollection(Collection<OUT> data, TypeInformation<OUT> typeInfo) {
	Preconditions.checkNotNull(data, "Collection must not be null");

	// must not have null elements and mixed elements
	FromElementsFunction.checkCollection(data, typeInfo.getTypeClass());

	SourceFunction<OUT> function;
	try {
		function = new FromElementsFunction<>(typeInfo.createSerializer(getConfig()), data);
	}
	catch (IOException e) {
		throw new RuntimeException(e.getMessage(), e);
	}
	return addSource(function, "Collection Source", typeInfo).setParallelism(1);
}
 
Developer ID: axbaretto, Project: flink, Lines: 30, Source: StreamExecutionEnvironment.java
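A short usage sketch of the method above, supplying an explicit TypeInformation for a generic element type (setup assumed, not taken from the Flink sources):

import java.util.Arrays;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStreamSource<Tuple2<String, Integer>> stream = env.fromCollection(
    Arrays.asList(Tuple2.of("a", 1), Tuple2.of("b", 2)),
    TypeInformation.of(new TypeHint<Tuple2<String, Integer>>() {}));
// The resulting source is non-parallel (parallelism one), as documented above.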

Example 10: testFoldWithEvictor

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@Test
@SuppressWarnings({"rawtypes", "unchecked"})
public void testFoldWithEvictor() throws Exception {
	StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
	env.setStreamTimeCharacteristic(TimeCharacteristic.IngestionTime);

	DataStream<Tuple2<String, Integer>> source = env.fromElements(Tuple2.of("hello", 1), Tuple2.of("hello", 2));

	DataStream<Tuple3<String, String, Integer>> window1 = source
			.windowAll(SlidingEventTimeWindows.of(Time.of(1, TimeUnit.SECONDS), Time.of(100, TimeUnit.MILLISECONDS)))
			.evictor(CountEvictor.of(100))
			.fold(new Tuple3<>("", "", 1), new DummyFolder());

	OneInputTransformation<Tuple2<String, Integer>, Tuple3<String, String, Integer>> transform =
			(OneInputTransformation<Tuple2<String, Integer>, Tuple3<String, String, Integer>>) window1.getTransformation();
	OneInputStreamOperator<Tuple2<String, Integer>, Tuple3<String, String, Integer>> operator = transform.getOperator();
	Assert.assertTrue(operator instanceof EvictingWindowOperator);
	EvictingWindowOperator<String, Tuple2<String, Integer>, ?, ?> winOperator = (EvictingWindowOperator<String, Tuple2<String, Integer>, ?, ?>) operator;
	Assert.assertTrue(winOperator.getTrigger() instanceof EventTimeTrigger);
	Assert.assertTrue(winOperator.getWindowAssigner() instanceof SlidingEventTimeWindows);
	Assert.assertTrue(winOperator.getEvictor() instanceof CountEvictor);
	Assert.assertTrue(winOperator.getStateDescriptor() instanceof ListStateDescriptor);

	winOperator.setOutputType((TypeInformation) window1.getType(), new ExecutionConfig());
	processElementAndEnsureOutput(winOperator, winOperator.getKeySelector(), BasicTypeInfo.STRING_TYPE_INFO, new Tuple2<>("hello", 1));
}
 
Developer ID: axbaretto, Project: flink, Lines: 27, Source: AllWindowTranslationTest.java

Example 11: getQualifierKeys

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
/**
 * Returns the HBase identifiers of all registered column qualifiers for a specific column family.
 *
 * @param family The name of the column family for which the column qualifier identifiers are returned.
 * @return The HBase identifiers of all registered column qualifiers for a specific column family.
 */
byte[][] getQualifierKeys(String family) {
	Map<String, TypeInformation<?>> qualifierMap = familyMap.get(family);

	if (qualifierMap == null) {
		throw new IllegalArgumentException("Family " + family + " does not exist in schema.");
	}
	Charset c = Charset.forName(charset);

	byte[][] qualifierKeys = new byte[qualifierMap.size()][];
	int i = 0;
	for (String name : qualifierMap.keySet()) {
		qualifierKeys[i++] = name.getBytes(c);
	}
	return qualifierKeys;
}
 
Developer ID: axbaretto, Project: flink, Lines: 22, Source: HBaseTableSchema.java

Example 12: testTumblingEventTimeWindowsReduce

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@Test
@SuppressWarnings("unchecked")
public void testTumblingEventTimeWindowsReduce() throws Exception {
	closeCalled.set(0);

	final int windowSize = 3;

	TypeInformation<Tuple2<String, Integer>> inputType = TypeInfoParser.parse("Tuple2<String, Integer>");

	ReducingStateDescriptor<Tuple2<String, Integer>> stateDesc = new ReducingStateDescriptor<>("window-contents",
			new SumReducer(),
			inputType.createSerializer(new ExecutionConfig()));

	WindowOperator<String, Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple2<String, Integer>, TimeWindow> operator = new WindowOperator<>(
			TumblingEventTimeWindows.of(Time.of(windowSize, TimeUnit.SECONDS)),
			new TimeWindow.Serializer(),
			new TupleKeySelector(),
			BasicTypeInfo.STRING_TYPE_INFO.createSerializer(new ExecutionConfig()),
			stateDesc,
			new InternalSingleValueWindowFunction<>(new PassThroughWindowFunction<String, TimeWindow, Tuple2<String, Integer>>()),
			EventTimeTrigger.create(),
			0,
			null /* late data output tag */);

	testTumblingEventTimeWindows(operator);
}
 
Developer ID: axbaretto, Project: flink, Lines: 27, Source: WindowOperatorTest.java
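TypeInfoParser.parse, used above, was deprecated in later Flink releases; to the best of my knowledge the TypeHint form below yields an equivalent TypeInformation:

TypeInformation<Tuple2<String, Integer>> inputType =
    TypeInformation.of(new TypeHint<Tuple2<String, Integer>>() {});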

Example 13: appendKeyExtractor

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@SuppressWarnings("unchecked")
public static <T, K1, K2> org.apache.flink.api.common.operators.Operator<Tuple3<K1, K2, T>> appendKeyExtractor(
		org.apache.flink.api.common.operators.Operator<T> input,
		SelectorFunctionKeys<T, K1> key1,
		SelectorFunctionKeys<T, K2> key2) {

	TypeInformation<T> inputType = key1.getInputType();
	TypeInformation<Tuple3<K1, K2, T>> typeInfoWithKey = createTypeWithKey(key1, key2);
	TwoKeyExtractingMapper<T, K1, K2> extractor =
			new TwoKeyExtractingMapper<>(key1.getKeyExtractor(), key2.getKeyExtractor());

	MapOperatorBase<T, Tuple3<K1, K2, T>, MapFunction<T, Tuple3<K1, K2, T>>> mapper =
			new MapOperatorBase<T, Tuple3<K1, K2, T>, MapFunction<T, Tuple3<K1, K2, T>>>(
					extractor,
					new UnaryOperatorInformation<>(inputType, typeInfoWithKey),
					"Key Extractor"
			);

	mapper.setInput(input);
	mapper.setParallelism(input.getParallelism());

	return mapper;
}
 
Developer ID: axbaretto, Project: flink, Lines: 24, Source: KeyFunctions.java

Example 14: testBasicType

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
@SuppressWarnings({ "rawtypes", "unchecked" })
@Test
public void testBasicType() {
	// use getGroupReduceReturnTypes()
	RichGroupReduceFunction<?, ?> function = new RichGroupReduceFunction<Boolean, Boolean>() {
		private static final long serialVersionUID = 1L;

		@Override
		public void reduce(Iterable<Boolean> values, Collector<Boolean> out) throws Exception {
			// nothing to do
		}
	};

	TypeInformation<?> ti = TypeExtractor.getGroupReduceReturnTypes(function, (TypeInformation) TypeInfoParser.parse("Boolean"));

	Assert.assertTrue(ti.isBasicType());
	Assert.assertEquals(BasicTypeInfo.BOOLEAN_TYPE_INFO, ti);
	Assert.assertEquals(Boolean.class, ti.getTypeClass());

	// use getForClass()
	Assert.assertTrue(TypeExtractor.getForClass(Boolean.class).isBasicType());
	Assert.assertEquals(ti, TypeExtractor.getForClass(Boolean.class));

	// use getForObject()
	Assert.assertEquals(BasicTypeInfo.BOOLEAN_TYPE_INFO, TypeExtractor.getForObject(true));
}
 
Developer ID: axbaretto, Project: flink, Lines: 27, Source: TypeExtractorTest.java

Example 15: asFlinkTuples

import org.apache.flink.api.common.typeinfo.TypeInformation; // import the required package/class
/**
 * Specifies that the InputFormat returns Flink tuples instead of
 * {@link org.apache.hive.hcatalog.data.HCatRecord}.
 *
 * <p>Note: Flink tuples might only support a limited number of fields (depending on the API).
 *
 * @return This InputFormat.
 * @throws org.apache.hive.hcatalog.common.HCatException
 */
public HCatInputFormatBase<T> asFlinkTuples() throws HCatException {

	// build type information
	int numFields = outputSchema.getFields().size();
	if (numFields > this.getMaxFlinkTupleSize()) {
		throw new IllegalArgumentException("Only up to " + this.getMaxFlinkTupleSize() +
				" fields can be returned as Flink tuples.");
	}

	TypeInformation[] fieldTypes = new TypeInformation[numFields];
	fieldNames = new String[numFields];
	for (String fieldName : outputSchema.getFieldNames()) {
		HCatFieldSchema field = outputSchema.get(fieldName);

		int fieldPos = outputSchema.getPosition(fieldName);
		TypeInformation fieldType = getFieldType(field);

		fieldTypes[fieldPos] = fieldType;
		fieldNames[fieldPos] = fieldName;

	}
	this.resultType = new TupleTypeInfo(fieldTypes);

	return this;
}
 
Developer ID: axbaretto, Project: flink, Lines: 35, Source: HCatInputFormatBase.java
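When the field types are known statically, the same TupleTypeInfo can be built directly; a minimal sketch (TupleTypeInfo accepts the element TypeInformation values as varargs):

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.TupleTypeInfo;

TupleTypeInfo<Tuple2<Integer, String>> resultType = new TupleTypeInfo<>(
    BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO);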


Note: The org.apache.flink.api.common.typeinfo.TypeInformation class examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The code snippets are drawn from open-source projects contributed by their respective authors; copyright of the source code remains with the original authors, and distribution and use are subject to each project's License. Do not reproduce without permission.