

Java JoinFunction Class Code Examples

This article collects typical usage examples of the Java class org.apache.flink.api.common.functions.JoinFunction. If you are unsure what the JoinFunction class is for, how to use it, or simply want to see it in real code, the curated examples below may help.


The JoinFunction class belongs to the org.apache.flink.api.common.functions package. Fifteen code examples of the class are shown below; by default they are ordered by popularity.
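
Before the collected examples, here is a minimal, self-contained sketch of the typical JoinFunction usage pattern in the DataSet API: each pair of elements with the same key produces exactly one output element. The class name SimpleJoinExample and the sample data are invented for illustration.

import org.apache.flink.api.common.functions.JoinFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.tuple.Tuple3;

public class SimpleJoinExample {
	public static void main(String[] args) throws Exception {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

		// Two small in-memory data sets keyed by an integer id.
		DataSet<Tuple2<Integer, String>> names = env.fromElements(
				new Tuple2<>(1, "alice"), new Tuple2<>(2, "bob"));
		DataSet<Tuple2<Integer, Integer>> scores = env.fromElements(
				new Tuple2<>(1, 90), new Tuple2<>(2, 75));

		// The JoinFunction turns each matching (name, score) pair into one result record.
		DataSet<Tuple3<Integer, String, Integer>> joined = names
				.join(scores)
				.where(0)
				.equalTo(0)
				.with(new JoinFunction<Tuple2<Integer, String>, Tuple2<Integer, Integer>, Tuple3<Integer, String, Integer>>() {
					@Override
					public Tuple3<Integer, String, Integer> join(Tuple2<Integer, String> first, Tuple2<Integer, Integer> second) {
						return new Tuple3<>(first.f0, first.f1, second.f1);
					}
				});

		joined.print();
	}
}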

Example 1: executeTaskWithGenerator

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
private void executeTaskWithGenerator(
		JoinFunction<Tuple2<Integer, Integer>, Tuple2<Integer, Integer>, Tuple2<Integer, Integer>> joiner,
		int keys, int vals, int msecsTillCanceling, int maxTimeTillCanceled) throws Exception {
	UniformIntTupleGenerator g = new UniformIntTupleGenerator(keys, vals, false);
	ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
	DataSet<Tuple2<Integer, Integer>> input1 = env.createInput(new UniformIntTupleGeneratorInputFormat(keys, vals));
	DataSet<Tuple2<Integer, Integer>> input2 = env.createInput(new UniformIntTupleGeneratorInputFormat(keys, vals));

	input1.join(input2, JoinOperatorBase.JoinHint.REPARTITION_SORT_MERGE)
			.where(0)
			.equalTo(0)
			.with(joiner)
			.output(new DiscardingOutputFormat<Tuple2<Integer, Integer>>());

	env.setParallelism(parallelism);

	runAndCancelJob(env.createProgramPlan(), msecsTillCanceling, maxTimeTillCanceled);
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 19, Source: JoinCancelingITCase.java

Example 2: getJoinReturnTypes

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
@PublicEvolving
public static <IN1, IN2, OUT> TypeInformation<OUT> getJoinReturnTypes(JoinFunction<IN1, IN2, OUT> joinInterface,
		TypeInformation<IN1> in1Type, TypeInformation<IN2> in2Type, String functionName, boolean allowMissing)
{
	return getBinaryOperatorReturnType(
		(Function) joinInterface,
		JoinFunction.class,
		0,
		1,
		2,
		new int[]{0},
		new int[]{1},
		NO_INDEX,
		in1Type,
		in2Type,
		functionName,
		allowMissing);
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 19, Source: TypeExtractor.java

Example 3: testInputInferenceWithCustomTupleAndRichFunction

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
@Test
public void testInputInferenceWithCustomTupleAndRichFunction() {
	JoinFunction<CustomTuple2WithArray<Long>, CustomTuple2WithArray<Long>, CustomTuple2WithArray<Long>> function = new JoinWithCustomTuple2WithArray<>();

	TypeInformation<?> ti = TypeExtractor.getJoinReturnTypes(
		function,
		new TypeHint<CustomTuple2WithArray<Long>>(){}.getTypeInfo(),
		new TypeHint<CustomTuple2WithArray<Long>>(){}.getTypeInfo());

	Assert.assertTrue(ti.isTupleType());
	TupleTypeInfo<?> tti = (TupleTypeInfo<?>) ti;
	Assert.assertEquals(BasicTypeInfo.LONG_TYPE_INFO, tti.getTypeAt(1));

	Assert.assertTrue(tti.getTypeAt(0) instanceof ObjectArrayTypeInfo<?, ?>);
	ObjectArrayTypeInfo<?, ?> oati = (ObjectArrayTypeInfo<?, ?>) tti.getTypeAt(0);
	Assert.assertEquals(BasicTypeInfo.LONG_TYPE_INFO, oati.getComponentInfo());
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 18, Source: TypeExtractorTest.java
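
The helper types CustomTuple2WithArray and JoinWithCustomTuple2WithArray are not part of the excerpt. A plausible shape, consistent with the test's assertions (field 0 is an array of the generic parameter, field 1 is a Long), might look like the following; the actual definitions nested in TypeExtractorTest may differ.

import org.apache.flink.api.common.functions.JoinFunction;
import org.apache.flink.api.java.tuple.Tuple2;

// Field 0 holds an array of V, field 1 holds a Long, matching the assertions above.
class CustomTuple2WithArray<V> extends Tuple2<V[], Long> {
}

// Identity-style join that keeps the left element; the generic types are what matters for the test.
class JoinWithCustomTuple2WithArray<V> implements
		JoinFunction<CustomTuple2WithArray<V>, CustomTuple2WithArray<V>, CustomTuple2WithArray<V>> {
	@Override
	public CustomTuple2WithArray<V> join(CustomTuple2WithArray<V> first, CustomTuple2WithArray<V> second) {
		return first;
	}
}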

Example 4: apply

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
/**
 * Completes the join operation with the user function that is executed
 * for each combination of elements with the same key in a window.
 *
 * <p>Note: This method's return type does not support setting an operator-specific parallelism.
 * Due to binary backwards compatibility, this cannot be altered. Use the {@link #with(JoinFunction)}
 * method to set an operator-specific parallelism.
 */
public <T> DataStream<T> apply(JoinFunction<T1, T2, T> function) {
	TypeInformation<T> resultType = TypeExtractor.getBinaryOperatorReturnType(
		function,
		JoinFunction.class,
		0,
		1,
		2,
		new int[]{0},
		new int[]{1},
		TypeExtractor.NO_INDEX,
		input1.getType(),
		input2.getType(),
		"Join",
		false);

	return apply(function, resultType);
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 26, Source: JoinedStreams.java

Example 5: runWindowJoin

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
public static DataStream<Tuple3<String, Integer, Integer>> runWindowJoin(
		DataStream<Tuple2<String, Integer>> grades,
		DataStream<Tuple2<String, Integer>> salaries,
		long windowSize) {

	return grades.join(salaries)
			.where(new NameKeySelector())
			.equalTo(new NameKeySelector())

			.window(TumblingEventTimeWindows.of(Time.milliseconds(windowSize)))

			.apply(new JoinFunction<Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple3<String, Integer, Integer>>() {

				@Override
				public Tuple3<String, Integer, Integer> join(
								Tuple2<String, Integer> first,
								Tuple2<String, Integer> second) {
					return new Tuple3<String, Integer, Integer>(first.f0, first.f1, second.f1);
				}
			});
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 22, Source: WindowJoin.java
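
The NameKeySelector referenced above is not shown in the excerpt. A plausible definition, assuming both streams are keyed on the name field f0, is:

import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;

// Extracts the name (field f0) so that grades and salaries with the same name are joined.
class NameKeySelector implements KeySelector<Tuple2<String, Integer>, String> {
	@Override
	public String getKey(Tuple2<String, Integer> value) {
		return value.f0;
	}
}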

Example 6: StreamJoinOperator

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
public StreamJoinOperator(JoinFunction<IN1, IN2, OUT> userFunction,
                          KeySelector<IN1, K> keySelector1,
                          KeySelector<IN2, K> keySelector2,
                          long stream1WindowLength,
                          long stream2WindowLength,
                          TypeSerializer<IN1> inputSerializer1,
                          TypeSerializer<IN2> inputSerializer2) {
    super(userFunction);
    this.keySelector1 = requireNonNull(keySelector1);
    this.keySelector2 = requireNonNull(keySelector2);

    this.stream1WindowLength = requireNonNull(stream1WindowLength);
    this.stream2WindowLength = requireNonNull(stream2WindowLength);

    this.inputSerializer1 = requireNonNull(inputSerializer1);
    this.inputSerializer2 = requireNonNull(inputSerializer2);
}
 
Developer ID: wangyangjun, Project: StreamBench, Lines of code: 18, Source: StreamJoinOperator.java

Example 7: getJoinReturnTypes

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
public static <IN1, IN2, OUT> TypeInformation<OUT> getJoinReturnTypes(JoinFunction<IN1, IN2, OUT> joinInterface,
		TypeInformation<IN1> in1Type, TypeInformation<IN2> in2Type) {
	return getBinaryOperatorReturnType((Function) joinInterface, JoinFunction.class, false, false, in1Type, in2Type);
}
 
Developer ID: citlab, Project: vs.msc.ws14, Lines of code: 5, Source: TypeExtractor.java

Example 8: EquiJoin

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
public EquiJoin(DataSet<I1> input1, DataSet<I2> input2,
		Keys<I1> keys1, Keys<I2> keys2, FlatJoinFunction<I1, I2, OUT> generatedFunction, JoinFunction<I1, I2, OUT> function,
		TypeInformation<OUT> returnType, JoinHint hint)
{
	super(input1, input2, keys1, keys2, returnType, hint);

	if (function == null) {
		throw new NullPointerException();
	}

	this.function = generatedFunction;

	if (!(generatedFunction instanceof ProjectFlatJoinFunction)) {
		extractSemanticAnnotationsFromUdf(function.getClass());
	} else {
		generateProjectionProperties(((ProjectFlatJoinFunction<?, ?, ?>) generatedFunction));
	}
}
 
Developer ID: citlab, Project: vs.msc.ws14, Lines of code: 19, Source: JoinOperator.java

Example 9: executeTask

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
private void executeTask(JoinFunction<Tuple2<Integer, Integer>, Tuple2<Integer, Integer>, Tuple2<Integer, Integer>> joiner, boolean slow, int parallelism) throws Exception {
	ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
	DataSet<Tuple2<Integer, Integer>> input1 = env.createInput(new InfiniteIntegerTupleInputFormat(slow));
	DataSet<Tuple2<Integer, Integer>> input2 = env.createInput(new InfiniteIntegerTupleInputFormat(slow));

	input1.join(input2, JoinOperatorBase.JoinHint.REPARTITION_SORT_MERGE)
			.where(0)
			.equalTo(0)
			.with(joiner)
			.output(new DiscardingOutputFormat<Tuple2<Integer, Integer>>());

	env.setParallelism(parallelism);

	runAndCancelJob(env.createProgramPlan(), 5 * 1000, 10 * 1000);
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 16, Source: JoinCancelingITCase.java

Example 10: testProgram

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
@Override
protected void testProgram() throws Exception {

	ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

	DataSet<Tuple1<Long>> initialVertices = env.readCsvFile(verticesPath).fieldDelimiter(" ").types(Long.class).name("Vertices");

	DataSet<Tuple2<Long, Long>> edges = env.readCsvFile(edgesPath).fieldDelimiter(" ").types(Long.class, Long.class).name("Edges");

	DataSet<Tuple2<Long, Long>> verticesWithId = initialVertices.map(new MapFunction<Tuple1<Long>, Tuple2<Long, Long>>() {
		@Override
		public Tuple2<Long, Long> map(Tuple1<Long> value) throws Exception {
			return new Tuple2<>(value.f0, value.f0);
		}
	}).name("Assign Vertex Ids");

	DeltaIteration<Tuple2<Long, Long>, Tuple2<Long, Long>> iteration = verticesWithId.iterateDelta(verticesWithId, MAX_ITERATIONS, 0);

	JoinOperator<Tuple2<Long, Long>, Tuple2<Long, Long>, Tuple2<Long, Long>> joinWithNeighbors = iteration.getWorkset()
			.join(edges).where(0).equalTo(0)
			.with(new JoinFunction<Tuple2<Long, Long>, Tuple2<Long, Long>, Tuple2<Long, Long>>() {
				@Override
				public Tuple2<Long, Long> join(Tuple2<Long, Long> first, Tuple2<Long, Long> second) throws Exception {
					return new Tuple2<>(second.f1, first.f1);
				}
			})
			.name("Join Candidate Id With Neighbor");

	CoGroupOperator<Tuple2<Long, Long>, Tuple2<Long, Long>, Tuple2<Long, Long>> minAndUpdate = joinWithNeighbors
			.coGroup(iteration.getSolutionSet()).where(0).equalTo(0)
			.with(new MinIdAndUpdate())
			.name("min Id and Update");

	iteration.closeWith(minAndUpdate, minAndUpdate).writeAsCsv(resultPath, "\n", " ").name("Result");

	env.execute("Workset Connected Components");
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 38, Source: CoGroupConnectedComponentsITCase.java

Example 11: testJoinLambda

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
@Test
public void testJoinLambda() {
	JoinFunction<Tuple2<Tuple1<Integer>, Boolean>, Tuple2<Tuple1<Integer>, Double>, Tuple2<Tuple1<Integer>, String>> f = (i1, i2) -> null;

	TypeInformation<?> ti = TypeExtractor.getJoinReturnTypes(f, TypeInfoParser.parse("Tuple2<Tuple1<Integer>, Boolean>"), TypeInfoParser.parse("Tuple2<Tuple1<Integer>, Double>"));
	if (!(ti instanceof MissingTypeInfo)) {
		Assert.assertTrue(ti.isTupleType());
		Assert.assertEquals(2, ti.getArity());
		Assert.assertTrue(((TupleTypeInfo<?>) ti).getTypeAt(0).isTupleType());
		Assert.assertEquals(((TupleTypeInfo<?>) ti).getTypeAt(1), BasicTypeInfo.STRING_TYPE_INFO);
	}
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 13, Source: LambdaExtractionTest.java
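
When extraction for a lambda does yield MissingTypeInfo, the DataSet API lets the caller supply the result type explicitly before the plan is executed. A hedged sketch of that pattern (class name and sample data invented for illustration):

import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;

public class LambdaJoinWithReturns {
	public static void main(String[] args) throws Exception {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

		DataSet<Tuple2<Integer, Boolean>> left = env.fromElements(Tuple2.of(1, true));
		DataSet<Tuple2<Integer, Double>> right = env.fromElements(Tuple2.of(1, 2.0));

		// The lambda's generic return type is erased, so the extractor may report
		// MissingTypeInfo; returns(...) supplies the result type explicitly.
		DataSet<Tuple2<Integer, String>> joined = left
				.join(right)
				.where(0)
				.equalTo(0)
				.with((l, r) -> Tuple2.of(l.f0, "matched"))
				.returns(new TypeHint<Tuple2<Integer, String>>() {});

		joined.print();
	}
}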

Example 12: EquiJoin

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
public EquiJoin(DataSet<I1> input1, DataSet<I2> input2,
		Keys<I1> keys1, Keys<I2> keys2, FlatJoinFunction<I1, I2, OUT> generatedFunction, JoinFunction<I1, I2, OUT> function,
		TypeInformation<OUT> returnType, JoinHint hint, String joinLocationName, JoinType type) {
	super(input1, input2, keys1, keys2, returnType, hint, type);

	this.joinLocationName = joinLocationName;

	if (function == null) {
		throw new NullPointerException();
	}

	this.function = generatedFunction;

	UdfOperatorUtils.analyzeDualInputUdf(this, JoinFunction.class, joinLocationName, function, keys1, keys2);
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 16, Source: JoinOperator.java

Example 13: with

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
public <R> EquiJoin<I1, I2, R> with(JoinFunction<I1, I2, R> function) {
	if (function == null) {
		throw new NullPointerException("Join function must not be null.");
	}
	FlatJoinFunction<I1, I2, R> generatedFunction = new WrappingFlatJoinFunction<>(clean(function));
	TypeInformation<R> returnType = TypeExtractor.getJoinReturnTypes(function, getInput1Type(), getInput2Type(), Utils.getCallLocationName(), true);
	return new EquiJoin<>(getInput1(), getInput2(), getKeys1(), getKeys2(), generatedFunction, function, returnType, getJoinHint(), Utils.getCallLocationName(), joinType);
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 9, Source: JoinOperator.java
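
For context, the wrapping step above adapts the single-result JoinFunction to the FlatJoinFunction interface used by the runtime operator. A simplified sketch of such an adapter (the name JoinFunctionAdapter is invented; Flink's actual WrappingFlatJoinFunction additionally wires up rich-function lifecycle methods):

import org.apache.flink.api.common.functions.FlatJoinFunction;
import org.apache.flink.api.common.functions.JoinFunction;
import org.apache.flink.util.Collector;

// Forwards each matching pair to the wrapped JoinFunction and emits its single result.
public class JoinFunctionAdapter<I1, I2, OUT> implements FlatJoinFunction<I1, I2, OUT> {

	private final JoinFunction<I1, I2, OUT> wrapped;

	public JoinFunctionAdapter(JoinFunction<I1, I2, OUT> wrapped) {
		this.wrapped = wrapped;
	}

	@Override
	public void join(I1 first, I2 second, Collector<OUT> out) throws Exception {
		out.collect(wrapped.join(first, second));
	}
}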

Example 14: testConnectedComponentsExamplesNeighborWithComponentIDJoin

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
@Test
public void testConnectedComponentsExamplesNeighborWithComponentIDJoin() {
	compareAnalyzerResultWithAnnotationsDualInput(JoinFunction.class, NeighborWithComponentIDJoin.class,
			"Tuple2<Long, Long>",
			"Tuple2<Long, Long>",
			"Tuple2<Long, Long>");
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 8, Source: UdfAnalyzerExamplesTest.java

Example 15: getDistinctEdgeIntersection

import org.apache.flink.api.common.functions.JoinFunction; // import the required package/class
/**
 * Computes the intersection between the edge set and the given edge set. For all matching pairs,
 * only one edge will be in the resulting data set.
 *
 * @param edges edges to compute intersection with
 * @return edge set containing one edge for all matching pairs of the same edge
 */
private DataSet<Edge<K, EV>> getDistinctEdgeIntersection(DataSet<Edge<K, EV>> edges) {
	return this.getEdges()
			.join(edges)
			.where(0, 1, 2)
			.equalTo(0, 1, 2)
			.with(new JoinFunction<Edge<K, EV>, Edge<K, EV>, Edge<K, EV>>() {
				@Override
				public Edge<K, EV> join(Edge<K, EV> first, Edge<K, EV> second) throws Exception {
					return first;
				}
			}).withForwardedFieldsFirst("*").name("Intersect edges")
			.distinct()
				.name("Edges");
}
 
Developer ID: axbaretto, Project: flink, Lines of code: 22, Source: Graph.java
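
The same pattern can be reproduced on plain tuples: join on every field, keep the left element, declare it as forwarded, and de-duplicate. A minimal sketch (class name and sample data invented for illustration):

import org.apache.flink.api.common.functions.JoinFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple3;

public class EdgeIntersectionSketch {
	public static void main(String[] args) throws Exception {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

		// Edges modeled as (source, target, value) triples.
		DataSet<Tuple3<Long, Long, Long>> first = env.fromElements(
				Tuple3.of(1L, 2L, 10L), Tuple3.of(2L, 3L, 20L));
		DataSet<Tuple3<Long, Long, Long>> second = env.fromElements(
				Tuple3.of(1L, 2L, 10L), Tuple3.of(3L, 4L, 30L));

		// Join on all three fields, keep the left element, forward it unchanged, then de-duplicate.
		DataSet<Tuple3<Long, Long, Long>> intersection = first
				.join(second)
				.where(0, 1, 2)
				.equalTo(0, 1, 2)
				.with(new JoinFunction<Tuple3<Long, Long, Long>, Tuple3<Long, Long, Long>, Tuple3<Long, Long, Long>>() {
					@Override
					public Tuple3<Long, Long, Long> join(Tuple3<Long, Long, Long> left, Tuple3<Long, Long, Long> right) {
						return left;
					}
				})
				.withForwardedFieldsFirst("*")
				.distinct();

		intersection.print();
	}
}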


Note: The org.apache.flink.api.common.functions.JoinFunction examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are taken from open-source projects contributed by their respective authors, who retain copyright over the source code; consult each project's license before distributing or reusing the code. Do not republish without permission.