

Java IterativeStream.closeWith Method Code Examples

This article collects typical usage examples of the Java method org.apache.flink.streaming.api.datastream.IterativeStream.closeWith. If you are wondering what IterativeStream.closeWith does, how to call it, or what real-world usages look like, the selected code examples below may help. You can also browse further usage examples of the enclosing class, org.apache.flink.streaming.api.datastream.IterativeStream.


The following presents 5 code examples of the IterativeStream.closeWith method, sorted by popularity by default.
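Before turning to the examples from real projects, here is a minimal, self-contained sketch of the basic iterate()/closeWith() pattern. It is not taken from any of the projects below: the class name, the decrement/filter logic and the 5000 ms iteration timeout are illustrative assumptions.

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.IterativeStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Illustrative sketch only; names and logic are not taken from the examples below.
public class CloseWithSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // The feedback stream must run with the same parallelism as the iteration head,
        // so keep everything at parallelism 1 for this sketch.
        env.setParallelism(1);

        DataStream<Long> input = env.fromElements(8L, 5L, 3L);

        // Open an iteration head; it terminates after 5 seconds without feedback data.
        IterativeStream<Long> iteration = input.iterate(5000);

        // The feedback stream has to be derived from the iteration head itself.
        DataStream<Long> feedback = iteration
                .map(new MapFunction<Long, Long>() {
                    @Override
                    public Long map(Long value) {
                        return value - 1;
                    }
                })
                .filter(new FilterFunction<Long>() {
                    @Override
                    public boolean filter(Long value) {
                        return value > 0;
                    }
                });

        // closeWith() wires the feedback stream back to the head of the iteration.
        iteration.closeWith(feedback);

        // Records entering the iteration (original input plus feedback) also flow downstream as usual.
        iteration.print();

        env.execute("IterativeStream.closeWith sketch");
    }
}

The key constraint illustrated here, and in Examples 2 and 5 below, is that the feedback stream passed to closeWith() must be derived from the iteration head and must match its parallelism.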

Example 1: main

import org.apache.flink.streaming.api.datastream.IterativeStream; // import the package/class the method depends on
public static void main(String[] args) throws Exception {

		// Set up the environment
		if(!parseParameters(args)) {
			return;
		}

		StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

		DataStream<Tuple2<Long, Long>> edges = getEdgesDataSet(env);

		IterativeStream<Tuple2<Long, Long>> iteration = edges.iterate();
		DataStream<Tuple2<Long, Long>> result = iteration.closeWith(
				iteration.keyBy(0).flatMap(new AssignComponents()));

		// Emit the results
		result.print();

		env.execute("Streaming Connected Components");
	}
 
Developer: vasia, Project: gelly-streaming, Lines: 21, Source: IterativeConnectedComponents.java

Example 2: testClosingFromOutOfLoop

import org.apache.flink.streaming.api.datastream.IterativeStream; // import the package/class the method depends on
@Test(expected = UnsupportedOperationException.class)
public void testClosingFromOutOfLoop() throws Exception {

	// this test verifies that we cannot close an iteration with a DataStream that does not
	// have the iteration in its predecessors

	StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

	// introduce dummy mapper to get to correct parallelism
	DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);

	IterativeStream<Integer> iter1 = source.iterate();
	IterativeStream<Integer> iter2 = source.iterate();

	iter2.closeWith(iter1.map(noOpIntMap));

}
 
Developer: axbaretto, Project: flink, Lines: 18, Source: IterateITCase.java

Example 3: main

import org.apache.flink.streaming.api.datastream.IterativeStream; // import the package/class the method depends on
public static void main(String[] args) throws Exception {

		// Checking input parameters
		final ParameterTool params = ParameterTool.fromArgs(args);

		// set up input for the stream of integer pairs

		// obtain execution environment and set setBufferTimeout to 1 to enable
		// continuous flushing of the output buffers (lowest latency)
		StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment()
				.setBufferTimeout(1);

		// make parameters available in the web interface
		env.getConfig().setGlobalJobParameters(params);

		// create input stream of integer pairs
		DataStream<Tuple2<Integer, Integer>> inputStream;
		if (params.has("input")) {
			inputStream = env.readTextFile(params.get("input")).map(new FibonacciInputMap());
		} else {
			System.out.println("Executing Iterate example with default input data set.");
			System.out.println("Use --input to specify file input.");
			inputStream = env.addSource(new RandomFibonacciSource());
		}

		// create an iterative data stream from the input with 5 second timeout
		IterativeStream<Tuple5<Integer, Integer, Integer, Integer, Integer>> it = inputStream.map(new InputMap())
				.iterate(5000);

		// apply the step function to get the next Fibonacci number
		// increment the counter and split the output with the output selector
		SplitStream<Tuple5<Integer, Integer, Integer, Integer, Integer>> step = it.map(new Step())
				.split(new MySelector());

		// close the iteration by selecting the tuples that were directed to the
		// 'iterate' channel in the output selector
		it.closeWith(step.select("iterate"));

		// to produce the final output select the tuples directed to the
		// 'output' channel and map them back to the original input pair
		// together with their iteration counter
		DataStream<Tuple2<Tuple2<Integer, Integer>, Integer>> numbers = step.select("output")
				.map(new OutputMap());

		// emit results
		if (params.has("output")) {
			numbers.writeAsText(params.get("output"));
		} else {
			System.out.println("Printing result to stdout. Use --output to specify output path.");
			numbers.print();
		}

		// execute the program
		env.execute("Streaming Iteration Example");
	}
 
Developer: axbaretto, Project: flink, Lines: 56, Source: IterateExample.java
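The MySelector used above implements Flink's OutputSelector interface (org.apache.flink.streaming.api.collector.selector.OutputSelector); its body is not shown in the snippet. The following is a hypothetical sketch of such a selector, assuming the current Fibonacci values live in fields f2/f3 of the Tuple5 and that a constant BOUND caps the iteration:

import java.util.ArrayList;
import java.util.List;

import org.apache.flink.api.java.tuple.Tuple5;
import org.apache.flink.streaming.api.collector.selector.OutputSelector;

// Hypothetical selector: the actual MySelector in IterateExample may differ in details.
public class MySelectorSketch
        implements OutputSelector<Tuple5<Integer, Integer, Integer, Integer, Integer>> {

    // Assumed threshold at which a pair stops iterating.
    private static final int BOUND = 100;

    @Override
    public Iterable<String> select(Tuple5<Integer, Integer, Integer, Integer, Integer> value) {
        List<String> output = new ArrayList<>();
        // Route the tuple back into the loop while the current values (assumed to be f2/f3)
        // stay below BOUND; otherwise direct it to the final "output" channel.
        if (value.f2 < BOUND && value.f3 < BOUND) {
            output.add("iterate");
        } else {
            output.add("output");
        }
        return output;
    }
}

Note that split()/OutputSelector were deprecated in later Flink releases in favor of side outputs, so newer code would typically route the feedback with a ProcessFunction and an OutputTag instead.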

Example 4: testDoubleClosing

import org.apache.flink.streaming.api.datastream.IterativeStream; // import the package/class the method depends on
@Test
public void testDoubleClosing() throws Exception {

	StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

	// introduce dummy mapper to get to correct parallelism
	DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);

	IterativeStream<Integer> iter1 = source.iterate();

	iter1.closeWith(iter1.map(noOpIntMap));
	iter1.closeWith(iter1.map(noOpIntMap));
}
 
Developer: axbaretto, Project: flink, Lines: 14, Source: IterateITCase.java
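As this test suggests, calling closeWith more than once on the same IterativeStream is permitted: each call appears to register an additional feedback stream, and all feedback edges are fed back into the same iteration head.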

Example 5: testDifferingParallelism

import org.apache.flink.streaming.api.datastream.IterativeStream; // import the package/class the method depends on
@Test(expected = UnsupportedOperationException.class)
public void testDifferingParallelism() throws Exception {

	StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

	// introduce dummy mapper to get to correct parallelism
	DataStream<Integer> source = env.fromElements(1, 10)
			.map(noOpIntMap);

	IterativeStream<Integer> iter1 = source.iterate();

	iter1.closeWith(iter1.map(noOpIntMap).setParallelism(parallelism / 2));

}
 
Developer: axbaretto, Project: flink, Lines: 15, Source: IterateITCase.java


Note: The org.apache.flink.streaming.api.datastream.IterativeStream.closeWith method examples in this article were collected by 纯净天空 from GitHub/MSDocs and other open-source code and documentation platforms. The code snippets are drawn from open-source projects contributed by their authors; the source code remains copyrighted by its original authors, and any distribution or use should follow the corresponding project's license. Please do not reproduce without permission.