

Java MapReduceDriver.run Method Code Examples

This article collects typical usage examples of the Java method org.apache.hadoop.mrunit.mapreduce.MapReduceDriver.run. If you are wondering how MapReduceDriver.run is used in practice, or want to see real code that calls it, the curated examples below should help. You can also browse further usage examples of the enclosing class, org.apache.hadoop.mrunit.mapreduce.MapReduceDriver.


The sections below present 4 code examples of the MapReduceDriver.run method, sorted by popularity by default. If you find an example helpful, you can upvote it; your feedback helps the site recommend better Java code examples.
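Before the curated examples, here is a minimal standalone sketch of the typical MRUnit pattern around MapReduceDriver.run. It is not taken from any of the projects below; it assumes MRUnit 1.x and borrows Hadoop's stock TokenCounterMapper and IntSumReducer to form a hypothetical word-count pipeline.

import java.util.List;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.map.TokenCounterMapper;
import org.apache.hadoop.mapreduce.lib.reduce.IntSumReducer;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;
import org.apache.hadoop.mrunit.types.Pair;

public class MapReduceDriverRunSketch {
    public static void main(String[] args) throws Exception {
        // One driver runs the whole map -> shuffle -> reduce pipeline in memory.
        MapReduceDriver<Object, Text, Text, IntWritable, Text, IntWritable> driver =
                new MapReduceDriver<Object, Text, Text, IntWritable, Text, IntWritable>();
        driver.setMapper(new TokenCounterMapper());    // Mapper<Object, Text, Text, IntWritable>
        driver.setReducer(new IntSumReducer<Text>());  // Reducer<Text, IntWritable, Text, IntWritable>

        driver.withInput(new LongWritable(0), new Text("hadoop mrunit hadoop"));

        // run() executes the pipeline and returns the reducer's output pairs
        // so the test can inspect them directly.
        List<Pair<Text, IntWritable>> output = driver.run();
        for (Pair<Text, IntWritable> pair : output) {
            System.out.println(pair.getFirst() + " -> " + pair.getSecond());
        }
    }
}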

Example 1: verifyMapReduce

import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver; // import the package/class this method depends on
public static void verifyMapReduce(SmartMapper mapper, SmartReducer reducer, Object key, Object input)
    throws Exception
{
  // First run the mapper alone so its intermediate output can be captured.
  MapDriver mapDriver = new MapDriver();
  mapDriver.setMapper(mapper);
  MapReduceDriver mapReduceDriver = new MapReduceDriver();
  mapReduceDriver.setMapper(mapper);
  // Wrap the plain key/value in the Writable types the mapper declares.
  Object writableKey = WritableUtils.createWritable(key, mapper.getKeyInType());
  Object writableValue = WritableUtils.createWritable(input, mapper.getValueInType());
  mapDriver.withInput(writableKey, writableValue);
  List results = mapDriver.run();
  Collections.sort(results, PairComparer.INSTANCE);
  // Then run the full map -> reduce pipeline on a freshly created driver with the same input.
  mapReduceDriver = new MapReduceDriver<LongWritable, Text, Text, LongWritable, Text, LongWritable>();
  writableKey = WritableUtils.createWritable(key, mapper.getKeyInType());
  writableValue = WritableUtils.createWritable(input, mapper.getValueInType());
  mapReduceDriver.withInput(writableKey, writableValue);
  mapReduceDriver.setMapper(mapper);
  mapReduceDriver.setReducer(reducer);
  List finalResults = mapReduceDriver.run();
  // Render both stages as text and approve the result with ApprovalTests.
  String text = String.format("[%s]\n\n -> maps via %s to -> \n\n%s\n\n -> reduces via %s to -> \n\n%s", input,
      mapper.getClass().getSimpleName(), ArrayUtils.toString(results, Echo.INSTANCE),
      reducer.getClass().getSimpleName(), ArrayUtils.toString(finalResults, Echo.INSTANCE));
  Approvals.verify(text);
}
 
Developer: approvals | Project: ApprovalTests.Java | Lines: 25 | Source file: HadoopApprovals.java
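All four examples in this article call run() so they can post-process the output pairs themselves (approval testing above, collecting vertices into maps below). When the expected output is known up front, MRUnit also lets a driver declare it with withOutput() and assert it with runTest(). The fragment below is a sketch of that variant, assuming a fresh driver wired with the same hypothetical TokenCounterMapper/IntSumReducer as in the sketch near the top of this article.

        // Alternative to run(): declare the expected output, then let MRUnit assert it.
        driver.withInput(new LongWritable(0), new Text("hadoop mrunit hadoop"));
        driver.withOutput(new Text("hadoop"), new IntWritable(2));
        driver.withOutput(new Text("mrunit"), new IntWritable(1));
        driver.runTest(); // fails the test if the actual output differs from the declared pairs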

Example 2: runWithGraph

import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver; // import the package/class this method depends on
public static Map<Long, FaunusVertex> runWithGraph(final Map<Long, FaunusVertex> graph, final MapReduceDriver driver) throws IOException {
    // Clear any state left over from a previous run of this driver.
    driver.resetOutput();
    driver.resetExpectedCounters();
    driver.getConfiguration().setBoolean(HadoopCompiler.TESTING, true);
    // Feed every vertex of the in-memory graph to the driver as a (NullWritable, vertex) input pair.
    for (final FaunusVertex vertex : graph.values()) {
        driver.withInput(NullWritable.get(), vertex);
    }

    // Collect the driver's output vertices back into a map keyed by vertex id.
    final Map<Long, FaunusVertex> map = new HashMap<Long, FaunusVertex>();
    for (final Object pair : driver.run()) {
        map.put(((Pair<NullWritable, FaunusVertex>) pair).getSecond().getLongId(), ((Pair<NullWritable, FaunusVertex>) pair).getSecond());
    }
    return map;
}
 
Developer: graben1437 | Project: titan0.5.4-hbase1.1.1-custom | Lines: 15 | Source file: BaseTest.java

Example 3: runWithGraphNoIndex

import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver; // import the package/class this method depends on
public static List runWithGraphNoIndex(final Map<Long, FaunusVertex> graph, final MapReduceDriver driver) throws IOException {
    // Reset the driver, feed it every vertex of the in-memory graph, and return the raw output pairs.
    driver.resetOutput();
    driver.resetExpectedCounters();
    driver.getConfiguration().setBoolean(HadoopCompiler.TESTING, true);
    for (final Vertex vertex : graph.values()) {
        driver.withInput(NullWritable.get(), vertex);
    }
    return driver.run();
}
 
Developer: graben1437 | Project: titan0.5.4-hbase1.1.1-custom | Lines: 10 | Source file: BaseTest.java

Example 4: run

import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver; // import the package/class this method depends on
public static Map<Long, FaunusVertex> run(final MapReduceDriver driver) throws IOException {
    // Execute the already-configured driver and index its output vertices by id.
    final Map<Long, FaunusVertex> map = new HashMap<Long, FaunusVertex>();
    for (final Object object : driver.run()) {
        Pair<NullWritable, FaunusVertex> pair = (Pair<NullWritable, FaunusVertex>) object;
        map.put(pair.getSecond().getLongId(), pair.getSecond());
    }
    return map;
}
 
Developer: graben1437 | Project: titan0.5.4-hbase1.1.1-custom | Lines: 9 | Source file: BaseTest.java


Note: The org.apache.hadoop.mrunit.mapreduce.MapReduceDriver.run method examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective authors; copyright of the source code remains with the original authors, and distribution and use should follow the corresponding project's License. Please do not reproduce without permission.