

Java Export Class Code Examples

This article collects typical usage examples of the Java class org.apache.hadoop.hbase.mapreduce.Export. If you are wondering what the Export class does, how to use it, or want to see it in real code, the curated examples below may help.


The Export class belongs to the org.apache.hadoop.hbase.mapreduce package. Three code examples are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code samples.

Example 1: main

import org.apache.hadoop.hbase.mapreduce.Export; // import the required package/class
public static void main(String[] args) {
  ProgramDriver programDriver = new ProgramDriver();
  int exitCode = -1;
  try {
    programDriver.addClass("wordcount-hbase", WordCountHBase.class,
        "A map/reduce program that counts the words in the input files.");
    programDriver.addClass("export-table", Export.class,
        "A map/reduce program that exports a table to a file.");
    //programDriver.addClass("cellcounter", CellCounter.class, "Count them cells!");
    // ProgramDriver.driver() internally calls run(), so calling both would
    // execute the selected program twice; run() alone is sufficient.
    exitCode = programDriver.run(args);
  } catch (Throwable e) {
    e.printStackTrace();
  }
  System.exit(exitCode);
}
 
Author: GoogleCloudPlatform, Project: cloud-bigtable-examples, Lines: 17, Source: WordCountDriver.java
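The ProgramDriver pattern above is essentially a name-to-tool registry: one jar exposes several entry points, and the first command-line argument selects which one runs. As a rough, dependency-free sketch of that dispatch logic (the `MiniDriver` class and its method names are illustrative stand-ins, not part of the Hadoop API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal stand-in for Hadoop's ProgramDriver: maps a program name to a
// callable "tool" so one jar can expose several entry points.
class MiniDriver {
  private final Map<String, Function<String[], Integer>> programs = new HashMap<>();

  // Register a tool under a short name (analogous to ProgramDriver.addClass).
  void addProgram(String name, Function<String[], Integer> tool) {
    programs.put(name, tool);
  }

  // Dispatch on args[0], passing the remaining args to the chosen tool.
  // Returns the tool's exit code, or -1 when the name is unknown.
  int run(String[] args) {
    if (args.length == 0 || !programs.containsKey(args[0])) {
      System.err.println("Unknown program; valid names: " + programs.keySet());
      return -1;
    }
    String[] rest = new String[args.length - 1];
    System.arraycopy(args, 1, rest, 0, rest.length);
    return programs.get(args[0]).apply(rest);
  }
}
```

With this shape, a caller would register each tool once (e.g. under "export-table") and then hand the raw `main` arguments to `run`, mirroring how the example above wires `Export.class` into the driver.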

Example 2: main

import org.apache.hadoop.hbase.mapreduce.Export; // import the required package/class
public static void main(String[] args) {
  ProgramDriver programDriver = new ProgramDriver();
  int exitCode = -1;
  try {
    programDriver.addClass("export-table", Export.class,
        "A map/reduce program that exports a table to a file.");
    programDriver.addClass("import-table", Import.class,
        "A map/reduce program that imports a table from a file.");
    // ProgramDriver.driver() internally calls run(), so calling both would
    // execute the selected program twice; run() alone is sufficient.
    exitCode = programDriver.run(args);
  } catch (Throwable e) {
    e.printStackTrace();
  }
  System.exit(exitCode);
}
 
Author: dmmcerlean, Project: cloud-bigtable-client, Lines: 16, Source: Driver.java

Example 3: testMapReduce

import org.apache.hadoop.hbase.mapreduce.Export; // import the required package/class
import org.apache.hadoop.hbase.mapreduce.Import; // also used below for the import job
@Test
@Category(KnownGap.class)
public void testMapReduce() throws IOException, ClassNotFoundException, InterruptedException {
  Table oldTable = getConnection().getTable(TABLE_NAME);

  // Put a value.
  byte[] rowKey = dataHelper.randomData("testrow-");
  byte[] qual = dataHelper.randomData("testQualifier-");
  byte[] value = dataHelper.randomData("testValue-");
  Put put = new Put(rowKey);
  put.addColumn(COLUMN_FAMILY, qual, value);
  oldTable.put(put);

  // Assert the value is there.
  Get get = new Get(rowKey);
  Result result = oldTable.get(get);
  List<Cell> cells = result.listCells();
  Assert.assertEquals(1, cells.size());
  Assert.assertArrayEquals(value, CellUtil.cloneValue(cells.get(0)));

  // Run the export.
  Configuration conf = getConnection().getConfiguration();

  //conf.set("fs.defaultFS", "file:///");
  FileSystem dfs = IntegrationTests.getMiniCluster().getFileSystem();
  String tempDir = "hdfs://" + dfs.getCanonicalServiceName() + "/tmp/backup";

  String[] args = new String[]{
      TABLE_NAME.getNameAsString(),
      tempDir
  };
  Job job = Export.createSubmittableJob(conf, args);
  // So it looks for jars in the local FS, not HDFS.
  job.getConfiguration().set("fs.defaultFS", "file:///");
  Assert.assertTrue(job.waitForCompletion(true));

  // Create new table.
  TableName newTableName = IntegrationTests.newTestTableName();
  Table newTable = getConnection().getTable(newTableName);

  // TODO: move this table setup into a helper method in IntegrationTests.
  Admin admin = getConnection().getAdmin();
  HColumnDescriptor hcd = new HColumnDescriptor(IntegrationTests.COLUMN_FAMILY);
  HTableDescriptor htd = new HTableDescriptor(newTableName);
  htd.addFamily(hcd);
  admin.createTable(htd);

  // Run the import.
  args = new String[]{
      newTableName.getNameAsString(),
      tempDir
  };
  job = Import.createSubmittableJob(conf, args);
  job.getConfiguration().set("fs.defaultFS", "file:///");
  Assert.assertTrue(job.waitForCompletion(true));

  // Assert the value is there.
  get = new Get(rowKey);
  result = newTable.get(get);
  cells = result.listCells();
  Assert.assertEquals(1, cells.size());
  Assert.assertArrayEquals(value, CellUtil.cloneValue(cells.get(0)));
}
 
Author: dmmcerlean, Project: cloud-bigtable-client, Lines: 64, Source: TestImport.java
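Stripped of the HBase and MapReduce machinery, example 3 checks a round-trip invariant: write a cell, export the table to a file, import the file into a fresh table, and assert the cell survived unchanged. A plain-Java sketch of that invariant (the Map-backed "table" and the tab-separated file format here are illustrative stand-ins; the real Export writes HBase Results to SequenceFiles):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashMap;
import java.util.Map;

// Toy export/import round trip: a "table" is a row-key -> value map,
// "export" writes it as key<TAB>value lines, "import" reads it back.
class RoundTrip {
  static void exportTable(Map<String, String> table, Path file) throws IOException {
    StringBuilder sb = new StringBuilder();
    for (Map.Entry<String, String> e : table.entrySet()) {
      sb.append(e.getKey()).append('\t').append(e.getValue()).append('\n');
    }
    Files.writeString(file, sb.toString());
  }

  static Map<String, String> importTable(Path file) throws IOException {
    Map<String, String> table = new LinkedHashMap<>();
    for (String line : Files.readAllLines(file)) {
      if (line.isEmpty()) continue;
      String[] kv = line.split("\t", 2);
      table.put(kv[0], kv[1]);
    }
    return table;
  }
}
```

The test above follows the same arc: the assertions before the export guard against a vacuous pass (nothing to export), and the assertions after the import confirm the backup actually carried the data into the new table.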


Note: the org.apache.hadoop.hbase.mapreduce.Export class examples in this article were compiled by 纯净天空 from open-source code and documentation hosted on GitHub, MSDocs, and similar platforms. The snippets were selected from open-source projects contributed by various developers; copyright in the source code remains with the original authors. Please consult each project's License before redistributing or using the code, and do not reproduce this article without permission.