

Java GenericOptionsParser.printGenericCommandUsage Method Code Examples

This article collects typical usage examples of the Java method org.apache.hadoop.util.GenericOptionsParser.printGenericCommandUsage. If you are wondering what this method does or how to call it, the curated examples below should help. You can also explore other usage examples of org.apache.hadoop.util.GenericOptionsParser.


Eight code examples of GenericOptionsParser.printGenericCommandUsage are shown below, ordered by popularity by default.
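All eight examples call printGenericCommandUsage to append the shared generic-option help to their tool-specific usage text. For context, those generic options are consumed by GenericOptionsParser before any tool-specific arguments. A typical invocation might look like the following sketch; the jar name, class name, and paths are hypothetical:

```shell
# Generic options (parsed by GenericOptionsParser) must come before
# tool-specific arguments. -D sets a configuration property and
# -files ships files to the distributed cache.
hadoop jar wordcount.jar WordCount \
  -D mapreduce.job.reduces=4 \
  -files cache.txt \
  /user/alice/input /user/alice/output
```

Running such a tool with no arguments typically triggers the printUsage paths shown below, which document both the tool's own flags and these generic options.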

Example 1: printUsage

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
void printUsage() {
  // The CLI package should do this for us, but I can't figure out how
  // to make it print something reasonable.
  System.out.println("bin/hadoop pipes");
  System.out.println("  [-input <path>] // Input directory");
  System.out.println("  [-output <path>] // Output directory");
  System.out.println("  [-jar <jar file>] // jar filename");
  System.out.println("  [-inputformat <class>] // InputFormat class");
  System.out.println("  [-map <class>] // Java Map class");
  System.out.println("  [-partitioner <class>] // Java Partitioner");
  System.out.println("  [-reduce <class>] // Java Reduce class");
  System.out.println("  [-writer <class>] // Java RecordWriter");
  System.out.println("  [-program <executable>] // executable URI");
  System.out.println("  [-reduces <num>] // number of reduces");
  System.out.println("  [-lazyOutput <true/false>] // createOutputLazily");
  System.out.println();
  GenericOptionsParser.printGenericCommandUsage(System.out);
}
 
Developer: naver, Project: hadoop, Lines: 19, Source: Submitter.java

Example 2: run

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
public int run(String[] args) throws Exception {
  if (args.length != 0) {
    System.err.println(format("%s [genericOptions]", NAME));
    System.err.println("  Runs ImportTsv integration tests against a distributed cluster.");
    System.err.println();
    GenericOptionsParser.printGenericCommandUsage(System.err);
    return 1;
  }

  // adding more test methods? Don't forget to add them here... or consider doing what
  // IntegrationTestsDriver does.
  provisionCluster();
  testGenerateAndLoad();
  releaseCluster();

  return 0;
}
 
Developer: fengchen8086, Project: ditb, Lines: 18, Source: IntegrationTestImportTsv.java

Example 3: printUsage

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
protected static int printUsage() {
  System.err.println(
      "Usage: [-m <maps>] number of mappers (default: " + NUM_MAPS_DEFAULT +
          ")\n" +
      "     [-v] timeline service version\n" +
      "     [-mtype <mapper type in integer>]\n" +
      "          1. simple entity write mapper\n" +
      "          2. jobhistory files replay mapper\n" +
      "     [-s <(KBs)test>] number of KB per put (mtype=1, default: " +
           SimpleEntityWriterV1.KBS_SENT_DEFAULT + " KB)\n" +
      "     [-t] package sending iterations per mapper (mtype=1, default: " +
           SimpleEntityWriterV1.TEST_TIMES_DEFAULT + ")\n" +
      "     [-d <path>] root path of job history files (mtype=2)\n" +
      "     [-r <replay mode>] (mtype=2)\n" +
      "          1. write all entities for a job in one put (default)\n" +
      "          2. write one entity at a time\n");
  GenericOptionsParser.printGenericCommandUsage(System.err);
  return -1;
}
 
Developer: aliyun-beta, Project: aliyun-oss-hadoop-fs, Lines: 20, Source: TimelineServicePerformance.java

Example 4: printUsage

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
void printUsage() {
  // The CLI package should do this for us, but I can't figure out how
  // to make it print something reasonable.
  System.out.println("Usage: pipes ");
  System.out.println("  [-input <path>] // Input directory");
  System.out.println("  [-output <path>] // Output directory");
  System.out.println("  [-jar <jar file>] // jar filename");
  System.out.println("  [-inputformat <class>] // InputFormat class");
  System.out.println("  [-map <class>] // Java Map class");
  System.out.println("  [-partitioner <class>] // Java Partitioner");
  System.out.println("  [-reduce <class>] // Java Reduce class");
  System.out.println("  [-writer <class>] // Java RecordWriter");
  System.out.println("  [-program <executable>] // executable URI");
  System.out.println("  [-reduces <num>] // number of reduces");
  System.out.println("  [-lazyOutput <true/false>] // createOutputLazily");
  System.out.println();
  GenericOptionsParser.printGenericCommandUsage(System.out);
}
 
Developer: aliyun-beta, Project: aliyun-oss-hadoop-fs, Lines: 19, Source: Submitter.java

Example 5: printUsage

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
private static void printUsage(PrintStream err) {
  err.println("fetchdt retrieves delegation tokens from the NameNode");
  err.println();
  err.println("fetchdt <opts> <token file>");
  err.println("Options:");
  err.println("  --webservice <url>  Url to contact NN on (starts with " +
          "http:// or https://)");
  err.println("  --renewer <name>    Name of the delegation token renewer");
  err.println("  --cancel            Cancel the delegation token");
  err.println("  --renew             Renew the delegation token.  " +
          "Delegation " + "token must have been fetched using the --renewer" +
          " <name> option.");
  err.println("  --print             Print the delegation token");
  err.println();
  GenericOptionsParser.printGenericCommandUsage(err);
  ExitUtil.terminate(1);
}
 
Developer: aliyun-beta, Project: aliyun-oss-hadoop-fs, Lines: 18, Source: DelegationTokenFetcher.java

Example 6: printUsage

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
protected static int printUsage() {
  System.err.println(
  "Usage: [-m <maps>] [-r <reduces>]\n" +
  "       [-keepmap <percent>] [-keepred <percent>]\n" +
  "       [-indir <path>] [-outdir <path>]\n" +
  "       [-inFormat[Indirect] <InputFormat>] [-outFormat <OutputFormat>]\n" +
  "       [-outKey <WritableComparable>] [-outValue <Writable>]\n");
  GenericOptionsParser.printGenericCommandUsage(System.err);
  return -1;
}
 
Developer: naver, Project: hadoop, Lines: 11, Source: GenericMRLoadGenerator.java

Example 7: printUsage

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
private static void printUsage(PrintStream err) {
  err.println("fetchdt retrieves delegation tokens from the NameNode");
  err.println();
  err.println("fetchdt <opts> <token file>");
  err.println("Options:");
  err.println("  --webservice <url>  Url to contact NN on");
  err.println("  --renewer <name>    Name of the delegation token renewer");
  err.println("  --cancel            Cancel the delegation token");
  err.println("  --renew             Renew the delegation token.  Delegation " 
  		+ "token must have been fetched using the --renewer <name> option.");
  err.println("  --print             Print the delegation token");
  err.println();
  GenericOptionsParser.printGenericCommandUsage(err);
  ExitUtil.terminate(1);    
}
 
Developer: naver, Project: hadoop, Lines: 16, Source: DelegationTokenFetcher.java

Example 8: createValueAggregatorJob

import org.apache.hadoop.util.GenericOptionsParser; // import required for this method
/**
 * Create an Aggregate based map/reduce job.
 * 
 * @param conf the configuration for the job
 * @param args the arguments used for job creation; generic Hadoop
 * arguments are accepted
 * @return a Job object ready for submission.
 * 
 * @throws IOException if job setup fails
 * @see GenericOptionsParser
 */
public static Job createValueAggregatorJob(Configuration conf, String args[])
    throws IOException {

  GenericOptionsParser genericParser 
    = new GenericOptionsParser(conf, args);
  args = genericParser.getRemainingArgs();
  
  if (args.length < 2) {
    System.out.println("usage: inputDirs outDir "
        + "[numOfReducer [textinputformat|seq [specfile [jobName]]]]");
    GenericOptionsParser.printGenericCommandUsage(System.out);
    System.exit(2);
  }
  String inputDir = args[0];
  String outputDir = args[1];
  int numOfReducers = 1;
  if (args.length > 2) {
    numOfReducers = Integer.parseInt(args[2]);
  }

  Class<? extends InputFormat> theInputFormat = null;
  if (args.length > 3 && 
      args[3].compareToIgnoreCase("textinputformat") == 0) {
    theInputFormat = TextInputFormat.class;
  } else {
    theInputFormat = SequenceFileInputFormat.class;
  }

  Path specFile = null;

  if (args.length > 4) {
    specFile = new Path(args[4]);
  }

  String jobName = "";
  
  if (args.length > 5) {
    jobName = args[5];
  }

  if (specFile != null) {
    conf.addResource(specFile);
  }
  String userJarFile = conf.get(ValueAggregatorJobBase.USER_JAR);
  if (userJarFile != null) {
    conf.set(MRJobConfig.JAR, userJarFile);
  }

  Job theJob = Job.getInstance(conf);
  if (userJarFile == null) {
    theJob.setJarByClass(ValueAggregator.class);
  } 
  theJob.setJobName("ValueAggregatorJob: " + jobName);

  FileInputFormat.addInputPaths(theJob, inputDir);

  theJob.setInputFormatClass(theInputFormat);
  
  theJob.setMapperClass(ValueAggregatorMapper.class);
  FileOutputFormat.setOutputPath(theJob, new Path(outputDir));
  theJob.setOutputFormatClass(TextOutputFormat.class);
  theJob.setMapOutputKeyClass(Text.class);
  theJob.setMapOutputValueClass(Text.class);
  theJob.setOutputKeyClass(Text.class);
  theJob.setOutputValueClass(Text.class);
  theJob.setReducerClass(ValueAggregatorReducer.class);
  theJob.setCombinerClass(ValueAggregatorCombiner.class);
  theJob.setNumReduceTasks(numOfReducers);
  return theJob;
}
 
Developer: aliyun-beta, Project: aliyun-oss-hadoop-fs, Lines: 82, Source: ValueAggregatorJob.java


Note: The org.apache.hadoop.util.GenericOptionsParser.printGenericCommandUsage examples in this article were compiled from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from contributed open-source projects, and copyright remains with the original authors. Consult each project's license before distributing or using the code; do not reproduce without permission.