

Java MapTaskStatistics Class Code Examples

This article collects typical usage examples of the Java class org.apache.hadoop.vaidya.statistics.job.MapTaskStatistics. If you are wondering what MapTaskStatistics does or how to use it, the selected examples below should help.


The MapTaskStatistics class belongs to the org.apache.hadoop.vaidya.statistics.job package. Two code examples of the class are shown below.

Example 1: evaluate

import org.apache.hadoop.vaidya.statistics.job.MapTaskStatistics; // import the required package/class
@Override
public double evaluate(JobStatistics job) {

  /*
   * Set this._job
   */
  this._job = job;
    
  /*
   * Read the normalization factor
   */
  double normF = getInputElementDoubleValue("NormalizationFactor", 3.0);
  
  /*
   * Get the map task list sorted by MapTaskKeys.FILE_BYTES_WRITTEN
   */
  List<MapTaskStatistics> smTaskList = job.getMapTaskList(MapTaskKeys.FILE_BYTES_WRITTEN, KeyDataType.LONG);
  int size = smTaskList.size();
  long numLocalBytesWrittenByMaps = 0;
  for (int i=0; i<size; i++) {
    numLocalBytesWrittenByMaps += smTaskList.get(i).getLongValue(MapTaskKeys.FILE_BYTES_WRITTEN);
  }
  this._numLocalBytesWrittenByMaps = numLocalBytesWrittenByMaps;
  
  /*
   * Map-only job vs. map-reduce job:
   * for a map-reduce job, MAP_OUTPUT_BYTES are normally written by maps to
   * local disk, so they are subtracted from the local bytes written by maps.
   * Cast to double so the ratio is not truncated by integer division.
   */
  if (job.getLongValue(JobKeys.TOTAL_REDUCES) > 0) {
    this._impact = (double)(this._numLocalBytesWrittenByMaps - job.getLongValue(JobKeys.MAP_OUTPUT_BYTES))/job.getLongValue(JobKeys.MAP_OUTPUT_BYTES);
  } else {
    this._impact = (double)this._numLocalBytesWrittenByMaps/job.getLongValue(JobKeys.MAP_OUTPUT_BYTES);
  }
  
  if (this._impact > normF) {
    this._impact = 1.0;
  } else {
    this._impact = this._impact/normF;
  }
  
  return this._impact;
  
}
 
Author: Nextzero | Project: hadoop-2.6.0-cdh5.4.3 | Lines: 45 | Source: MapSideDiskSpill.java
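The impact normalization used above can be sketched as a standalone method. This is a simplified model of the rule's arithmetic only; the class and method names are illustrative and not part of the Hadoop Vaidya API:

```java
// Simplified sketch of the impact normalization in MapSideDiskSpill.
// Names here are illustrative, not the Vaidya API.
public class SpillImpact {

    /**
     * Impact is the ratio of local bytes written by maps to MAP_OUTPUT_BYTES.
     * For a map-reduce job (totalReduces > 0), MAP_OUTPUT_BYTES are expected
     * on local disk as part of the shuffle, so they are subtracted first.
     * The ratio is scaled by the normalization factor and clamped to 1.0.
     */
    public static double evaluate(long localBytesWrittenByMaps,
                                  long mapOutputBytes,
                                  long totalReduces,
                                  double normF) {
        double impact;
        if (totalReduces > 0) {
            impact = (double) (localBytesWrittenByMaps - mapOutputBytes) / mapOutputBytes;
        } else {
            impact = (double) localBytesWrittenByMaps / mapOutputBytes;
        }
        return impact > normF ? 1.0 : impact / normF;
    }

    public static void main(String[] args) {
        // Map-reduce job that spilled 3x its map output: (3000-1000)/1000 = 2.0,
        // below normF = 3.0, so the reported impact is 2.0/3.0.
        System.out.println(SpillImpact.evaluate(3_000L, 1_000L, 4, 3.0));
        // Heavy spill: ratio 9.0 exceeds normF, so impact is clamped to 1.0.
        System.out.println(SpillImpact.evaluate(10_000L, 1_000L, 4, 3.0));
    }
}
```

Note the explicit `(double)` cast: dividing two `long` values in Java performs integer division, which would silently truncate the ratio to 0, 1, 2, ... before normalization.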

Example 2: evaluate

import org.apache.hadoop.vaidya.statistics.job.MapTaskStatistics; // import the required package/class
@Override
public double evaluate(JobStatistics job) {

  /*
   * Set this._job
   */
  this._job = job;
    
  /*
   * Read the normalization factor
   */
  double normF = getInputElementDoubleValue("NormalizationFactor", 3.0);
  
  /*
   * Get the map task list sorted by MapTaskKeys.LOCAL_BYTES_WRITTEN
   */
  List<MapTaskStatistics> srTaskList = job.getMapTaskList(MapTaskKeys.LOCAL_BYTES_WRITTEN, KeyDataType.LONG);
  int size = srTaskList.size();
  long numLocalBytesWrittenByMaps = 0;
  for (int i=0; i<size; i++) {
    numLocalBytesWrittenByMaps += srTaskList.get(i).getLongValue(MapTaskKeys.LOCAL_BYTES_WRITTEN);
  }
  this._numLocalBytesWrittenByMaps = numLocalBytesWrittenByMaps;
  
  /*
   * Map-only job vs. map-reduce job; cast to double so the ratio is not
   * truncated by integer division.
   */
  if (job.getLongValue(JobKeys.TOTAL_REDUCES) > 0) {
    this._impact = (double)(this._numLocalBytesWrittenByMaps - job.getLongValue(JobKeys.MAP_OUTPUT_BYTES))/job.getLongValue(JobKeys.MAP_OUTPUT_BYTES);
  } else {
    this._impact = (double)this._numLocalBytesWrittenByMaps/job.getLongValue(JobKeys.MAP_OUTPUT_BYTES);
  }
  
  if (this._impact > normF) {
    this._impact = 1.0;
  } else {
    this._impact = this._impact/normF;
  }
  
  return this._impact;
  
}
 
Author: thisisvoa | Project: hadoop-0.20 | Lines: 43 | Source: MapSideDiskSpill.java


Note: The org.apache.hadoop.vaidya.statistics.job.MapTaskStatistics examples in this article were collected from open-source projects hosted on GitHub and similar platforms. Copyright in the code snippets remains with their original authors; consult each project's license before redistribution or reuse.