

Java UTF8.toString Method Code Examples

This article collects typical usage examples of the Java method org.apache.hadoop.io.UTF8.toString. If you are wondering what UTF8.toString does, how to call it, or what it looks like in practice, the curated code samples below may help. You can also explore further usage examples of org.apache.hadoop.io.UTF8, the class this method belongs to.


Four code examples of the UTF8.toString method are shown below, sorted by popularity by default.
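Before the examples, here is a minimal sketch of the method in isolation (my own illustration, assuming hadoop-common is on the classpath): UTF8.toString() simply decodes the bytes held by a UTF8 object back into a java.lang.String. Note that org.apache.hadoop.io.UTF8 has long been deprecated in favor of org.apache.hadoop.io.Text.

import org.apache.hadoop.io.UTF8;

public class UTF8ToStringDemo {
  @SuppressWarnings("deprecation")
  public static void main(String[] args) {
    // Wrap a plain string in a UTF8 Writable, then decode it back.
    UTF8 key = new UTF8("part-00000");
    String name = key.toString();   // "part-00000"
    System.out.println(name);
  }
}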

Example 1: map

import org.apache.hadoop.io.UTF8; // import the package/class this method depends on
/**
 * Map file name and offset into statistical data.
 * <p>
 * The map task is to get the 
 * <tt>key</tt>, which contains the file name, and the 
 * <tt>value</tt>, which is the offset within the file.
 * 
 * The parameters are passed to the abstract method 
 * {@link #doIO(Reporter,String,long)}, which performs the I/O operation 
 * (usually reading or writing data), and then 
 * {@link #collectStats(OutputCollector,String,long,Object)} 
 * is called to prepare stat data for a subsequent reducer.
 */
public void map(UTF8 key, 
                LongWritable value,
                OutputCollector<UTF8, UTF8> output, 
                Reporter reporter) throws IOException {
  String name = key.toString();
  long longValue = value.get();
  
  reporter.setStatus("starting " + name + " ::host = " + hostName);
  
  long tStart = System.currentTimeMillis();
  Object statValue = doIO(reporter, name, longValue);
  long tEnd = System.currentTimeMillis();
  long execTime = tEnd - tStart;
  collectStats(output, name, execTime, statValue);
  
  reporter.setStatus("finished " + name + " ::host = " + hostName);
}
 
Developer ID: koichi626; Project: hadoop-gpu; Lines of code: 31; Source file: IOMapperBase.java
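IOMapperBase leaves doIO and collectStats abstract. The sketch below is my own (not taken from the hadoop-gpu project, and the key names are hypothetical); it shows how a subclass might fill in the two hooks so that the emitted "l:"/"f:" keys match the reducer convention seen in Example 3.

import java.io.IOException;
import org.apache.hadoop.io.UTF8;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class StatHooksSketch {

  /** Hypothetical I/O hook: pretend to process the named file and report bytes handled. */
  Object doIO(Reporter reporter, String name, long offset) throws IOException {
    reporter.setStatus("processing " + name + " at offset " + offset);
    return Long.valueOf(1024L * 1024L);   // placeholder byte count
  }

  /** Emit one record per statistic; the key prefixes follow Example 3's reducer. */
  void collectStats(OutputCollector<UTF8, UTF8> output, String name,
                    long execTime, Object statValue) throws IOException {
    long size = ((Long) statValue).longValue();
    output.collect(new UTF8("l:size"), new UTF8(String.valueOf(size)));
    output.collect(new UTF8("l:time"), new UTF8(String.valueOf(execTime)));
    // Guard against a zero execution time in this illustrative rate calculation.
    output.collect(new UTF8("f:rate"),
                   new UTF8(String.valueOf(size * 1000.0f / Math.max(execTime, 1))));
  }
}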

Example 2: map

import org.apache.hadoop.io.UTF8; // import the package/class this method depends on
public void map(UTF8 key, LongWritable value,
                OutputCollector<UTF8, LongWritable> collector,
                Reporter reporter)
  throws IOException {
  
  String name = key.toString();
  long size = value.get();
  long seed = Long.parseLong(name);

  random.setSeed(seed);
  reporter.setStatus("creating " + name);

  // write to temp file initially to permit parallel execution
  Path tempFile = new Path(DATA_DIR, name+suffix);
  OutputStream out = fs.create(tempFile);

  long written = 0;
  try {
    while (written < size) {
      if (fastCheck) {
        Arrays.fill(buffer, (byte)random.nextInt(Byte.MAX_VALUE));
      } else {
        random.nextBytes(buffer);
      }
      long remains = size - written;
      int length = (remains<=buffer.length) ? (int)remains : buffer.length;
      out.write(buffer, 0, length);
      written += length;
      reporter.setStatus("writing "+name+"@"+written+"/"+size);
    }
  } finally {
    out.close();
  }
  // rename to final location
  fs.rename(tempFile, new Path(DATA_DIR, name));

  collector.collect(new UTF8("bytes"), new LongWritable(written));

  reporter.setStatus("wrote " + name);
}
 
Developer ID: Nextzero; Project: hadoop-2.6.0-cdh5.4.3; Lines of code: 41; Source file: TestFileSystem.java
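The write-to-a-temporary-path-then-rename idiom used above is worth isolating. The following is a minimal standalone sketch of my own (paths are made up for the example), run against the local file system:

import java.io.OutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TempThenRenameDemo {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.getLocal(new Configuration());
    Path dataDir = new Path("/tmp/fs_write_demo");     // illustrative directory
    Path tempFile = new Path(dataDir, "part-0.tmp");

    // Write under a temporary name so concurrent readers never see a half-written file.
    OutputStream out = fs.create(tempFile);
    try {
      out.write(new byte[]{1, 2, 3});
    } finally {
      out.close();
    }

    // Publish the finished file by renaming it to its final name.
    fs.rename(tempFile, new Path(dataDir, "part-0"));
  }
}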

Example 3: reduce

import org.apache.hadoop.io.UTF8; // import the package/class this method depends on
public void reduce(UTF8 key, 
                   Iterator<UTF8> values,
                   OutputCollector<UTF8, UTF8> output, 
                   Reporter reporter
                   ) throws IOException {
  String field = key.toString();

  reporter.setStatus("starting " + field + " ::host = " + hostName);

  // concatenate strings
  if (field.startsWith("s:")) {
    String sSum = "";
    while (values.hasNext())
      sSum += values.next().toString() + ";";
    output.collect(key, new UTF8(sSum));
    reporter.setStatus("finished " + field + " ::host = " + hostName);
    return;
  }
  // sum float values
  if (field.startsWith("f:")) {
    float fSum = 0;
    while (values.hasNext())
      fSum += Float.parseFloat(values.next().toString());
    output.collect(key, new UTF8(String.valueOf(fSum)));
    reporter.setStatus("finished " + field + " ::host = " + hostName);
    return;
  }
  // sum long values
  if (field.startsWith("l:")) {
    long lSum = 0;
    while (values.hasNext()) {
      lSum += Long.parseLong(values.next().toString());
    }
    output.collect(key, new UTF8(String.valueOf(lSum)));
  }
  reporter.setStatus("finished " + field + " ::host = " + hostName);
}
 
Developer ID: koichi626; Project: hadoop-gpu; Lines of code: 38; Source file: AccumulatingReducer.java
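To make the three branches above concrete, the following self-contained snippet (my own illustration, using plain Java instead of Hadoop types) reproduces the aggregation semantics for the "s:", "f:", and "l:" key prefixes:

import java.util.Arrays;
import java.util.Iterator;

public class AccumulatorDemo {
  public static void main(String[] args) {
    // "s:" keys: values are concatenated with ';' separators.
    Iterator<String> strings = Arrays.asList("foo", "bar").iterator();
    StringBuilder sSum = new StringBuilder();
    while (strings.hasNext()) sSum.append(strings.next()).append(';');
    System.out.println("s:example -> " + sSum);          // foo;bar;

    // "f:" keys: values are parsed as floats and summed.
    Iterator<String> floats = Arrays.asList("1.5", "2.5").iterator();
    float fSum = 0;
    while (floats.hasNext()) fSum += Float.parseFloat(floats.next());
    System.out.println("f:example -> " + fSum);          // 4.0

    // "l:" keys: values are parsed as longs and summed.
    Iterator<String> longs = Arrays.asList("100", "200").iterator();
    long lSum = 0;
    while (longs.hasNext()) lSum += Long.parseLong(longs.next());
    System.out.println("l:example -> " + lSum);          // 300
  }
}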

Example 4: readString

import org.apache.hadoop.io.UTF8; // import the package/class this method depends on
@SuppressWarnings("deprecation")
public static String readString(DataInputStream in) throws IOException {
  UTF8 ustr = TL_DATA.get().U_STR;
  ustr.readFields(in);
  return ustr.toString();
}
 
Developer ID: rhli; Project: hadoop-EAR; Lines of code: 7; Source file: FSImageSerialization.java
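The notable detail here is the reuse of a per-thread UTF8 instance, so readString does not allocate a new object on every call. The sketch below is my own (the real TL_DATA holder in FSImageSerialization may be laid out differently); it reproduces that pattern and round-trips a string through it:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import org.apache.hadoop.io.UTF8;

@SuppressWarnings("deprecation")
public class UTF8ReadDemo {
  // One reusable UTF8 per thread, so readString avoids allocating on every call.
  private static final ThreadLocal<UTF8> U_STR = ThreadLocal.withInitial(UTF8::new);

  public static String readString(DataInputStream in) throws IOException {
    UTF8 ustr = U_STR.get();
    ustr.readFields(in);          // read the length-prefixed UTF-8 bytes
    return ustr.toString();       // decode into a java.lang.String
  }

  public static void main(String[] args) throws IOException {
    // Round-trip: serialize a UTF8 value, then read it back through readString.
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    new UTF8("/user/example/file").write(new DataOutputStream(bytes));
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes.toByteArray()));
    System.out.println(readString(in));   // prints /user/example/file
  }
}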


Note: The org.apache.hadoop.io.UTF8.toString examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors, and distribution or use should follow the corresponding project's License. Please do not reproduce without permission.