

Java ConfEntryInfo Class Code Examples

This article collects typical usage examples of the Java class org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfEntryInfo. If you have been wondering what the ConfEntryInfo class is for, how to use it, or what working examples look like, the selected snippets below should help.


The ConfEntryInfo class belongs to the org.apache.hadoop.mapreduce.v2.app.webapp.dao package. Three code examples of the class are shown below, ordered by popularity.
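Before the examples, here is a minimal sketch of the class itself: ConfEntryInfo is a small data-access object that carries a configuration property's name, value, and the chain of sources it came from, exposed through getName(), getValue(), and getSource() (all three appear in the examples below). The (key, value, source) constructor used here is an assumption based on how the class is typically populated; verify it against the ConfEntryInfo source in your Hadoop release.

import org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfEntryInfo;

public class ConfEntryInfoDemo {
  public static void main(String[] args) {
    // Assumed constructor signature (key, value, source chain);
    // check ConfEntryInfo in your Hadoop version before relying on it.
    ConfEntryInfo entry = new ConfEntryInfo(
        "mapreduce.job.reduces",
        "2",
        new String[] {"mapred-default.xml", "job.xml"});

    // Getters used by Example 1 below.
    System.out.println(entry.getName());          // mapreduce.job.reduces
    System.out.println(entry.getValue());         // 2
    System.out.println(entry.getSource().length); // 2
  }
}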

Example 1: render

import org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfEntryInfo; // import the required package/class
@Override protected void render(Block html) {
  String jid = $(JOB_ID);
  if (jid.isEmpty()) {
    html.
      p()._("Sorry, can't do anything without a JobID.")._();
    return;
  }
  JobId jobID = MRApps.toJobID(jid);
  Job job = appContext.getJob(jobID);
  if (job == null) {
    html.
      p()._("Sorry, ", jid, " not found.")._();
    return;
  }
  Path confPath = job.getConfFile();
  try {
    ConfInfo info = new ConfInfo(job);

    html.div().h3(confPath.toString())._();
    TBODY<TABLE<Hamlet>> tbody = html.
      // Tasks table
    table("#conf").
      thead().
        tr().
          th(_TH, "key").
          th(_TH, "value").
          th(_TH, "source chain").
        _().
      _().
    tbody();
    for (ConfEntryInfo entry : info.getProperties()) {
      StringBuffer buffer = new StringBuffer();
      String[] sources = entry.getSource();
      //Skip the last entry, because it is always the same HDFS file, and
      // output them in reverse order so most recent is output first
      boolean first = true;
      for(int i = (sources.length  - 2); i >= 0; i--) {
        if(!first) {
          // \u2B05 is an arrow <--
          buffer.append(" \u2B05 ");
        }
        first = false;
        buffer.append(sources[i]);
      }
      tbody.
        tr().
          td(entry.getName()).
          td(entry.getValue()).
          td(buffer.toString()).
        _();
    }
    tbody._().
    tfoot().
      tr().
        th().input("search_init").$type(InputType.text).$name("key").$value("key")._()._().
        th().input("search_init").$type(InputType.text).$name("value").$value("value")._()._().
        th().input("search_init").$type(InputType.text).$name("source chain").$value("source chain")._()._().
        _().
      _().
    _();
  } catch(IOException e) {
    LOG.error("Error while reading "+confPath, e);
    html.p()._("Sorry got an error while reading conf file. ",confPath);
  }
}
 
Developer ID: naver | Project: hadoop | Lines: 66 | Source file: ConfBlock.java
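The trickiest part of Example 1 is the loop that builds the "source chain" column: it skips the last element of getSource() (which always refers to the job's own conf file on HDFS) and walks the remaining entries in reverse, so the most recently applied source is printed first. Below is a standalone sketch of that logic; the class and helper names are ours, not part of Hadoop.

public class SourceChainDemo {
  // Illustrative helper (not part of Hadoop) mirroring the loop in Example 1.
  static String formatSourceChain(String[] sources) {
    StringBuilder buffer = new StringBuilder();
    boolean first = true;
    // Skip the last entry (always the job's own conf file) and iterate in
    // reverse so the most recently applied source comes first.
    for (int i = sources.length - 2; i >= 0; i--) {
      if (!first) {
        buffer.append(" \u2B05 "); // same leftwards-arrow separator as Example 1
      }
      first = false;
      buffer.append(sources[i]);
    }
    return buffer.toString();
  }

  public static void main(String[] args) {
    String[] sources = {"mapred-default.xml", "mapred-site.xml", "job.xml"};
    // Prints: mapred-site.xml ⬅ mapred-default.xml
    System.out.println(formatSourceChain(sources));
  }
}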

Example 2: getAcls

import org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfEntryInfo; // import the required package/class
public ArrayList<ConfEntryInfo> getAcls() {
  return acls;
}
 
Developer ID: naver | Project: hadoop | Lines: 4 | Source file: JobInfo.java
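Example 2 only shows the getter, so for context here is a hedged sketch of how such a list of ConfEntryInfo objects might be populated from a job's ACL map, in the style of the MR app master's JobInfo DAO. The class name AclListSketch is ours, and the two-argument ConfEntryInfo constructor is an assumption; JobACL.getAclName() and AccessControlList.getAclString() are standard Hadoop APIs.

import java.util.ArrayList;
import java.util.Map;

import org.apache.hadoop.mapreduce.JobACL;
import org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfEntryInfo;
import org.apache.hadoop.security.authorize.AccessControlList;

public class AclListSketch {
  private final ArrayList<ConfEntryInfo> acls = new ArrayList<ConfEntryInfo>();

  // Illustrative: convert a job's ACL map into the list that getAcls() exposes.
  public AclListSketch(Map<JobACL, AccessControlList> jobAcls) {
    for (Map.Entry<JobACL, AccessControlList> e : jobAcls.entrySet()) {
      // Assumed (key, value) constructor; verify against your Hadoop version.
      acls.add(new ConfEntryInfo(e.getKey().getAclName(),
                                 e.getValue().getAclString()));
    }
  }

  public ArrayList<ConfEntryInfo> getAcls() {
    return acls;
  }
}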

Example 3: render (from a different project; it differs from Example 1 only in using the plain-ASCII " <- " separator)

import org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfEntryInfo; // import the required package/class
@Override protected void render(Block html) {
  String jid = $(JOB_ID);
  if (jid.isEmpty()) {
    html.
      p()._("Sorry, can't do anything without a JobID.")._();
    return;
  }
  JobId jobID = MRApps.toJobID(jid);
  Job job = appContext.getJob(jobID);
  if (job == null) {
    html.
      p()._("Sorry, ", jid, " not found.")._();
    return;
  }
  Path confPath = job.getConfFile();
  try {
    ConfInfo info = new ConfInfo(job);

    html.div().h3(confPath.toString())._();
    TBODY<TABLE<Hamlet>> tbody = html.
      // Tasks table
    table("#conf").
      thead().
        tr().
          th(_TH, "key").
          th(_TH, "value").
          th(_TH, "source chain").
        _().
      _().
    tbody();
    for (ConfEntryInfo entry : info.getProperties()) {
      StringBuffer buffer = new StringBuffer();
      String[] sources = entry.getSource();
      //Skip the last entry, because it is always the same HDFS file, and
      // output them in reverse order so most recent is output first
      boolean first = true;
      for(int i = (sources.length  - 2); i >= 0; i--) {
        if(!first) {
          buffer.append(" <- ");
        }
        first = false;
        buffer.append(sources[i]);
      }
      tbody.
        tr().
          td(entry.getName()).
          td(entry.getValue()).
          td(buffer.toString()).
        _();
    }
    tbody._().
    tfoot().
      tr().
        th().input("search_init").$type(InputType.text).$name("key").$value("key")._()._().
        th().input("search_init").$type(InputType.text).$name("value").$value("value")._()._().
        th().input("search_init").$type(InputType.text).$name("source chain").$value("source chain")._()._().
        _().
      _().
    _();
  } catch(IOException e) {
    LOG.error("Error while reading "+confPath, e);
    html.p()._("Sorry got an error while reading conf file. ",confPath);
  }
}
 
Developer ID: aliyun-beta | Project: aliyun-oss-hadoop-fs | Lines: 65 | Source file: ConfBlock.java


Note: The org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfEntryInfo examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from projects contributed by open-source developers; copyright remains with the original authors, and distribution and use must follow the corresponding project licenses. Do not reproduce without permission.