

Java ExprNodeDesc.getExprString Method Code Examples

This article collects typical code examples of the org.apache.hadoop.hive.ql.plan.ExprNodeDesc.getExprString method in Java. If you are unsure what ExprNodeDesc.getExprString does, how to call it, or what real-world usage looks like, the curated examples below may help. You can also explore further usage examples of the enclosing class, org.apache.hadoop.hive.ql.plan.ExprNodeDesc.


Four code examples of the ExprNodeDesc.getExprString method are shown below, ordered by popularity by default.
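Before walking through the examples, here is a minimal, self-contained sketch of what getExprString actually returns. The class name GetExprStringDemo and the column/table names are invented for illustration; the Hive classes used (ExprNodeColumnDesc, ExprNodeConstantDesc, ExprNodeGenericFuncDesc, GenericUDFOPGreaterThan) are real, but the exact rendering may vary slightly between Hive versions.

import java.util.Arrays;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.plan.ExprNodeColumnDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPGreaterThan;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;

public class GetExprStringDemo {
  public static void main(String[] args) throws UDFArgumentException {
    // Column reference: an int column named "id" on table alias "t" (not a partition column).
    ExprNodeDesc column = new ExprNodeColumnDesc(TypeInfoFactory.intTypeInfo, "id", "t", false);
    // Integer literal 5.
    ExprNodeDesc constant = new ExprNodeConstantDesc(5);
    // Build the comparison "id > 5" as a generic-UDF expression node.
    ExprNodeDesc predicate = ExprNodeGenericFuncDesc.newInstance(
        new GenericUDFOPGreaterThan(), Arrays.asList(column, constant));
    // getExprString renders the expression tree as a readable string, e.g. "(id > 5)".
    System.out.println(predicate.getExprString());
  }
}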

Example 1: pushFilters

import org.apache.hadoop.hive.ql.plan.ExprNodeDesc; // import of the class this example depends on
private void pushFilters(final JobConf jobConf, final TableScanOperator tableScan) {

    final TableScanDesc scanDesc = tableScan.getConf();
    if (scanDesc == null) {
      LOG.debug("Not pushing filters because TableScanDesc is null");
      return;
    }

    // construct column name list for reference by filter push down
    Utilities.setColumnNameList(jobConf, tableScan);

    // push down filters
    final ExprNodeDesc filterExpr = scanDesc.getFilterExpr();
    if (filterExpr == null) {
      LOG.debug("Not pushing filters because FilterExpr is null");
      return;
    }

    final String filterText = filterExpr.getExprString();
    final String filterExprSerialized = Utilities.serializeExpression(filterExpr);
    jobConf.set(
            TableScanDesc.FILTER_TEXT_CONF_STR,
            filterText);
    jobConf.set(
            TableScanDesc.FILTER_EXPR_CONF_STR,
            filterExprSerialized);
  }
 
Developer: apache, Project: parquet-mr, Lines of code: 28, Source: Hive012Binding.java

Example 2: parseFilterPredicate

import org.apache.hadoop.hive.ql.plan.ExprNodeDesc; // import of the class this example depends on
/**
 * Look for a filter predicate pushed down by the StorageHandler. If a
 * filter was pushed down, the filter expression and the list of indexed
 * columns should be set in the JobConf properties. If either is not set, we
 * can't deal with the filter here so return null. If both are present in
 * the JobConf, translate the filter expression into a list of C*
 * IndexExpressions which we'll later use in queries. The filter expression
 * should translate exactly to IndexExpressions, as our
 * HiveStoragePredicateHandler implementation has already done this once. As
 * an additional check, if this is no longer the case and there is some
 * residual predicate after translation, throw an Exception.
 *
 * @param jobConf Job Configuration
 * @return C* IndexExpressions representing the pushed down filter, or null if
 * pushdown is not possible
 * @throws java.io.IOException if there are problems deserializing from the
 * JobConf
 */
private List<IndexExpression> parseFilterPredicate(JobConf jobConf) throws IOException {
    String filterExprSerialized = jobConf.get(TableScanDesc.FILTER_EXPR_CONF_STR);
    if (filterExprSerialized == null) {
        return null;
    }

    ExprNodeDesc filterExpr = Utilities.deserializeExpression(filterExprSerialized, jobConf);
    String encodedIndexedColumns = jobConf.get(AbstractCassandraSerDe.CASSANDRA_INDEXED_COLUMNS);
    Set<ColumnDef> indexedColumns = CqlPushdownPredicate.deserializeIndexedColumns(encodedIndexedColumns);
    if (indexedColumns.isEmpty()) {
        return null;
    }

    IndexPredicateAnalyzer analyzer = CqlPushdownPredicate.newIndexPredicateAnalyzer(indexedColumns);
    List<IndexSearchCondition> searchConditions = new ArrayList<IndexSearchCondition>();
    ExprNodeDesc residualPredicate = analyzer.analyzePredicate(filterExpr, searchConditions);

    // There should be no residual predicate since we already negotiated
    // that earlier in CqlStorageHandler.decomposePredicate.
    if (residualPredicate != null) {
        throw new RuntimeException("Unexpected residual predicate : " + residualPredicate.getExprString());
    }

    if (!searchConditions.isEmpty()) {
        return CqlPushdownPredicate.translateSearchConditions(searchConditions, indexedColumns);
    } else {
        throw new RuntimeException("At least one search condition expected in filter predicate");
    }
}
 
Developer: 2013Commons, Project: hive-cassandra, Lines of code: 48, Source: HiveCqlInputFormat.java
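The Javadoc above refers to a contract set up earlier in the storage handler's decomposePredicate: by the time parseFilterPredicate runs, the pushed expression should already translate fully into search conditions. As context, here is a hedged sketch, not code from the hive-cassandra project, of how a HiveStoragePredicateHandler typically splits a predicate using an IndexPredicateAnalyzer. It assumes the Hive 0.12-era API used throughout these examples (where DecomposedPredicate's fields are typed ExprNodeDesc) and assumes the analyzer is built elsewhere, for example by a helper such as newIndexPredicateAnalyzer.

import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.hive.ql.index.IndexPredicateAnalyzer;
import org.apache.hadoop.hive.ql.index.IndexSearchCondition;
import org.apache.hadoop.hive.ql.metadata.HiveStoragePredicateHandler.DecomposedPredicate;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

public class PredicateDecompositionSketch {

  /**
   * Splits the incoming predicate into a part that can be pushed down
   * (everything the analyzer recognized as an index search condition)
   * and a residual part that Hive must still evaluate itself.
   */
  public static DecomposedPredicate decompose(IndexPredicateAnalyzer analyzer,
                                              ExprNodeDesc predicate) {
    List<IndexSearchCondition> searchConditions = new ArrayList<IndexSearchCondition>();
    ExprNodeDesc residual = analyzer.analyzePredicate(predicate, searchConditions);
    if (searchConditions.isEmpty()) {
      return null; // nothing can be pushed down
    }
    DecomposedPredicate decomposed = new DecomposedPredicate();
    // Re-assemble the recognized conditions into the expression that is later
    // serialized into the JobConf and deserialized by parseFilterPredicate.
    decomposed.pushedPredicate = analyzer.translateSearchConditions(searchConditions);
    decomposed.residualPredicate = residual;
    return decomposed;
  }
}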

Example 3: parseFilterPredicate

import org.apache.hadoop.hive.ql.plan.ExprNodeDesc; // import of the class this example depends on
/**
 * Look for a filter predicate pushed down by the StorageHandler. If a
 * filter was pushed down, the filter expression and the list of indexed
 * columns should be set in the JobConf properties. If either is not set, we
 * can't deal with the filter here so return null. If both are present in
 * the JobConf, translate the filter expression into a list of C*
 * IndexExpressions which we'll later use in queries. The filter expression
 * should translate exactly to IndexExpressions, as our
 * HiveStoragePredicateHandler implementation has already done this once. As
 * an additional check, if this is no longer the case and there is some
 * residual predicate after translation, throw an Exception.
 *
 * @param jobConf Job Configuration
 * @return C* IndexExpressions representing the pushed down filter, or null if
 * pushdown is not possible
 * @throws IOException if there are problems deserializing from the JobConf
 */
private List<IndexExpression> parseFilterPredicate(JobConf jobConf) throws IOException {
    String filterExprSerialized = jobConf.get(TableScanDesc.FILTER_EXPR_CONF_STR);
    if (filterExprSerialized == null) {
        return null;
    }

    ExprNodeDesc filterExpr = Utilities.deserializeExpression(filterExprSerialized, jobConf);
    String encodedIndexedColumns = jobConf.get(AbstractCassandraSerDe.CASSANDRA_INDEXED_COLUMNS);
    Set<ColumnDef> indexedColumns = CassandraPushdownPredicate.deserializeIndexedColumns(encodedIndexedColumns);
    if (indexedColumns.isEmpty()) {
        return null;
    }

    IndexPredicateAnalyzer analyzer = CassandraPushdownPredicate.newIndexPredicateAnalyzer(indexedColumns);
    List<IndexSearchCondition> searchConditions = new ArrayList<IndexSearchCondition>();
    ExprNodeDesc residualPredicate = analyzer.analyzePredicate(filterExpr, searchConditions);

    // There should be no residual predicate since we already negotiated
    // that earlier in CassandraStorageHandler.decomposePredicate.
    if (residualPredicate != null) {
        throw new RuntimeException("Unexpected residual predicate : " + residualPredicate.getExprString());
    }

    if (!searchConditions.isEmpty()) {
        return CassandraPushdownPredicate.translateSearchConditions(searchConditions, indexedColumns);
    } else {
        throw new RuntimeException("At least one search condition expected in filter predicate");
    }
}
 
Developer: 2013Commons, Project: hive-cassandra, Lines of code: 47, Source: HiveCassandraStandardColumnInputFormat.java

Example 4: getSearchConditions

import org.apache.hadoop.hive.ql.plan.ExprNodeDesc; // import of the class this example depends on
/**
 *
 * @param conf JobConf
 * @return list of IndexSearchConditions from the filter expression.
 */
public List<IndexSearchCondition> getSearchConditions(JobConf conf) {
    List<IndexSearchCondition> sConditions = Lists.newArrayList();
    String filteredExprSerialized = conf.get(TableScanDesc.FILTER_EXPR_CONF_STR);
    if(filteredExprSerialized == null)
        return sConditions;
    ExprNodeDesc filterExpr = Utilities.deserializeExpression(filteredExprSerialized, conf);
    IndexPredicateAnalyzer analyzer = newAnalyzer(conf);
    ExprNodeDesc residual = analyzer.analyzePredicate(filterExpr, sConditions);
    if(residual != null)
        throw new RuntimeException("Unexpected residual predicate: " + residual.getExprString());
    return sConditions;
}
 
Developer: bfemiano, Project: accumulo-hive-storage-manager, Lines of code: 18, Source: AccumuloPredicateHandler.java
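As a follow-up to Example 4, the sketch below (not part of the accumulo-hive-storage-manager project; the class name SearchConditionPrinter is invented) shows one way the returned IndexSearchConditions could be inspected, again using getExprString to render each side of a pushed-down comparison.

import java.util.List;

import org.apache.hadoop.hive.ql.index.IndexSearchCondition;

public class SearchConditionPrinter {
  /** Prints each pushed-down condition as "column <comparison op> constant". */
  public static void print(List<IndexSearchCondition> conditions) {
    for (IndexSearchCondition condition : conditions) {
      // getColumnDesc()/getConstantDesc() return ExprNodeDesc subclasses, so
      // getExprString() gives a readable rendering of each side of the comparison.
      System.out.println(condition.getColumnDesc().getExprString()
          + " " + condition.getComparisonOp()           // fully-qualified UDF class name
          + " " + condition.getConstantDesc().getExprString());
    }
  }
}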


Note: The org.apache.hadoop.hive.ql.plan.ExprNodeDesc.getExprString examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are taken from open-source projects contributed by their authors; copyright belongs to the original authors, and any distribution or use should follow the corresponding project's license. Please do not reproduce without permission.