

Java HiveDecimal.bigDecimalValue Method Code Examples

This article collects typical usage examples of the Java method org.apache.hadoop.hive.common.type.HiveDecimal.bigDecimalValue. If you are wondering what HiveDecimal.bigDecimalValue does or how to use it, the curated examples below may help. You can also explore other usage examples of org.apache.hadoop.hive.common.type.HiveDecimal.


Three code examples of HiveDecimal.bigDecimalValue are shown below, sorted by popularity by default.

Example 1: convertDecimalTypes

import org.apache.hadoop.hive.common.type.HiveDecimal; // import the class this method depends on
private Object convertDecimalTypes(Object val, String javaColType) {
  HiveDecimal hd = (HiveDecimal) val;
  BigDecimal bd = hd.bigDecimalValue();

  if (javaColType.equals(BIG_DECIMAL_TYPE)) {
    return bd;
  } else if (javaColType.equals(STRING_TYPE)) {
    // toPlainString() never uses scientific notation; toString() may.
    String bdStr;
    if (bigDecimalFormatString) {
      bdStr = bd.toPlainString();
    } else {
      bdStr = bd.toString();
    }
    return bdStr;
  }
  return null;
}
 
Developer: aliyun | Project: aliyun-maxcompute-data-collectors | Lines: 18 | Source: SqoopHCatExportHelper.java
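The `bigDecimalFormatString` branch above matters because `BigDecimal.toString()` can switch to scientific notation for very small or very large values, while `toPlainString()` never does. A minimal, self-contained sketch of that distinction, using only plain `java.math.BigDecimal` (no Hive dependency); the class and method names here are illustrative, not from the original project:

```java
import java.math.BigDecimal;

public class DecimalFormatDemo {
    // Mirrors the branch in convertDecimalTypes: plain vs. canonical string form.
    static String format(BigDecimal bd, boolean plain) {
        return plain ? bd.toPlainString() : bd.toString();
    }

    public static void main(String[] args) {
        BigDecimal tiny = new BigDecimal("0.0000001"); // unscaled 1, scale 7
        // toString() uses scientific notation when the adjusted exponent < -6
        System.out.println(format(tiny, false)); // 1E-7
        System.out.println(format(tiny, true));  // 0.0000001
    }
}
```

This is why export tools often expose a flag like `bigDecimalFormatString`: downstream consumers that parse the text form may not accept exponent notation.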

Example 2: getPartitionValue

import org.apache.hadoop.hive.common.type.HiveDecimal; // import the class this method depends on
private static PartitionValue getPartitionValue(FieldSchema partitionCol, String value) {
  final TypeInfo typeInfo = TypeInfoUtils.getTypeInfoFromTypeString(partitionCol.getType());
  PartitionValue out = new PartitionValue();
  out.setColumn(partitionCol.getName());

  if("__HIVE_DEFAULT_PARTITION__".equals(value)){
    return out;
  }

  switch (typeInfo.getCategory()) {
    case PRIMITIVE:
      final PrimitiveTypeInfo primitiveTypeInfo = (PrimitiveTypeInfo) typeInfo;
      switch (primitiveTypeInfo.getPrimitiveCategory()) {
        case BINARY:
          return out.setBinaryValue(ByteString.copyFrom(value.getBytes()));
        case BOOLEAN:
          return out.setBitValue(Boolean.parseBoolean(value));
        case DOUBLE:
          return out.setDoubleValue(Double.parseDouble(value));
        case FLOAT:
          return out.setFloatValue(Float.parseFloat(value));
        case BYTE:
        case SHORT:
        case INT:
          return out.setIntValue(Integer.parseInt(value));
        case LONG:
          return out.setLongValue(Long.parseLong(value));
        case STRING:
        case VARCHAR:
          return out.setStringValue(value);
        case CHAR:
          return out.setStringValue(value.trim());
        case TIMESTAMP:
          return out.setLongValue(DateTimes.toMillisFromJdbcTimestamp(value));
        case DATE:
          return out.setLongValue(DateTimes.toMillisFromJdbcDate(value));
        case DECIMAL:
          DecimalTypeInfo decimalTypeInfo = (DecimalTypeInfo) typeInfo;
          if(decimalTypeInfo.getPrecision() > 38){
            throw UserException.unsupportedError()
              .message("Dremio only supports decimals up to 38 digits in precision. This Hive table has a partition value with precision of %d digits.", decimalTypeInfo.getPrecision())
              .build(logger);
          }
          HiveDecimal decimal = HiveDecimalUtils.enforcePrecisionScale(HiveDecimal.create(value), decimalTypeInfo.precision(), decimalTypeInfo.scale());
          final BigDecimal original = decimal.bigDecimalValue();
          // we can't just use unscaledValue() since BigDecimal doesn't store trailing zeroes and we need to ensure decoding includes the correct scale.
          final BigInteger unscaled = original.movePointRight(decimalTypeInfo.scale()).unscaledValue();
          return out.setBinaryValue(ByteString.copyFrom(DecimalTools.signExtend16(unscaled.toByteArray())));
        default:
          HiveUtilities.throwUnsupportedHiveDataTypeError(primitiveTypeInfo.getPrimitiveCategory().toString());
      }
    default:
      HiveUtilities.throwUnsupportedHiveDataTypeError(typeInfo.getCategory().toString());
  }

  return null; // unreachable
}
 
Developer: dremio | Project: dremio-oss | Lines: 58 | Source: DatasetBuilder.java
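The DECIMAL case above re-scales before extracting the unscaled value because `BigDecimal` does not store trailing zeros: a `DECIMAL(5,3)` value `12.340` may come back from `bigDecimalValue()` as `12.34` with scale 2. `movePointRight(typeScale)` restores the scale the binary encoding expects. A small sketch of that trick with plain JDK types (the class and method names are illustrative; it assumes the value's scale has already been capped at the type scale, as `enforcePrecisionScale` does upstream):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class UnscaledEncodeDemo {
    // Shift the decimal point to the declared type scale before taking the
    // unscaled value, so dropped trailing zeros are accounted for.
    static BigInteger unscaledAtScale(BigDecimal value, int typeScale) {
        return value.movePointRight(typeScale).unscaledValue();
    }

    public static void main(String[] args) {
        BigDecimal v = new BigDecimal("12.34"); // scale 2, but type scale is 3
        System.out.println(v.unscaledValue());     // 1234  (wrong for scale 3)
        System.out.println(unscaledAtScale(v, 3)); // 12340 (correct)
    }
}
```

Using `unscaledValue()` directly would encode `1234` here, which decodes as `1.234` at scale 3 instead of `12.340`.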

Example 3: readBigDecimal

import org.apache.hadoop.hive.common.type.HiveDecimal; // import the class this method depends on
private static BigDecimal readBigDecimal(HiveDecimalWritable hiveDecimalWritable) {
  HiveDecimal hiveDecimal = hiveDecimalWritable.getHiveDecimal();
  return hiveDecimal.bigDecimalValue();
}
 
Developer: axbaretto | Project: flink | Lines: 5 | Source: OrcUtils.java
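One pitfall when consuming values from a helper like `readBigDecimal`: `BigDecimal.equals` compares both the unscaled value and the scale, so numerically identical decimals with different scales are not `equals`. Use `compareTo` for numeric comparison. A short plain-JDK illustration (class and method names are illustrative):

```java
import java.math.BigDecimal;

public class DecimalCompareDemo {
    // equals() requires identical unscaled value AND scale; compareTo()
    // compares numeric value only, which is usually what you want.
    static boolean sameNumber(BigDecimal a, BigDecimal b) {
        return a.compareTo(b) == 0;
    }

    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("1.0");  // scale 1
        BigDecimal b = new BigDecimal("1.00"); // scale 2
        System.out.println(a.equals(b));      // false
        System.out.println(sameNumber(a, b)); // true
    }
}
```

This matters because, as Example 2 notes, the scale of a `BigDecimal` returned by `bigDecimalValue()` depends on how many trailing zeros the value happened to have, not on the declared column type.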


Note: The org.apache.hadoop.hive.common.type.HiveDecimal.bigDecimalValue examples in this article were compiled from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from open-source projects contributed by their respective authors, who retain copyright. Please consult each project's license before distributing or using the code; do not reproduce without permission.