

Java DataException Class Code Examples

This article collects typical usage examples of the Java class org.apache.kafka.connect.errors.DataException. If you are wondering what the DataException class is for, how to use it, or what real-world usages look like, the selected code examples below may help.


The DataException class belongs to the org.apache.kafka.connect.errors package. The sections below show 15 code examples that use the DataException class, sorted by popularity by default.
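Before the examples, here is a minimal, hypothetical sketch (written for this article, not taken from any of the projects below) of the most common pattern around DataException: catch a lower-level parsing error and rethrow it as a DataException so that Kafka Connect treats it as a data or schema problem. The class and field names are illustrative assumptions.

import org.apache.kafka.connect.errors.DataException;

public class IntFieldParser {
    // Hypothetical helper: parse a string field into an int, wrapping failures in DataException.
    public static int parseIntField(String fieldName, String raw) {
        try {
            return Integer.parseInt(raw);
        } catch (NumberFormatException e) {
            // DataException extends ConnectException and signals an unrecoverable data problem.
            throw new DataException("Field '" + fieldName + "' is not a valid integer: " + raw, e);
        }
    }
}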

Example 1: toConnectData

import org.apache.kafka.connect.errors.DataException; // import the required package/class
@Override
public SchemaAndValue toConnectData(String topic, byte[] value) {
    JsonNode jsonValue;
    try {
        jsonValue = deserializer.deserialize(topic, value);
    } catch (SerializationException e) {
        throw new DataException("Converting byte[] to Kafka Connect data failed due to serialization error: ", e);
    }

    if (enableSchemas && (jsonValue == null || !jsonValue.isObject() || jsonValue.size() != 2 || !jsonValue.has("schema") || !jsonValue.has("payload")))
        throw new DataException("JsonConverter with schemas.enable requires \"schema\" and \"payload\" fields and may not contain additional fields." +
                " If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration.");

    // The deserialized data should either be an envelope object containing the schema and the payload or the schema
    // was stripped during serialization and we need to fill in an all-encompassing schema.
    if (!enableSchemas) {
        ObjectNode envelope = JsonNodeFactory.instance.objectNode();
        envelope.set("schema", null);
        envelope.set("payload", jsonValue);
        jsonValue = envelope;
    }

    return jsonToConnect(jsonValue);
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 25, Source: JsonConverter.java
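For context, a hedged usage sketch of the converter above (my own illustration, not part of the Kafka sources; the topic name and payloads are assumptions): with schemas.enable=true the bytes must contain a schema/payload envelope, while plain JSON triggers the DataException thrown in the snippet.

import java.nio.charset.StandardCharsets;
import java.util.Collections;
import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.errors.DataException;
import org.apache.kafka.connect.json.JsonConverter;

public class JsonConverterDemo {
    public static void main(String[] args) {
        JsonConverter converter = new JsonConverter();
        converter.configure(Collections.singletonMap("schemas.enable", "true"), false); // value converter

        // Envelope with "schema" and "payload" fields: accepted.
        byte[] envelope = "{\"schema\":{\"type\":\"int32\",\"optional\":false},\"payload\":42}"
                .getBytes(StandardCharsets.UTF_8);
        SchemaAndValue ok = converter.toConnectData("my-topic", envelope);
        System.out.println(ok.value()); // 42

        // Plain JSON without the envelope: rejected with the DataException seen above.
        try {
            converter.toConnectData("my-topic", "{\"answer\":42}".getBytes(StandardCharsets.UTF_8));
        } catch (DataException e) {
            // Either wrap the data in an envelope or set schemas.enable=false.
        }
    }
}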

Example 2: readerWithProjection

import org.apache.kafka.connect.errors.DataException; // import the required package/class
@Test(expected = DataException.class)
public void readerWithProjection() throws Throwable {
    Map<String, Object> cfg = new HashMap<String, Object>() {{
        put(ParquetFileReader.FILE_READER_PARQUET_PROJECTION, projectionSchema.toString());
    }};
    reader = getReader(FileSystem.newInstance(fsUri, new Configuration()), dataFile, cfg);
    while (reader.hasNext()) {
        Struct record = reader.next();
        assertNotNull(record.schema().field(FIELD_INDEX));
        assertNotNull(record.schema().field(FIELD_NAME));
        assertNull(record.schema().field(FIELD_SURNAME));
    }

    reader = getReader(FileSystem.newInstance(fsUri, new Configuration()), dataFile, cfg);
    readAllData();
}
 
Developer: mmolimar, Project: kafka-connect-fs, Lines: 17, Source: ParquetFileReaderTest.java

Example 3: toAmpool

import org.apache.kafka.connect.errors.DataException; // import the required package/class
public Object toAmpool(Object data, Schema fieldSchema) {
  if(!fieldSchema.isOptional()) {

    if(data == null)
      throw new DataException("error: schema not optional but data was null");

    return toAmpool(data);
  }

  if(data != null) {
    return toAmpool(data);
  }

  if(fieldSchema.defaultValue() != null) {
    return toAmpool(fieldSchema.defaultValue());
  }

  return null;
}
 
Developer: ampool, Project: monarch, Lines: 20, Source: SinkFieldConverter.java

Example 4: buildWithSchema

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private void buildWithSchema(Struct record, String fieldNamePrefix, Struct newRecord) {
    for (Field field : record.schema().fields()) {
        final String fieldName = fieldName(fieldNamePrefix, field.name());
        switch (field.schema().type()) {
            case INT8:
            case INT16:
            case INT32:
            case INT64:
            case FLOAT32:
            case FLOAT64:
            case BOOLEAN:
            case STRING:
            case BYTES:
                newRecord.put(fieldName, record.get(field));
                break;
            case STRUCT:
                buildWithSchema(record.getStruct(field.name()), fieldName, newRecord);
                break;
            default:
                throw new DataException("Flatten transformation does not support " + field.schema().type()
                        + " for record without schemas (for field " + fieldName + ").");
        }
    }
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 25, Source: Flatten.java
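A hedged, illustrative sketch of how this transformation is exercised end to end (my own example, not from the Kafka sources; the schema, field names, and delimiter are assumptions): a nested STRUCT is flattened into prefixed field names, while a field of an unsupported type such as ARRAY would end in the DataException above.

import java.util.Collections;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.transforms.Flatten;

public class FlattenDemo {
    public static void main(String[] args) {
        Schema addressSchema = SchemaBuilder.struct().field("city", Schema.STRING_SCHEMA).build();
        Schema valueSchema = SchemaBuilder.struct()
                .field("name", Schema.STRING_SCHEMA)
                .field("address", addressSchema)
                .build();
        Struct value = new Struct(valueSchema)
                .put("name", "alice")
                .put("address", new Struct(addressSchema).put("city", "Berlin"));

        Flatten<SinkRecord> flatten = new Flatten.Value<>();
        flatten.configure(Collections.singletonMap("delimiter", "_"));

        // The flattened value exposes the fields "name" and "address_city".
        SinkRecord flattened = flatten.apply(
                new SinkRecord("topic", 0, null, null, valueSchema, value, 0));
        System.out.println(((Struct) flattened.value()).get("address_city"));
    }
}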

Example 5: validateFormat

import org.apache.kafka.connect.errors.DataException; // import the required package/class
public static <K, V> void validateFormat(Map<K, V> offsetData) {
    // Both keys and values for offsets may be null. For values, this is a useful way to delete offsets or indicate
    // that there's no usable concept of offsets in your source system.
    if (offsetData == null)
        return;

    for (Map.Entry<K, V> entry : offsetData.entrySet()) {
        if (!(entry.getKey() instanceof String))
            throw new DataException("Offsets may only use String keys");

        Object value = entry.getValue();
        if (value == null)
            continue;
        Schema.Type schemaType = ConnectSchema.schemaType(value.getClass());
        if (schemaType == null)
            throw new DataException("Offsets may only contain primitive types as values, but field " + entry.getKey() + " contains " + value.getClass());
        if (!schemaType.isPrimitive())
            throw new DataException("Offsets may only contain primitive types as values, but field " + entry.getKey() + " contains " + schemaType);
    }
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 21, Source: OffsetUtils.java
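A hedged sketch of what this check accepts and rejects (illustrative values; it assumes OffsetUtils from the Connect runtime is accessible on your classpath):

import java.util.Collections;
import org.apache.kafka.connect.errors.DataException;
import org.apache.kafka.connect.storage.OffsetUtils;

public class OffsetFormatDemo {
    public static void main(String[] args) {
        // String keys with primitive values pass validation.
        OffsetUtils.validateFormat(Collections.singletonMap("position", 42L));

        // A non-String key is rejected.
        try {
            OffsetUtils.validateFormat(Collections.singletonMap(123, "value"));
        } catch (DataException e) {
            System.out.println(e.getMessage()); // "Offsets may only use String keys"
        }
    }
}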

Example 6: processValue

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private Object processValue(Schema schema, Object value) {
    switch (schema.type()) {
        case BOOLEAN:
        case FLOAT32:
        case FLOAT64:
        case INT8:
        case INT16:
        case INT32:
        case INT64:
        case BYTES:
        case STRING:
            return value;
        case MAP:
        case ARRAY:
        case STRUCT:
            throw new DataException("Unsupported schema type: " + schema.type());
        default:
            throw new DataException("Unknown schema type: " + schema.type());
    }
}
 
Developer: fluent, Project: kafka-connect-fluentd, Lines: 21, Source: SchemafulRecordConverter.java

Example 7: convert

import org.apache.kafka.connect.errors.DataException; // import the required package/class
@Override
@SuppressWarnings("unchecked")
public FluentdEventRecord convert(Schema schema, Object value) {
    if (value == null) {
        return null;
    }

    Map<String, Object> record;

    try {
        record = new ObjectMapper().readValue((String) value, LinkedHashMap.class);
    } catch (IOException e) {
        throw new DataException(e);
    }
    return new FluentdEventRecord(null, record);
}
 
Developer: fluent, Project: kafka-connect-fluentd, Lines: 17, Source: RawJsonStringRecordConverter.java

Example 8: convert

import org.apache.kafka.connect.errors.DataException; // import the required package/class
@Override
public Object convert(Schema schema, JsonNode value) {
    if (!value.isObject())
        throw new DataException("Structs should be encoded as JSON objects, but found " + value.getNodeType());

    // We only have ISchema here but need Schema, so we need to materialize the actual schema. Using ISchema
    // avoids having to materialize the schema for non-Struct types but it cannot be avoided for Structs since
    // they require a schema to be provided at construction. However, the schema is only a SchemaBuilder during
    // translation of schemas to JSON; during the more common translation of data to JSON, the call to schema.schema()
    // just returns the schema Object and has no overhead.
    Struct result = new Struct(schema.schema());
    for (Field field : schema.fields())
        result.put(field, convertToConnect(field.schema(), value.get(field.name())));

    return result;
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 17, Source: JsonConverter.java

Example 9: headers

import org.apache.kafka.connect.errors.DataException; // import the required package/class
static Map<String, Struct> headers(BasicProperties basicProperties) {
  Map<String, Object> input = basicProperties.getHeaders();
  Map<String, Struct> results = new LinkedHashMap<>();
  if (null != input) {
    for (Map.Entry<String, Object> kvp : input.entrySet()) {
      log.trace("headers() - key = '{}' value= '{}'", kvp.getKey(), kvp.getValue());
      final String field;
      final Object headerValue;

      if (kvp.getValue() instanceof LongString) {
        headerValue = kvp.getValue().toString();
      } else {
        headerValue = kvp.getValue();
      }

      if (!FIELD_LOOKUP.containsKey(headerValue.getClass())) {
        throw new DataException(
            String.format("Could not determine the type for field '%s' type '%s'", kvp.getKey(), headerValue.getClass().getName())
        );
      } else {
        field = FIELD_LOOKUP.get(headerValue.getClass());
      }

      log.trace("headers() - Storing value for header in field = '{}' as {}", field, field);

      Struct value = new Struct(SCHEMA_HEADER_VALUE)
          .put("type", field)
          .put(field, headerValue);
      results.put(kvp.getKey(), value);
    }
  }
  return results;
}
 
Developer: jcustenborder, Project: kafka-connect-rabbitmq, Lines: 34, Source: MessageConverter.java

Example 10: castValueToType

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private static Object castValueToType(Object value, Schema.Type targetType) {
    try {
        if (value == null) return null;

        Schema.Type inferredType = ConnectSchema.schemaType(value.getClass());
        if (inferredType == null) {
            throw new DataException("Cast transformation was passed a value of type " + value.getClass()
                    + " which is not supported by Connect's data API");
        }
        // Ensure the type we are trying to cast from is supported
        validCastType(inferredType, FieldType.INPUT);

        switch (targetType) {
            case INT8:
                return castToInt8(value);
            case INT16:
                return castToInt16(value);
            case INT32:
                return castToInt32(value);
            case INT64:
                return castToInt64(value);
            case FLOAT32:
                return castToFloat32(value);
            case FLOAT64:
                return castToFloat64(value);
            case BOOLEAN:
                return castToBoolean(value);
            case STRING:
                return castToString(value);
            default:
                throw new DataException(targetType.toString() + " is not supported in the Cast transformation.");
        }
    } catch (NumberFormatException e) {
        throw new DataException("Value (" + value.toString() + ") was out of range for requested data type", e);
    }
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 37, Source: Cast.java
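For orientation, a hedged example of invoking the Cast transformation that calls into this method (my own illustration; the topic, spec, and values are assumptions): a whole-value cast of a schemaless String to int32 succeeds for "42", while a non-numeric string would surface the NumberFormatException above as a DataException.

import java.util.Collections;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.transforms.Cast;

public class CastDemo {
    public static void main(String[] args) {
        Cast<SinkRecord> cast = new Cast.Value<>();
        // Whole-value cast: the spec is just a target type rather than field:type pairs.
        cast.configure(Collections.singletonMap("spec", "int32"));

        SinkRecord out = cast.apply(new SinkRecord("topic", 0, null, null, null, "42", 0));
        System.out.println(out.value()); // 42 as an Integer

        // A value such as "forty-two" would be rethrown here as a DataException.
    }
}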

Example 11: castToInt8

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private static byte castToInt8(Object value) {
    if (value instanceof Number)
        return ((Number) value).byteValue();
    else if (value instanceof Boolean)
        return ((boolean) value) ? (byte) 1 : (byte) 0;
    else if (value instanceof String)
        return Byte.parseByte((String) value);
    else
        throw new DataException("Unexpected type in Cast transformation: " + value.getClass());
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 11, Source: Cast.java

Example 12: castToInt16

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private static short castToInt16(Object value) {
    if (value instanceof Number)
        return ((Number) value).shortValue();
    else if (value instanceof Boolean)
        return ((boolean) value) ? (short) 1 : (short) 0;
    else if (value instanceof String)
        return Short.parseShort((String) value);
    else
        throw new DataException("Unexpected type in Cast transformation: " + value.getClass());
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 11, Source: Cast.java

Example 13: castToInt32

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private static int castToInt32(Object value) {
    if (value instanceof Number)
        return ((Number) value).intValue();
    else if (value instanceof Boolean)
        return ((boolean) value) ? 1 : 0;
    else if (value instanceof String)
        return Integer.parseInt((String) value);
    else
        throw new DataException("Unexpected type in Cast transformation: " + value.getClass());
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 11, Source: Cast.java

Example 14: castToInt64

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private static long castToInt64(Object value) {
    if (value instanceof Number)
        return ((Number) value).longValue();
    else if (value instanceof Boolean)
        return ((boolean) value) ? (long) 1 : (long) 0;
    else if (value instanceof String)
        return Long.parseLong((String) value);
    else
        throw new DataException("Unexpected type in Cast transformation: " + value.getClass());
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 11, Source: Cast.java

Example 15: castToFloat64

import org.apache.kafka.connect.errors.DataException; // import the required package/class
private static double castToFloat64(Object value) {
    if (value instanceof Number)
        return ((Number) value).doubleValue();
    else if (value instanceof Boolean)
        return ((boolean) value) ? 1. : 0.;
    else if (value instanceof String)
        return Double.parseDouble((String) value);
    else
        throw new DataException("Unexpected type in Cast transformation: " + value.getClass());
}
 
Developer: YMCoding, Project: kafka-0.11.0.0-src-with-comment, Lines: 11, Source: Cast.java


Note: The org.apache.kafka.connect.errors.DataException examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective developers, and the copyright of the source code remains with the original authors. Please consult the corresponding project's license before distributing or using the code; do not reproduce without permission.