

Java ConnectSchema Class Code Examples

This article collects typical usage examples of the Java class org.apache.kafka.connect.data.ConnectSchema. If you are wondering what the ConnectSchema class is for, how to use it, or what real code that uses it looks like, the curated examples below should help.


The ConnectSchema class belongs to the org.apache.kafka.connect.data package. Six code examples of the class are shown below, ordered by popularity.
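Before the per-project examples, here is a minimal, self-contained sketch (the class name and literal values are illustrative, not taken from any project below) of two ConnectSchema entry points that recur in the examples: the single-argument constructor for primitive types and the static validateValue check.

import org.apache.kafka.connect.data.ConnectSchema;
import org.apache.kafka.connect.data.Schema;

public class ConnectSchemaQuickStart {
    public static void main(String[] args) {
        // Primitive schemas can be created directly from a Schema.Type.
        ConnectSchema int32Schema = new ConnectSchema(Schema.Type.INT32);

        // validateValue throws DataException when the value does not match the schema.
        ConnectSchema.validateValue(int32Schema, 42);      // passes
        ConnectSchema.validateValue(int32Schema, "oops");  // throws DataException
    }
}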

Example 1: apply

import org.apache.kafka.connect.data.ConnectSchema; // import the required package/class
@Override
public R apply(R record) {
    final Schema schema = operatingSchema(record);
    requireSchema(schema, "updating schema metadata");
    final boolean isArray = schema.type() == Schema.Type.ARRAY;
    final boolean isMap = schema.type() == Schema.Type.MAP;
    final Schema updatedSchema = new ConnectSchema(
            schema.type(),
            schema.isOptional(),
            schema.defaultValue(),
            schemaName != null ? schemaName : schema.name(),
            schemaVersion != null ? schemaVersion : schema.version(),
            schema.doc(),
            schema.parameters(),
            schema.fields(),
            isMap ? schema.keySchema() : null,
            isMap || isArray ? schema.valueSchema() : null
    );
    return newRecord(record, updatedSchema);
}
 
Developer: YMCoding | Project: kafka-0.11.0.0-src-with-comment | Lines: 21 | Source: SetSchemaMetadata.java
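The apply method above comes from the SetSchemaMetadata single message transform (SMT). As a follow-up, here is a hedged sketch of driving it programmatically; the topic, field, and schema names are made up for illustration.

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.transforms.SetSchemaMetadata;

public class SetSchemaMetadataDemo {
    public static void main(String[] args) {
        // Rename the value schema and bump its version on every record passing through.
        Map<String, Object> config = new HashMap<>();
        config.put("schema.name", "com.example.Order");   // illustrative name
        config.put("schema.version", 2);
        SetSchemaMetadata<SourceRecord> smt = new SetSchemaMetadata.Value<>();
        smt.configure(config);

        Schema valueSchema = SchemaBuilder.struct().name("legacy.Order").version(1)
                .field("id", Schema.INT64_SCHEMA)
                .build();
        Struct value = new Struct(valueSchema).put("id", 1L);
        SourceRecord record = new SourceRecord(null, null, "orders", valueSchema, value);

        SourceRecord updated = smt.apply(record);
        System.out.println(updated.valueSchema().name() + " v" + updated.valueSchema().version());
        smt.close();
    }
}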

Example 2: validateFormat

import org.apache.kafka.connect.data.ConnectSchema; // import the required package/class
public static <K, V> void validateFormat(Map<K, V> offsetData) {
    // Both keys and values for offsets may be null. For values, this is a useful way to delete offsets or indicate
    // that there's no usable concept of offsets in your source system.
    if (offsetData == null)
        return;

    for (Map.Entry<K, V> entry : offsetData.entrySet()) {
        if (!(entry.getKey() instanceof String))
            throw new DataException("Offsets may only use String keys");

        Object value = entry.getValue();
        if (value == null)
            continue;
        Schema.Type schemaType = ConnectSchema.schemaType(value.getClass());
        if (schemaType == null)
            throw new DataException("Offsets may only contain primitive types as values, but field " + entry.getKey() + " contains " + value.getClass());
        if (!schemaType.isPrimitive())
            throw new DataException("Offsets may only contain primitive types as values, but field " + entry.getKey() + " contains " + schemaType);
    }
}
 
Developer: YMCoding | Project: kafka-0.11.0.0-src-with-comment | Lines: 21 | Source: OffsetUtils.java
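validateFormat lives in Connect's runtime internals, so rather than calling it directly, the sketch below reproduces the same check in standalone form using ConnectSchema.schemaType and Schema.Type.isPrimitive; the method and field names are illustrative.

import java.util.Collections;
import org.apache.kafka.connect.data.ConnectSchema;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.errors.DataException;

public class OffsetValueCheck {
    // Offset values may be null (a useful way to delete an offset), but any
    // non-null value must map to a primitive Connect schema type.
    static void check(String key, Object value) {
        if (value == null)
            return;
        Schema.Type type = ConnectSchema.schemaType(value.getClass());
        if (type == null || !type.isPrimitive())
            throw new DataException("Offset field " + key + " has unsupported type " + value.getClass());
    }

    public static void main(String[] args) {
        check("position", 42L);                              // INT64: allowed
        check("deleted", null);                               // null: allowed
        check("nested", Collections.singletonMap("a", 1));    // MAP: throws DataException
    }
}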

Example 3: castValueToType

import org.apache.kafka.connect.data.ConnectSchema; // import the required package/class
private static Object castValueToType(Object value, Schema.Type targetType) {
    try {
        if (value == null) return null;

        Schema.Type inferredType = ConnectSchema.schemaType(value.getClass());
        if (inferredType == null) {
            throw new DataException("Cast transformation was passed a value of type " + value.getClass()
                    + " which is not supported by Connect's data API");
        }
        // Ensure the type we are trying to cast from is supported
        validCastType(inferredType, FieldType.INPUT);

        switch (targetType) {
            case INT8:
                return castToInt8(value);
            case INT16:
                return castToInt16(value);
            case INT32:
                return castToInt32(value);
            case INT64:
                return castToInt64(value);
            case FLOAT32:
                return castToFloat32(value);
            case FLOAT64:
                return castToFloat64(value);
            case BOOLEAN:
                return castToBoolean(value);
            case STRING:
                return castToString(value);
            default:
                throw new DataException(targetType.toString() + " is not supported in the Cast transformation.");
        }
    } catch (NumberFormatException e) {
        throw new DataException("Value (" + value.toString() + ") was out of range for requested data type", e);
    }
}
 
Developer: YMCoding | Project: kafka-0.11.0.0-src-with-comment | Lines: 37 | Source: Cast.java
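castValueToType is a private helper of the Cast SMT; users normally reach it through the transform's spec configuration. Below is a hedged sketch of a whole-value cast on a schemaless record, with a made-up topic name and value.

import java.util.Collections;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.transforms.Cast;

public class CastDemo {
    public static void main(String[] args) {
        // A "spec" without field names casts the entire record value.
        Cast<SourceRecord> cast = new Cast.Value<>();
        cast.configure(Collections.singletonMap("spec", "string"));

        SourceRecord record = new SourceRecord(null, null, "metrics", null, 42);
        SourceRecord casted = cast.apply(record);
        System.out.println(casted.value() + " (" + casted.value().getClass().getSimpleName() + ")");
        cast.close();
    }
}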

Example 4: applySchemaless

import org.apache.kafka.connect.data.ConnectSchema; // import the required package/class
private void applySchemaless(Map<String, Object> originalRecord, String fieldNamePrefix, Map<String, Object> newRecord) {
    for (Map.Entry<String, Object> entry : originalRecord.entrySet()) {
        final String fieldName = fieldName(fieldNamePrefix, entry.getKey());
        Object value = entry.getValue();
        if (value == null) {
            newRecord.put(fieldName, null);
            continue;  // keep flattening the remaining entries instead of aborting the whole map
        }

        Schema.Type inferredType = ConnectSchema.schemaType(value.getClass());
        if (inferredType == null) {
            throw new DataException("Flatten transformation was passed a value of type " + value.getClass()
                    + " which is not supported by Connect's data API");
        }
        switch (inferredType) {
            case INT8:
            case INT16:
            case INT32:
            case INT64:
            case FLOAT32:
            case FLOAT64:
            case BOOLEAN:
            case STRING:
            case BYTES:
                newRecord.put(fieldName, entry.getValue());
                break;
            case MAP:
                final Map<String, Object> fieldValue = requireMap(entry.getValue(), PURPOSE);
                applySchemaless(fieldValue, fieldName, newRecord);
                break;
            default:
                throw new DataException("Flatten transformation does not support " + entry.getValue().getClass()
                        + " for record without schemas (for field " + fieldName + ").");
        }
    }
}
 
Developer: YMCoding | Project: kafka-0.11.0.0-src-with-comment | Lines: 37 | Source: Flatten.java
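applySchemaless recurses through nested maps and is normally exercised via the Flatten SMT configured on a connector. A hedged sketch with a made-up, schemaless nested value:

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.transforms.Flatten;

public class FlattenDemo {
    public static void main(String[] args) {
        Flatten<SourceRecord> flatten = new Flatten.Value<>();
        flatten.configure(Collections.singletonMap("delimiter", "_"));  // default delimiter is "."

        Map<String, Object> address = new HashMap<>();
        address.put("city", "Berlin");
        address.put("zip", "10115");
        Map<String, Object> value = new HashMap<>();
        value.put("name", "alice");
        value.put("address", address);

        SourceRecord record = new SourceRecord(null, null, "users", null, value);
        SourceRecord flattened = flatten.apply(record);
        // Nested keys are joined with the delimiter, e.g. address_city and address_zip.
        System.out.println(flattened.value());
        flatten.close();
    }
}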

Example 5: convertKey

import org.apache.kafka.connect.data.ConnectSchema; // import the required package/class
private String convertKey(Schema keySchema, Object key) {
  if (key == null) {
    throw new ConnectException("Key is used as document id and can not be null.");
  }

  final Schema.Type schemaType;
  if (keySchema == null) {
    schemaType = ConnectSchema.schemaType(key.getClass());
    if (schemaType == null) {
      throw new DataException(
          "Java class "
          + key.getClass()
          + " does not have corresponding schema type."
      );
    }
  } else {
    schemaType = keySchema.type();
  }

  switch (schemaType) {
    case INT8:
    case INT16:
    case INT32:
    case INT64:
    case STRING:
      return String.valueOf(key);
    default:
      throw new DataException(schemaType.name() + " is not supported as the document id.");
  }
}
 
Developer: confluentinc | Project: kafka-connect-elasticsearch | Lines: 31 | Source: DataConverter.java

Example 6: testEnforceFieldTypeTaking1And1WithNull

import org.apache.kafka.connect.data.ConnectSchema; // import the required package/class
@Test
public void testEnforceFieldTypeTaking1And1WithNull() {
    SchemaBuilder schemaBuilder = SchemaBuilder.struct();
    GenericRowValueTypeEnforcer genericRowValueTypeEnforcer = new GenericRowValueTypeEnforcer(schemaBuilder);
    Schema.Type schemaType = Schema.Type.BOOLEAN;
    ConnectSchema connectSchema = new ConnectSchema(schemaType);

    try {
        genericRowValueTypeEnforcer.enforceFieldType(connectSchema, null);
        fail("Expecting exception: KsqlException");
    } catch (KsqlException e) {
        assertEquals(GenericRowValueTypeEnforcer.class.getName(), e.getStackTrace()[0].getClassName());
    }
}
 
Developer: confluentinc | Project: ksql | Lines: 15 | Source: GenericRowValueTypeEnforcerTest.java


Note: The org.apache.kafka.connect.data.ConnectSchema class examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective authors; copyright of the source code remains with the original authors, and any distribution or use should follow the License of the corresponding project. Please do not reproduce this article without permission.