

Java StringUtils Class Code Examples

This article collects typical usage examples of the Java class org.elasticsearch.hadoop.util.StringUtils. If you have been wondering what exactly the StringUtils class does, how to use it, or where to find usage examples, the curated class code examples below may help.


The StringUtils class belongs to the org.elasticsearch.hadoop.util package. Fifteen code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your ratings help the system recommend better Java code examples.

Example 1: openForRead

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
@Override
public TupleEntryIterator openForRead(FlowProcess<Properties> flowProcess, ScrollQuery input) throws IOException {
    if (input == null) {
        // get original copy
        Settings settings = CascadingUtils.addDefaultsToSettings(CascadingUtils.extractOriginalProperties(flowProcess.getConfigCopy()), tapProperties, log);

        // will be closed when the query is finished
        RestRepository client = new RestRepository(settings);
        Field mapping = client.getMapping();
        Collection<String> fields = CascadingUtils.fieldToAlias(settings, getSourceFields());

        // validate if possible
        FieldPresenceValidation validation = settings.getReadFieldExistanceValidation();
        if (validation.isRequired()) {
            MappingUtils.validateMapping(fields, mapping, validation, log);
        }

        input = QueryBuilder.query(settings).fields(StringUtils.concatenateAndUriEncode(fields, ",")).
                build(client, new ScrollReader(new ScrollReaderConfig(new JdkValueReader(), mapping, settings)));
    }
    return new TupleEntrySchemeIterator<Properties, ScrollQuery>(flowProcess, getScheme(), input, getIdentifier());
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 23, Source: EsLocalTap.java
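
The workhorse StringUtils call above is concatenateAndUriEncode, which joins the projected field names with "," and URI-encodes the result so it can be embedded in the scroll query URL. A minimal standalone sketch of just that call, assuming only the signature visible above (the field names are invented for illustration):

import java.util.Arrays;
import java.util.Collection;

import org.elasticsearch.hadoop.util.StringUtils;

public class ConcatenateAndUriEncodeDemo {
    public static void main(String[] args) {
        // hypothetical field list, standing in for CascadingUtils.fieldToAlias(...)
        Collection<String> fields = Arrays.asList("user name", "timestamp");
        // joined with "," and URI-encoded (the space should come out escaped)
        String encoded = StringUtils.concatenateAndUriEncode(fields, ",");
        System.out.println(encoded);
    }
}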

Example 2: structObjectInspector

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
static StandardStructObjectInspector structObjectInspector(Properties tableProperties) {
    // extract column info - don't use Hive constants as they were renamed in 0.9, breaking compatibility
    // the column names are saved because the inspector handed to #serialize doesn't preserve them (maybe because it's an external table)
    // use the class since StructType requires it ...
    List<String> columnNames = StringUtils.tokenize(tableProperties.getProperty(HiveConstants.COLUMNS), ",");
    List<TypeInfo> colTypes = TypeInfoUtils.getTypeInfosFromTypeString(tableProperties.getProperty(HiveConstants.COLUMNS_TYPES));

    // create a standard writable Object Inspector - used later on by serialization/deserialization
    List<ObjectInspector> inspectors = new ArrayList<ObjectInspector>();

    for (TypeInfo typeInfo : colTypes) {
        inspectors.add(TypeInfoUtils.getStandardWritableObjectInspectorFromTypeInfo(typeInfo));
    }

    return ObjectInspectorFactory.getStandardStructObjectInspector(columnNames, inspectors);
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 17, Source: HiveUtils.java
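
StringUtils.tokenize does the real work above, splitting Hive's comma-separated COLUMNS property into individual column names. A minimal sketch of the call in isolation, assuming the two-argument signature used above (the property value is made up):

import java.util.List;

import org.elasticsearch.hadoop.util.StringUtils;

public class TokenizeDemo {
    public static void main(String[] args) {
        // stand-in for tableProperties.getProperty(HiveConstants.COLUMNS)
        String columns = "id,name,price";
        // splits on the "," delimiter -> [id, name, price]
        List<String> columnNames = StringUtils.tokenize(columns, ",");
        System.out.println(columnNames);
    }
}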

Example 3: getSplits

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
@Override
public FileSplit[] getSplits(JobConf job, int numSplits) throws IOException {
    // first, merge input table properties (since there's no access to them ...)
    Settings settings = HadoopSettingsManager.loadFrom(job);
    //settings.merge(IOUtils.propsFromString(settings.getProperty(HiveConstants.INPUT_TBL_PROPERTIES)));

    Log log = LogFactory.getLog(getClass());
    // move on to initialization
    InitializationUtils.setValueReaderIfNotSet(settings, HiveValueReader.class, log);
    settings.setProperty(InternalConfigurationOptions.INTERNAL_ES_TARGET_FIELDS, StringUtils.concatenateAndUriEncode(HiveUtils.columnToAlias(settings), ","));
    // set read resource
    settings.setResourceRead(settings.getResourceRead());
    HiveUtils.init(settings, log);

    // decorate original splits as FileSplit
    InputSplit[] shardSplits = super.getSplits(job, numSplits);
    FileSplit[] wrappers = new FileSplit[shardSplits.length];
    Path path = new Path(job.get(HiveConstants.TABLE_LOCATION));
    for (int i = 0; i < wrappers.length; i++) {
        wrappers[i] = new EsHiveSplit(shardSplits[i], path);
    }
    return wrappers;
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 24, Source: EsHiveInputFormat.java

Example 4: createPig

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
protected PigServer createPig() throws ExecException {
    HdpBootstrap.hackHadoopStagingOnWin();

    Properties properties = HdpBootstrap.asProperties(QueryTestParams.provisionQueries(HdpBootstrap.hadoopConfig()));
    String pigHost = properties.getProperty("pig");
    // remote Pig instance
    if (StringUtils.hasText(pigHost) && !"local".equals(pigHost)) {
        LogFactory.getLog(PigWrapper.class).info("Executing Pig in Map/Reduce mode");
        return new PigServer(ExecType.MAPREDUCE, properties);
    }

    // use local instance
    LogFactory.getLog(PigWrapper.class).info("Executing Pig in local mode");
    properties.put("mapred.job.tracker", "local");
    return new PigServer(ExecType.LOCAL, properties);
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 17, Source: PigWrapper.java
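
The mode switch hinges on StringUtils.hasText which, judging from its uses throughout these examples, is a null/blank guard: true only for a non-null string containing at least one non-whitespace character. A minimal sketch under that assumption:

import org.elasticsearch.hadoop.util.StringUtils;

public class HasTextDemo {
    public static void main(String[] args) {
        // assumed behavior: non-null and not whitespace-only
        System.out.println(StringUtils.hasText("mapreduce-host")); // expected: true
        System.out.println(StringUtils.hasText("   "));            // expected: false
        System.out.println(StringUtils.hasText(null));             // expected: false
    }
}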

Example 5: Node

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
public Node(String id, Map<String, Object> data) {
    this.id = id;
    name = data.get("name").toString();
    Object http = data.get("http_address");
    hasHttp = (http != null);

    Map<String, Object> attributes = (Map<String, Object>) data.get("attributes");
    if (attributes != null) {
        isClient = ("false".equals(attributes.get("data")) && "false".equals(attributes.get("master")));
        isData = !"false".equals(attributes.get("data"));
    }

    if (!hasHttp) {
        return;
    }

    IpAndPort ip = StringUtils.parseIpAddress(http.toString());
    ipAddress = ip.ip;
    httpPort = ip.port;
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 21, Source: Node.java
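
StringUtils.parseIpAddress splits the node's http_address into an IpAndPort value whose public ip and port fields are read above. A minimal sketch with an invented address, assuming IpAndPort is the nested type imported alongside StringUtils:

import org.elasticsearch.hadoop.util.StringUtils;
import org.elasticsearch.hadoop.util.StringUtils.IpAndPort;

public class ParseIpAddressDemo {
    public static void main(String[] args) {
        // invented http_address value, for illustration only
        IpAndPort ip = StringUtils.parseIpAddress("192.168.1.10:9200");
        System.out.println(ip.ip);   // expected: 192.168.1.10
        System.out.println(ip.port); // expected: 9200
    }
}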

Example 6: getParentPath

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
@Override
public String getParentPath() {
    if (currentPathCached == null) {
        if (currentPath.isEmpty()) {
            currentPathCached = StringUtils.EMPTY;
        }
        else {
            StringBuilder sb = new StringBuilder();
            for (String level : currentPath) {
                sb.append(level);
                sb.append(".");
            }
            sb.setLength(sb.length() - 1);
            currentPathCached = sb.toString();
        }
    }

    return currentPathCached;
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 20, Source: JacksonJsonGenerator.java

Example 7: append

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
private void append(StringBuilder sb, List<Object> list, Object target) {
    for (Object object : list) {
        if (object instanceof FieldExtractor) {
            Object field = ((FieldExtractor) object).field(target);
            if (field == NOT_FOUND) {
                throw new EsHadoopIllegalArgumentException(String.format("Cannot find match for %s", pattern));
            }
            else {
                sb.append(StringUtils.jsonEncoding(field.toString()));
            }
        }
        else {
            sb.append(StringUtils.jsonEncoding(object.toString()));
        }
    }
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 17, Source: AbstractIndexExtractor.java
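
StringUtils.jsonEncoding escapes characters that would break a JSON string; judging by how the fragments are appended here as pieces of a larger pattern, it does not add surrounding quotes (contrast toJsonString in Example 8 below). A minimal sketch under that assumption:

import org.elasticsearch.hadoop.util.StringUtils;

public class JsonEncodingDemo {
    public static void main(String[] args) {
        // assumed behavior: escapes JSON-special characters, adds no quotes
        System.out.println(StringUtils.jsonEncoding("say \"hi\"")); // expected: say \"hi\"
    }
}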

Example 8: extractConstant

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
static Object extractConstant(String value, boolean autoQuote) {
    // check for quotes and add them automatically when needed for non-numbers
    if (autoQuote && !value.startsWith("\"") && !value.endsWith("\"")) {
        // constant values
        if (!("null".equals(value) || "true".equals(value) || "false".equals(value))) {
            // try number parsing
            if (value.startsWith("-")) {
                value = value.substring(1);
            }
            boolean isNumber = true;
            for (int i = 0; i < value.length(); i++) {
                if (!Character.isDigit(value.charAt(i))) {
                    isNumber = false;
                    break;
                }
            }
            if (!isNumber) {
                value = StringUtils.toJsonString(value);
            }

        }
    }
    return new RawJson(value);
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 25, Source: ExtractorUtils.java
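
The non-number branch falls through to StringUtils.toJsonString which, as used here, turns a raw value into a valid JSON string literal (quoted and, presumably, escaped). A minimal sketch under that assumption:

import org.elasticsearch.hadoop.util.StringUtils;

public class ToJsonStringDemo {
    public static void main(String[] args) {
        // a non-numeric, non-boolean constant gets quoted into a JSON literal
        System.out.println(StringUtils.toJsonString("pending")); // expected: "pending"
        // numbers and true/false/null are left untouched by extractConstant above
    }
}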

Example 9: doWrite

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
void doWrite(Object value) {
    // common-case - constants or JDK types
    if (value instanceof String || jsonInput || value instanceof Number || value instanceof Boolean || value == null) {
        String valueString = (value == null ? "null" : value.toString());
        if (value instanceof String && !jsonInput) {
            valueString = StringUtils.toJsonString(valueString);
        }

        pool.get().bytes(valueString);
    }
    else if (value instanceof RawJson) {
        pool.get().bytes(((RawJson) value).json());
    }
    // library specific type - use the value writer (a bit overkill but handles collections/arrays properly)
    else {
        BytesArray ba = pool.get();
        JacksonJsonGenerator generator = new JacksonJsonGenerator(new FastByteArrayOutputStream(ba));
        valueWriter.write(value, generator);
        generator.flush();
        generator.close();
    }
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 23, Source: AbstractBulkFactory.java

Example 10: write

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
@Override
public void write(DataOutput out) throws IOException {
    out.writeUTF(nodeIp);
    out.writeInt(httpPort);
    out.writeUTF(nodeId);
    out.writeUTF(nodeName);
    out.writeUTF(shardId);
    out.writeBoolean(onlyNode);
    // avoid using writeUTF since the mapping can be longer than 65K
    byte[] utf = StringUtils.toUTF(mapping);
    out.writeInt(utf.length);
    out.write(utf);
    // same goes for settings
    utf = StringUtils.toUTF(settings);
    out.writeInt(utf.length);
    out.write(utf);
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 18, Source: EsInputFormat.java

Example 11: readFields

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
@Override
public void readFields(DataInput in) throws IOException {
    nodeIp = in.readUTF();
    httpPort = in.readInt();
    nodeId = in.readUTF();
    nodeName = in.readUTF();
    shardId = in.readUTF();
    onlyNode = in.readBoolean();
    int length = in.readInt();
    byte[] utf = new byte[length];
    in.readFully(utf);
    mapping = StringUtils.asUTFString(utf);

    length = in.readInt();
    utf = new byte[length];
    in.readFully(utf);
    settings = StringUtils.asUTFString(utf);
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 19, Source: EsInputFormat.java
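
Examples 10 and 11 form a serialization round trip: toUTF encodes the mapping and settings strings as raw UTF-8 bytes, sidestepping DataOutput.writeUTF's 64KB limit, and asUTFString decodes them on the reading side. A minimal sketch of the round trip, assuming only the signatures shown above:

import org.elasticsearch.hadoop.util.StringUtils;

public class UtfRoundTripDemo {
    public static void main(String[] args) {
        String mapping = "{\"properties\":{\"name\":{\"type\":\"string\"}}}";
        // encode to UTF-8 bytes; Example 10 length-prefixes these by hand
        byte[] utf = StringUtils.toUTF(mapping);
        // decode back to the original string, as Example 11 does
        String decoded = StringUtils.asUTFString(utf);
        System.out.println(mapping.equals(decoded)); // expected: true
    }
}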

Example 12: getTaskID

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
public static TaskID getTaskID(Configuration cfg) {
    // first try with the attempt since some Hadoop versions mix the two
    String taskAttemptId = HadoopCfgUtils.getTaskAttemptId(cfg);
    if (StringUtils.hasText(taskAttemptId)) {
        try {
            return TaskAttemptID.forName(taskAttemptId).getTaskID();
        } catch (IllegalArgumentException ex) {
            // the task attempt is invalid (Tez in particular uses the wrong string - see #346)
            // try to fallback to task id
            return parseTaskIdFromTaskAttemptId(taskAttemptId);
        }
    }
    String taskIdProp = HadoopCfgUtils.getTaskId(cfg);
    // double-check task id bug in Hadoop 2.5.x
    if (StringUtils.hasText(taskIdProp) && !taskIdProp.contains("attempt")) {
        return TaskID.forName(taskIdProp);
    }
    return null;
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 20, Source: HadoopCfgUtils.java
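
The fallback logic leans on Hadoop's string format for attempt IDs. A minimal sketch of the happy path, assuming the classic attempt_<jt>_<job>_<m|r>_<task>_<attempt> format (the ID string is invented):

import org.apache.hadoop.mapred.TaskAttemptID;
import org.apache.hadoop.mapred.TaskID;

public class TaskIdDemo {
    public static void main(String[] args) {
        // invented but well-formed attempt ID
        TaskID task = TaskAttemptID.forName("attempt_200707121733_0003_m_000005_0").getTaskID();
        System.out.println(task); // expected: task_200707121733_0003_m_000005
    }
}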

Example 13: applyFilters

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
static BytesArray applyFilters(BytesArray bodyQuery, String... filters) {
    if (filters == null || filters.length == 0) {
        return bodyQuery;
    }

    String originalQuery = bodyQuery.toString();
    // remove leading/trailing { }
    int start = originalQuery.indexOf("{");
    int stop = originalQuery.lastIndexOf("}");

    String msg = String.format("Cannot apply filter(s) to what looks like an invalid DSL query (no leading/trailing { } ): '%s' ", originalQuery);
    Assert.isTrue(start >= 0, msg);
    Assert.isTrue(stop >= 0, msg);
    Assert.isTrue(stop - start > 0, msg);

    String nestedQuery = originalQuery.substring(start + 1, stop);

    // concatenate filters
    return new BytesArray(String.format(PUSH_DOWN, nestedQuery, StringUtils.concatenate(filters, ",")));
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 21, Source: QueryUtils.java
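
Unlike concatenateAndUriEncode in Example 1, plain StringUtils.concatenate just joins the filter fragments with the separator, with no encoding, so the result can be spliced straight into the PUSH_DOWN JSON template. A minimal sketch, assuming the array-plus-separator signature used above (the filter fragments are invented):

import org.elasticsearch.hadoop.util.StringUtils;

public class ConcatenateDemo {
    public static void main(String[] args) {
        // two made-up filter fragments, as pushed-down filters might look
        String[] filters = {
                "{\"term\":{\"user\":\"costin\"}}",
                "{\"exists\":{\"field\":\"tags\"}}"
        };
        // plain join, no URI or JSON encoding
        System.out.println(StringUtils.concatenate(filters, ","));
    }
}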

Example 14: setValueWriterIfNotSet

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
public static boolean setValueWriterIfNotSet(Settings settings, Class<? extends ValueWriter<?>> clazz, Log log) {
    if (!StringUtils.hasText(settings.getSerializerValueWriterClassName())) {
        Log logger = (log != null ? log : LogFactory.getLog(clazz));

        String name = clazz.getName();
        if (settings.getInputAsJson()) {
            name = NoOpValueWriter.class.getName();
            if (logger.isDebugEnabled()) {
                logger.debug(String.format("Elasticsearch input marked as JSON; bypassing serialization through [%s] instead of [%s]", name, clazz));
            }
        }
        settings.setProperty(ConfigurationOptions.ES_SERIALIZATION_WRITER_VALUE_CLASS, name);
        if (logger.isDebugEnabled()) {
            logger.debug(String.format("Using pre-defined writer serializer [%s] as default", settings.getSerializerValueWriterClassName()));
        }
        return true;
    }

    return false;
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 21, Source: InitializationUtils.java

Example 15: discoverNodes

import org.elasticsearch.hadoop.util.StringUtils; // import the required package/class
@SuppressWarnings({ "rawtypes", "unchecked" })
public List<String> discoverNodes() {
    String endpoint = "_nodes/transport";
    Map<String, Map> nodes = (Map<String, Map>) get(endpoint, "nodes");

    List<String> hosts = new ArrayList<String>(nodes.size());

    for (Map value : nodes.values()) {
        String inet = (String) value.get("http_address");
        if (StringUtils.hasText(inet)) {
            hosts.add(StringUtils.parseIpAddress(inet).toString());
        }
    }

    return hosts;
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 17, Source: RestClient.java


Note: The org.elasticsearch.hadoop.util.StringUtils class examples in this article were compiled by 纯净天空 from GitHub, MSDocs, and other open-source code and documentation platforms. The code snippets are selected from open-source projects contributed by many developers; copyright of the source code remains with the original authors. For distribution and use, please refer to the corresponding project's license. Do not reproduce without permission.