

Java JoinedFlatTable Class Code Examples

This article collects typical usages of the Java class org.apache.kylin.job.JoinedFlatTable, drawn from open-source projects. If you are wondering what the JoinedFlatTable class does, how to use it, or where to find examples of it, the selected code examples below should help.


The JoinedFlatTable class belongs to the org.apache.kylin.job package. The following sections present 15 code examples of the class, sorted by popularity by default.

Example 1: addStepPhase1_CreateFlatTable

import org.apache.kylin.job.JoinedFlatTable; // import the required class
@Override
public void addStepPhase1_CreateFlatTable(DefaultChainedExecutable jobFlow) {
    final String cubeName = CubingExecutableUtil.getCubeName(jobFlow.getParams());
    final KylinConfig cubeConfig = CubeManager.getInstance(KylinConfig.getInstanceFromEnv()).getCube(cubeName)
            .getConfig();
    final String hiveInitStatements = JoinedFlatTable.generateHiveInitStatements(flatTableDatabase);

    // create flat table first
    addStepPhase1_DoCreateFlatTable(jobFlow);

    // then count and redistribute
    if (cubeConfig.isHiveRedistributeEnabled()) {
        jobFlow.addTask(createRedistributeFlatHiveTableStep(hiveInitStatements, cubeName));
    }

    // Hive-specific: materialize lookup-table views if needed
    addStepPhase1_DoMaterializeLookupTable(jobFlow);
}
 
Developer: apache, Project: kylin, Lines: 19, Source: HiveMRInput.java

Example 2: createSaveKafkaDataStep

import org.apache.kylin.job.JoinedFlatTable; // import the required class
private MapReduceExecutable createSaveKafkaDataStep(String jobId) {
    MapReduceExecutable result = new MapReduceExecutable();

    IJoinedFlatTableDesc flatHiveTableDesc = new CubeJoinedFlatTableDesc(seg);
    outputPath = JoinedFlatTable.getTableDir(flatHiveTableDesc, JobBuilderSupport.getJobWorkingDir(conf, jobId));
    result.setName("Save data from Kafka");
    result.setMapReduceJobClass(KafkaFlatTableJob.class);
    JobBuilderSupport jobBuilderSupport = new JobBuilderSupport(seg, "system");
    StringBuilder cmd = new StringBuilder();
    jobBuilderSupport.appendMapReduceParameters(cmd);
    JobBuilderSupport.appendExecCmdParameters(cmd, BatchConstants.ARG_CUBE_NAME, seg.getRealization().getName());
    JobBuilderSupport.appendExecCmdParameters(cmd, BatchConstants.ARG_OUTPUT, outputPath);
    JobBuilderSupport.appendExecCmdParameters(cmd, BatchConstants.ARG_SEGMENT_ID, seg.getUuid());
    JobBuilderSupport.appendExecCmdParameters(cmd, BatchConstants.ARG_JOB_NAME, "Kylin_Save_Kafka_Data_" + seg.getRealization().getName() + "_Step");

    result.setMapReduceParams(cmd.toString());
    return result;
}
 
Developer: apache, Project: kylin, Lines: 19, Source: KafkaMRInput.java

Example 3: getSql

import org.apache.kylin.job.JoinedFlatTable; // import the required class
/**
 * Get the flat-table SQL of a cube
 *
 * @param cubeName Cube Name
 * @return response containing the generated SQL
 */
@RequestMapping(value = "/{cubeName}/sql", method = { RequestMethod.GET }, produces = { "application/json" })
@ResponseBody
public GeneralResponse getSql(@PathVariable String cubeName) {
    CubeInstance cube = cubeService.getCubeManager().getCube(cubeName);
    if (cube == null) {
        throw new InternalErrorException("Cannot find cube " + cubeName);
    }
    IJoinedFlatTableDesc flatTableDesc = EngineFactory.getJoinedFlatTableDesc(cube.getDescriptor());
    String sql = JoinedFlatTable.generateSelectDataStatement(flatTableDesc);

    GeneralResponse response = new GeneralResponse();
    response.setProperty("sql", sql);

    return response;
}
 
Developer: apache, Project: kylin, Lines: 23, Source: CubeController.java

Example 4: addStepPhase1_DoCreateFlatTable

import org.apache.kylin.job.JoinedFlatTable; // import the required class
protected void addStepPhase1_DoCreateFlatTable(DefaultChainedExecutable jobFlow) {
    final String cubeName = CubingExecutableUtil.getCubeName(jobFlow.getParams());
    final String hiveInitStatements = JoinedFlatTable.generateHiveInitStatements(flatTableDatabase);
    final String jobWorkingDir = getJobWorkingDir(jobFlow);

    jobFlow.addTask(createFlatHiveTableStep(hiveInitStatements, jobWorkingDir, cubeName));
}
 
Developer: apache, Project: kylin, Lines: 8, Source: HiveMRInput.java

Example 5: addStepPhase1_DoMaterializeLookupTable

import org.apache.kylin.job.JoinedFlatTable; // import the required class
protected void addStepPhase1_DoMaterializeLookupTable(DefaultChainedExecutable jobFlow) {
    final String hiveInitStatements = JoinedFlatTable.generateHiveInitStatements(flatTableDatabase);
    final String jobWorkingDir = getJobWorkingDir(jobFlow);

    AbstractExecutable task = createLookupHiveViewMaterializationStep(hiveInitStatements, jobWorkingDir);
    if (task != null) {
        jobFlow.addTask(task);
    }
}
 
Developer: apache, Project: kylin, Lines: 10, Source: HiveMRInput.java

Example 6: createRedistributeFlatHiveTableStep

import org.apache.kylin.job.JoinedFlatTable; // import the required class
private AbstractExecutable createRedistributeFlatHiveTableStep(String hiveInitStatements, String cubeName) {
    RedistributeFlatHiveTableStep step = new RedistributeFlatHiveTableStep();
    step.setInitStatement(hiveInitStatements);
    step.setIntermediateTable(flatDesc.getTableName());
    step.setRedistributeDataStatement(JoinedFlatTable.generateRedistributeFlatTableStatement(flatDesc));
    CubingExecutableUtil.setCubeName(cubeName, step.getParams());
    step.setName(ExecutableConstants.STEP_NAME_REDISTRIBUTE_FLAT_HIVE_TABLE);
    return step;
}
 
Developer: apache, Project: kylin, Lines: 10, Source: HiveMRInput.java

Example 7: createFlatHiveTableStep

import org.apache.kylin.job.JoinedFlatTable; // import the required class
private AbstractExecutable createFlatHiveTableStep(String hiveInitStatements, String jobWorkingDir,
        String cubeName) {
    //from hive to hive
    final String dropTableHql = JoinedFlatTable.generateDropTableStatement(flatDesc);
    final String createTableHql = JoinedFlatTable.generateCreateTableStatement(flatDesc, jobWorkingDir);
    String insertDataHqls = JoinedFlatTable.generateInsertDataStatement(flatDesc);

    CreateFlatHiveTableStep step = new CreateFlatHiveTableStep();
    step.setInitStatement(hiveInitStatements);
    step.setCreateTableStatement(dropTableHql + createTableHql + insertDataHqls);
    CubingExecutableUtil.setCubeName(cubeName, step.getParams());
    step.setName(ExecutableConstants.STEP_NAME_CREATE_FLAT_HIVE_TABLE);
    return step;
}
 
Developer: apache, Project: kylin, Lines: 15, Source: HiveMRInput.java

Example 8: addStepPhase4_Cleanup

import org.apache.kylin.job.JoinedFlatTable; // import the required class
@Override
public void addStepPhase4_Cleanup(DefaultChainedExecutable jobFlow) {
    final String jobWorkingDir = getJobWorkingDir(jobFlow);

    GarbageCollectionStep step = new GarbageCollectionStep();
    step.setName(ExecutableConstants.STEP_NAME_HIVE_CLEANUP);
    step.setIntermediateTableIdentity(getIntermediateTableIdentity());
    step.setExternalDataPath(JoinedFlatTable.getTableDir(flatDesc, jobWorkingDir));
    step.setHiveViewIntermediateTableIdentities(hiveViewIntermediateTables);
    jobFlow.addTask(step);
}
 
Developer: apache, Project: kylin, Lines: 12, Source: HiveMRInput.java

Example 9: addStepPhase1_DoCreateFlatTable

import org.apache.kylin.job.JoinedFlatTable; // import the required class
@Override
protected void addStepPhase1_DoCreateFlatTable(DefaultChainedExecutable jobFlow) {
    final String cubeName = CubingExecutableUtil.getCubeName(jobFlow.getParams());
    final String hiveInitStatements = JoinedFlatTable.generateHiveInitStatements(flatTableDatabase);
    final String jobWorkingDir = getJobWorkingDir(jobFlow);

    jobFlow.addTask(createSqoopToFlatHiveStep(jobWorkingDir, cubeName));
    jobFlow.addTask(createFlatHiveTableFromFiles(hiveInitStatements, jobWorkingDir));
}
 
Developer: apache, Project: kylin, Lines: 10, Source: JdbcHiveMRInput.java

Example 10: createFlatHiveTableFromFiles

import org.apache.kylin.job.JoinedFlatTable; // import the required class
private AbstractExecutable createFlatHiveTableFromFiles(String hiveInitStatements, String jobWorkingDir) {
    final String dropTableHql = JoinedFlatTable.generateDropTableStatement(flatDesc);
    String filedDelimiter = getConfig().getJdbcSourceFieldDelimiter();
    // Sqoop does not yet support exporting SEQUENCEFILE to Hive; see SQOOP-869
    final String createTableHql = JoinedFlatTable.generateCreateTableStatement(flatDesc, jobWorkingDir,
            "TEXTFILE", filedDelimiter);

    HiveCmdStep step = new HiveCmdStep();
    step.setCmd(hiveInitStatements + dropTableHql + createTableHql);
    step.setName(ExecutableConstants.STEP_NAME_CREATE_FLAT_HIVE_TABLE);
    return step;
}
 
Developer: apache, Project: kylin, Lines: 13, Source: JdbcHiveMRInput.java

Example 11: configureJob

import org.apache.kylin.job.JoinedFlatTable; // import the required class
@Override
public void configureJob(Job job) {
    job.setInputFormatClass(SequenceFileInputFormat.class);
    String jobId = job.getConfiguration().get(BatchConstants.ARG_CUBING_JOB_ID);
    IJoinedFlatTableDesc flatHiveTableDesc = new CubeJoinedFlatTableDesc(cubeSegment);
    String inputPath = JoinedFlatTable.getTableDir(flatHiveTableDesc, JobBuilderSupport.getJobWorkingDir(conf, jobId));
    try {
        FileInputFormat.addInputPath(job, new Path(inputPath));
    } catch (IOException e) {
        throw new IllegalStateException(e);
    }
}
 
Developer: apache, Project: kylin, Lines: 13, Source: KafkaMRInput.java

Example 12: testGenCreateTableDDL

import org.apache.kylin.job.JoinedFlatTable; // import the required class
@Test
public void testGenCreateTableDDL() {
    String ddl = JoinedFlatTable.generateCreateTableStatement(intermediateTableDesc, "/tmp", fakeJobUUID);
    System.out.println(ddl);

    System.out.println("The length for the ddl is " + ddl.length());
}
 
Developer: KylinOLAP, Project: Kylin, Lines: 8, Source: JoinedFlatTableTest.java

Example 13: testGenerateInsertSql

import org.apache.kylin.job.JoinedFlatTable; // import the required class
@Test
public void testGenerateInsertSql() throws IOException {
    String sqls = JoinedFlatTable.generateInsertDataStatement(intermediateTableDesc, fakeJobUUID, new JobEngineConfig(KylinConfig.getInstanceFromEnv()));
    System.out.println(sqls);

    int length = sqls.length();
    assertEquals(1155, length);
}
 
Developer: KylinOLAP, Project: Kylin, Lines: 9, Source: JoinedFlatTableTest.java

Example 14: getSql

import org.apache.kylin.job.JoinedFlatTable; // import the required class
/**
 * Get the Hive SQL of a cube segment
 *
 * @param cubeName Cube Name
 * @param segmentName Segment Name
 * @return response containing the generated SQL
 */
@RequestMapping(value = "/{cubeName}/segs/{segmentName}/sql", method = {RequestMethod.GET})
@ResponseBody
public GeneralResponse getSql(@PathVariable String cubeName, @PathVariable String segmentName) {
    CubeInstance cube = cubeService.getCubeManager().getCube(cubeName);
    CubeDesc cubeDesc = cube.getDescriptor();
    CubeSegment cubeSegment = cube.getSegment(segmentName, SegmentStatusEnum.READY);
    CubeJoinedFlatTableDesc flatTableDesc = new CubeJoinedFlatTableDesc(cubeDesc, cubeSegment);
    String sql = JoinedFlatTable.generateSelectDataStatement(flatTableDesc);

    GeneralResponse response = new GeneralResponse();
    response.setProperty("sql", sql);

    return response;
}
 
Developer: KylinOLAP, Project: Kylin, Lines: 23, Source: CubeController.java

Example 15: createSqoopToFlatHiveStep

import org.apache.kylin.job.JoinedFlatTable; // import the required class
private AbstractExecutable createSqoopToFlatHiveStep(String jobWorkingDir, String cubeName) {
    KylinConfig config = getConfig();
    PartitionDesc partitionDesc = flatDesc.getDataModel().getPartitionDesc();
    String partCol = null;
    String partitionString = null;

    if (partitionDesc.isPartitioned()) {
        partCol = partitionDesc.getPartitionDateColumn();//tablename.colname
        partitionString = partitionDesc.getPartitionConditionBuilder().buildDateRangeCondition(partitionDesc,
                flatDesc.getSegment(), flatDesc.getSegRange());
    }

    String splitTable;
    String splitColumn;
    String splitDatabase;
    TblColRef splitColRef = determineSplitColumn();
    splitTable = splitColRef.getTableRef().getTableName();
    splitColumn = splitColRef.getName();
    splitDatabase = splitColRef.getColumnDesc().getTable().getDatabase();

    //using sqoop to extract data from jdbc source and dump them to hive
    String selectSql = JoinedFlatTable.generateSelectDataStatement(flatDesc, true, new String[] { partCol });
    String hiveTable = flatDesc.getTableName();
    String connectionUrl = config.getJdbcSourceConnectionUrl();
    String driverClass = config.getJdbcSourceDriver();
    String jdbcUser = config.getJdbcSourceUser();
    String jdbcPass = config.getJdbcSourcePass();
    String sqoopHome = config.getSqoopHome();
    String filedDelimiter = config.getJdbcSourceFieldDelimiter();
    int mapperNum = config.getSqoopMapperNum();

    String bquery = String.format("SELECT min(%s), max(%s) FROM %s.%s", splitColumn, splitColumn, splitDatabase,
            splitTable);
    if (partitionString != null) {
        bquery += " WHERE " + partitionString;
    }

    // note: the original nested String.format(String.format(...)) was redundant
    String cmd = String.format(
            "%s/sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true "
                    + "--connect \"%s\" --driver %s --username %s --password %s --query \"%s AND \\$CONDITIONS\" "
                    + "--target-dir %s/%s --split-by %s.%s --boundary-query \"%s\" --null-string '' "
                    + "--fields-terminated-by '%s' --num-mappers %d",
            sqoopHome, connectionUrl, driverClass, jdbcUser, jdbcPass, selectSql, jobWorkingDir, hiveTable,
            splitTable, splitColumn, bquery, filedDelimiter, mapperNum);
    logger.debug(String.format("sqoop cmd:%s", cmd));
    CmdStep step = new CmdStep();
    step.setCmd(cmd);
    step.setName(ExecutableConstants.STEP_NAME_SQOOP_TO_FLAT_HIVE_TABLE);
    return step;
}
 
Developer: apache, Project: kylin, Lines: 51, Source: JdbcHiveMRInput.java
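The command assembly in Example 15 can be exercised in isolation. The sketch below extracts just the string-building step into a standalone class; all connection values (sqoop home, JDBC URL, credentials, tables) are illustrative placeholders, not taken from any real configuration.

```java
// Standalone sketch of the sqoop import command assembly from Example 15.
// No Kylin classes are required; every value passed in is a placeholder.
public class SqoopCmdSketch {

    // Builds the sqoop import command line the same way Example 15 does:
    // $CONDITIONS is escaped as \$CONDITIONS so the shell passes it through literally.
    public static String buildCmd(String sqoopHome, String connectionUrl, String driverClass,
            String user, String pass, String selectSql, String targetDir, String hiveTable,
            String splitTable, String splitColumn, String boundaryQuery,
            String fieldDelimiter, int mapperNum) {
        return String.format(
                "%s/sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true "
                        + "--connect \"%s\" --driver %s --username %s --password %s "
                        + "--query \"%s AND \\$CONDITIONS\" "
                        + "--target-dir %s/%s --split-by %s.%s --boundary-query \"%s\" "
                        + "--null-string '' --fields-terminated-by '%s' --num-mappers %d",
                sqoopHome, connectionUrl, driverClass, user, pass, selectSql, targetDir, hiveTable,
                splitTable, splitColumn, boundaryQuery, fieldDelimiter, mapperNum);
    }

    public static void main(String[] args) {
        String cmd = buildCmd("/opt/sqoop", "jdbc:mysql://host:3306/db", "com.mysql.jdbc.Driver",
                "user", "pass", "SELECT * FROM db.fact", "/kylin/job", "flat_table",
                "fact", "id", "SELECT min(id), max(id) FROM db.fact", "|", 4);
        System.out.println(cmd);
    }
}
```

The boundary query gives sqoop the min/max of the split column up front, so it can partition the work across the requested number of mappers without scanning the source table itself.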


Note: The org.apache.kylin.job.JoinedFlatTable examples in this article were collected from open-source projects hosted on platforms such as GitHub and MSDocs. Each snippet remains under the license of its original project; consult the corresponding project's license before reusing or redistributing the code.