

Java HsqldbTestServer Class Code Examples

This article collects typical usage examples of the Java class HsqldbTestServer from the com.cloudera.sqoop.testutil package. If you are wondering what the HsqldbTestServer class is for and how to use it, the curated examples below may help.


The HsqldbTestServer class belongs to the com.cloudera.sqoop.testutil package. The sections below show 15 code examples of the class, sorted by popularity by default.

Example 1: testUserMapping

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
public void testUserMapping() throws Exception {
  String[] args = {
      "--map-column-hive", "id=STRING,value=INTEGER",
  };
  Configuration conf = new Configuration();
  SqoopOptions options =
    new ImportTool().parseArguments(args, null, null, false);
  TableDefWriter writer = new TableDefWriter(options,
      null, HsqldbTestServer.getTableName(), "outputTable", conf, false);

  Map<String, Integer> colTypes = new SqlTypeMap<String, Integer>();
  colTypes.put("id", Types.INTEGER);
  colTypes.put("value", Types.VARCHAR);
  writer.setColumnTypes(colTypes);

  String createTable = writer.getCreateTableStmt();

  assertNotNull(createTable);

  assertTrue(createTable.contains("`id` STRING"));
  assertTrue(createTable.contains("`value` INTEGER"));

  assertFalse(createTable.contains("`id` INTEGER"));
  assertFalse(createTable.contains("`value` STRING"));
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 26, Source: TestTableDefWriter.java
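The `--map-column-hive` value exercised above is a comma-separated list of `column=type` overrides. As a minimal, self-contained sketch of how such a string can be split into a column-to-type map (the class and method names below are hypothetical, not Sqoop's API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper: splits "id=STRING,value=INTEGER" into a map of
// column name -> Hive type override (insertion order preserved).
public class HiveColumnMapParser {
    public static Map<String, String> parse(String mapping) {
        Map<String, String> result = new LinkedHashMap<>();
        for (String pair : mapping.split(",")) {
            String[] kv = pair.split("=", 2);
            if (kv.length != 2 || kv[0].trim().isEmpty()) {
                throw new IllegalArgumentException("Malformed mapping: " + pair);
            }
            result.put(kv[0].trim(), kv[1].trim());
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(parse("id=STRING,value=INTEGER"));
    }
}
```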

Example 2: testUserMappingFailWhenCantBeApplied

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
public void testUserMappingFailWhenCantBeApplied() throws Exception {
  String[] args = {
      "--map-column-hive", "id=STRING,value=INTEGER",
  };
  Configuration conf = new Configuration();
  SqoopOptions options =
    new ImportTool().parseArguments(args, null, null, false);
  TableDefWriter writer = new TableDefWriter(options,
      null, HsqldbTestServer.getTableName(), "outputTable", conf, false);

  Map<String, Integer> colTypes = new SqlTypeMap<String, Integer>();
  colTypes.put("id", Types.INTEGER);
  writer.setColumnTypes(colTypes);

  try {
    writer.getCreateTableStmt();
    fail("Expected failure on non applied mapping.");
  } catch(IllegalArgumentException iae) {
    // Expected, ok
  }
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 22, Source: TestTableDefWriter.java

Example 3: testHiveDatabase

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
public void testHiveDatabase() throws Exception {
  String[] args = {
      "--hive-database", "db",
  };
  Configuration conf = new Configuration();
  SqoopOptions options =
    new ImportTool().parseArguments(args, null, null, false);
  TableDefWriter writer = new TableDefWriter(options,
      null, HsqldbTestServer.getTableName(), "outputTable", conf, false);

  Map<String, Integer> colTypes = new SqlTypeMap<String, Integer>();
  writer.setColumnTypes(colTypes);

  String createTable = writer.getCreateTableStmt();
  assertNotNull(createTable);
  assertTrue(createTable.contains("`db`.`outputTable`"));

  String loadStmt = writer.getLoadDataStmt();
  assertNotNull(loadStmt);
  assertTrue(loadStmt.contains("`db`.`outputTable`"));
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 22, Source: TestTableDefWriter.java
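The assertions above expect the target table to be qualified with the Hive database and both identifiers to be quoted with backticks. A simplified illustration of that qualification (the helper below is hypothetical, not part of TableDefWriter):

```java
// Hypothetical sketch: qualify a Hive table with its database, quoting
// both identifiers with backticks as the generated DDL above does.
public class HiveIdentifierDemo {
    public static String qualify(String database, String table) {
        return "`" + database + "`.`" + table + "`";
    }

    public static void main(String[] args) {
        System.out.println(qualify("db", "outputTable"));
    }
}
```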

Example 4: getOutputArgv

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * Create the argv to pass to Sqoop.
 *
 * @return the argv as an array of strings.
 */
protected String[] getOutputArgv(boolean includeHadoopFlags,
        String[] extraArgs) {
  ArrayList<String> args = new ArrayList<String>();

  if (includeHadoopFlags) {
    CommonArgs.addHadoopFlags(args);
  }

  args.add("--table");
  args.add(getTableName());
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());
  args.add("--warehouse-dir");
  args.add(getWarehouseDir());
  args.add("--m");
  args.add("1");
  args.add("--split-by");
  args.add("INTFIELD1");
  args.add("--as-parquetfile");
  if (extraArgs != null) {
    args.addAll(Arrays.asList(extraArgs));
  }

  return args.toArray(new String[args.size()]);
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 31, Source: TestParquetImport.java
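For orientation, the argv assembled above corresponds to an ordinary `sqoop import` command line. A small sketch that joins such a flag list back into the equivalent invocation (the table name, connect URL, and warehouse directory below are placeholder values, not the test server's real ones):

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: render an argv list as the equivalent shell command.
public class SqoopCommandDemo {
    public static String toCommandLine(List<String> args) {
        return "sqoop import " + String.join(" ", args);
    }

    public static void main(String[] args) {
        List<String> argv = Arrays.asList(
            "--table", "IMPORT_TABLE",                      // placeholder table
            "--connect", "jdbc:hsqldb:hsql://localhost/db", // placeholder URL
            "--warehouse-dir", "/tmp/warehouse",            // placeholder dir
            "--m", "1",
            "--split-by", "INTFIELD1",
            "--as-parquetfile");
        System.out.println(toCommandLine(argv));
    }
}
```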

Example 5: getOutputQueryArgv

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
protected String[] getOutputQueryArgv(boolean includeHadoopFlags, String[] extraArgs) {
  ArrayList<String> args = new ArrayList<String>();

  if (includeHadoopFlags) {
    CommonArgs.addHadoopFlags(args);
  }

  args.add("--query");
  args.add("SELECT * FROM " + getTableName() + " WHERE $CONDITIONS");
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());
  args.add("--target-dir");
  args.add(getWarehouseDir() + "/" + getTableName());
  args.add("--m");
  args.add("1");
  args.add("--split-by");
  args.add("INTFIELD1");
  args.add("--as-parquetfile");
  if (extraArgs != null) {
    args.addAll(Arrays.asList(extraArgs));
  }

  return args.toArray(new String[args.size()]);
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 25, Source: TestParquetImport.java

Example 6: getOutputArgv

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * Create the argv to pass to Sqoop.
 *
 * @return the argv as an array of strings.
 */
protected String[] getOutputArgv(boolean includeHadoopFlags,
        String[] extraArgs) {
  ArrayList<String> args = new ArrayList<String>();

  if (includeHadoopFlags) {
    CommonArgs.addHadoopFlags(args);
  }
  args.add("-m");
  args.add("1");
  args.add("--table");
  args.add(getTableName());
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());
  args.add("--warehouse-dir");
  args.add(getWarehouseDir());
  args.add("--split-by");
  args.add("INTFIELD1");
  args.add("--as-avrodatafile");
  if (extraArgs != null) {
    args.addAll(Arrays.asList(extraArgs));
  }

  return args.toArray(new String[0]);
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 30, Source: TestAvroImport.java

Example 7: getOutputArgv

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * Create the argv to pass to Sqoop.
 *
 * @return the argv as an array of strings.
 */
protected ArrayList<String> getOutputArgv(boolean includeHadoopFlags) {
  ArrayList<String> args = new ArrayList<String>();

  if (includeHadoopFlags) {
    CommonArgs.addHadoopFlags(args);
  }

  args.add("--table");
  args.add(HsqldbTestServer.getTableName());
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());
  args.add("--split-by");
  args.add("INTFIELD1");
  args.add("--as-sequencefile");

  return args;
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 23, Source: TestTargetDir.java

Example 8: testSetPackageName

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * Test that we can generate code with a custom class name that includes a
 * package.
 */
@Test
public void testSetPackageName() {

  // Set the option strings in an "argv" to redirect our srcdir and bindir.
  String [] argv = {
    "--bindir",
    JAR_GEN_DIR,
    "--outdir",
    CODE_GEN_DIR,
    "--package-name",
    OVERRIDE_PACKAGE_NAME,
  };

  runGenerationTest(argv, OVERRIDE_PACKAGE_NAME + "."
      + HsqldbTestServer.getTableName());
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 21, Source: TestClassWriter.java

Example 9: runFailedGenerationTest

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
private void runFailedGenerationTest(String [] argv,
    String classNameToCheck) {
  File codeGenDirFile = new File(CODE_GEN_DIR);
  File classGenDirFile = new File(JAR_GEN_DIR);

  try {
    options = new ImportTool().parseArguments(argv,
        null, options, true);
  } catch (Exception e) {
    LOG.error("Could not parse options: " + e.toString());
  }

  CompilationManager compileMgr = new CompilationManager(options);
  ClassWriter writer = new ClassWriter(options, manager,
      HsqldbTestServer.getTableName(), compileMgr);

  try {
    writer.generate();
    compileMgr.compile();
    fail("ORM class file generation succeeded when it was expected to fail");
  } catch (Exception ioe) {
    LOG.error("Got Exception from ORM generation as expected : "
      + ioe.toString());
  }
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 26, Source: TestClassWriter.java

Example 10: getOutputArgv

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * Create the argv to pass to Sqoop.
 *
 * @return the argv as an array of strings.
 */
protected String[] getOutputArgv(boolean includeHadoopFlags) {
  ArrayList<String> args = new ArrayList<String>();

  if (includeHadoopFlags) {
    CommonArgs.addHadoopFlags(args);
  }

  args.add("--table");
  args.add(HsqldbTestServer.getTableName());
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());
  args.add("--warehouse-dir");
  args.add(getWarehouseDir());
  args.add("--split-by");
  args.add("INTFIELD1");
  args.add("--as-avrodatafile");

  return args.toArray(new String[0]);
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 25, Source: TestAvroImportExportRoundtrip.java

Example 11: getOutputArgvForQuery

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * Create the argv to pass to Sqoop.
 *
 * @return the argv as an array of strings.
 */
protected String[] getOutputArgvForQuery(boolean includeHadoopFlags) {
  ArrayList<String> args = new ArrayList<String>();

  if (includeHadoopFlags) {
    CommonArgs.addHadoopFlags(args);
  }

  args.add("--query");
  args.add("select * from " + HsqldbTestServer.getTableName()
      + " where $CONDITIONS");
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());
  args.add("--target-dir");
  args.add(getWarehouseDir() + "/query_result");
  args.add("--split-by");
  args.add("INTFIELD1");
  args.add("--as-avrodatafile");

  return args.toArray(new String[0]);
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 26, Source: TestAvroImportExportRoundtrip.java
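The `$CONDITIONS` token in the free-form query above is a placeholder that Sqoop substitutes with a per-split WHERE clause at import time. A simplified illustration of the substitution (not Sqoop's actual implementation; the split predicate below is a made-up example):

```java
// Simplified sketch: replace the $CONDITIONS placeholder with a split
// predicate, as Sqoop does when parallelizing a free-form query.
public class ConditionsDemo {
    public static String applySplit(String query, String splitClause) {
        return query.replace("$CONDITIONS", splitClause);
    }

    public static void main(String[] args) {
        String query = "select * from EMPLOYEES where $CONDITIONS";
        System.out.println(applySplit(query, "INTFIELD1 >= 0 AND INTFIELD1 < 100"));
    }
}
```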

Example 12: checkFirstColumnSum

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
private void checkFirstColumnSum() throws SQLException {
  Connection conn = getConnection();

  PreparedStatement statement = conn.prepareStatement(
      "SELECT SUM(INTFIELD1) FROM " + getTableName(),
      ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
  int actualVal = 0;
  try {
    ResultSet rs = statement.executeQuery();
    try {
      rs.next();
      actualVal = rs.getInt(1);
    } finally {
      rs.close();
    }
  } finally {
    statement.close();
  }

  assertEquals("First column sum", HsqldbTestServer.getFirstColSum(),
      actualVal);
}
 
Developer ID: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 23, Source: TestAvroImportExportRoundtrip.java

Example 13: getOutputArgv

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * Create the argv to pass to Sqoop.
 *
 * @return the argv as an array of strings.
 */
protected String[] getOutputArgv(boolean includeHadoopFlags,
        String[] extraArgs) {
  ArrayList<String> args = new ArrayList<String>();

  if (includeHadoopFlags) {
    CommonArgs.addHadoopFlags(args);
  }

  args.add("--table");
  args.add(getTableName());
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());
  args.add("--warehouse-dir");
  args.add(getWarehouseDir());
  args.add("--split-by");
  args.add("INTFIELD1");
  args.add("--as-avrodatafile");
  if (extraArgs != null) {
    args.addAll(Arrays.asList(extraArgs));
  }

  return args.toArray(new String[0]);
}
 
Developer ID: unicredit, Project: zSqoop, Lines: 29, Source: TestAvroImport.java

Example 14: getCreateTableArgv

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * @return the argv to supply to a create-table only job for Hive imports.
 */
protected String [] getCreateTableArgv(boolean includeHadoopFlags,
    String [] moreArgs) {

  ArrayList<String> args = new ArrayList<String>();

  if (null != moreArgs) {
    for (String arg: moreArgs) {
      args.add(arg);
    }
  }

  args.add("--table");
  args.add(getTableName());
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());

  return args.toArray(new String[0]);
}
 
Developer ID: infinidb, Project: sqoop, Lines: 22, Source: TestHiveImport.java

Example 15: getCreateHiveTableArgs

import com.cloudera.sqoop.testutil.HsqldbTestServer; // import the required package/class
/**
 * @return the argv to supply to a ddl-executing-only job for Hive imports.
 */
protected String [] getCreateHiveTableArgs(String [] extraArgs) {
  ArrayList<String> args = new ArrayList<String>();

  args.add("--table");
  args.add(getTableName());
  args.add("--connect");
  args.add(HsqldbTestServer.getUrl());

  if (null != extraArgs) {
    for (String arg : extraArgs) {
      args.add(arg);
    }
  }

  return args.toArray(new String[0]);
}
 
Developer ID: infinidb, Project: sqoop, Lines: 20, Source: TestHiveImport.java


Note: The com.cloudera.sqoop.testutil.HsqldbTestServer class examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by many developers; copyright of the source code remains with the original authors. Consult each project's license before distributing or using the code, and do not republish without permission.