

Java SparkTransformExecutor Class Code Examples

This article collects typical usage examples of the Java class org.datavec.spark.transform.SparkTransformExecutor. If you are wondering what SparkTransformExecutor is for, how it is used in practice, or what real code that uses it looks like, the curated examples below should help.


The SparkTransformExecutor class belongs to the org.datavec.spark.transform package. Four code examples for the class are shown below, ordered by popularity.

Example 1: call

import org.datavec.spark.transform.SparkTransformExecutor; // import the required package/class
@Override
public List<Writable> call(List<Writable> v1) throws Exception {
    if (SparkTransformExecutor.isTryCatch()) {
        try {
            return transform.map(v1);
        } catch (Exception e) {
            log.warn("Error occurred " + e + " on record " + v1);
            return new ArrayList<>();
        }
    }
    return transform.map(v1);
}
 
Developer: deeplearning4j | Project: DataVec | Lines: 13 | Source: SparkTransformFunction.java
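The point of the `isTryCatch()` branch above is fault tolerance: when enabled, a single malformed record produces an empty result instead of failing the whole Spark job. The idea can be sketched without Spark; `safeMap` and the record type here are illustrative helpers, not part of the DataVec API (the real code also logs a warning and emits an empty row rather than dropping the record):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class SafeMapSketch {
    // When tolerant is true, a failing record is skipped instead of
    // aborting the whole transformation; mirrors the isTryCatch() branch.
    static <T> List<T> safeMap(List<T> input, Function<T, T> fn, boolean tolerant) {
        List<T> out = new ArrayList<>();
        for (T record : input) {
            if (tolerant) {
                try {
                    out.add(fn.apply(record));
                } catch (Exception e) {
                    // bad record: skip it (a real pipeline would log the failure)
                }
            } else {
                out.add(fn.apply(record)); // strict mode: any failure propagates
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> in = List.of("1", "oops", "3");
        // "oops" throws NumberFormatException and is dropped in tolerant mode
        System.out.println(safeMap(in, s -> String.valueOf(Integer.parseInt(s) * 2), true)); // prints [2, 6]
    }
}
```

In strict mode the same input would throw, which is why the executor gates the try/catch behind a global flag rather than always swallowing errors.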

Example 2: testConvertToSequenceLength1

import org.datavec.spark.transform.SparkTransformExecutor; // import the required package/class
@Test
public void testConvertToSequenceLength1(){

    Schema s = new Schema.Builder()
            .addColumnsString("string")
            .addColumnLong("long")
            .build();

    List<List<Writable>> allExamples = Arrays.asList(
            Arrays.<Writable>asList(new Text("a"), new LongWritable(0)),
            Arrays.<Writable>asList(new Text("b"), new LongWritable(1)),
            Arrays.<Writable>asList(new Text("c"), new LongWritable(2)));

    TransformProcess tp = new TransformProcess.Builder(s)
            .convertToSequence()
            .build();

    JavaRDD<List<Writable>> rdd = sc.parallelize(allExamples);

    JavaRDD<List<List<Writable>>> out = SparkTransformExecutor.executeToSequence(rdd, tp);

    List<List<List<Writable>>> out2 = out.collect();

    assertEquals(3, out2.size());

    for( int i=0; i<3; i++ ){
        assertTrue(out2.contains(Collections.singletonList(allExamples.get(i))));
    }
}
 
Developer: deeplearning4j | Project: DataVec | Lines: 30 | Source: TestConvertToSequence.java
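With no grouping key, `convertToSequence()` simply wraps every record in a sequence of length 1, which is exactly what the three `singletonList` assertions above check. A minimal stand-in for that behaviour (no Spark; the `toUnitSequences` helper is illustrative, not DataVec API):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ToSequenceSketch {
    // convertToSequence() with no grouping keys: each record becomes
    // its own one-element sequence.
    static <T> List<List<T>> toUnitSequences(List<T> records) {
        List<List<T>> sequences = new ArrayList<>();
        for (T record : records) {
            sequences.add(Collections.singletonList(record));
        }
        return sequences;
    }

    public static void main(String[] args) {
        System.out.println(toUnitSequences(List.of("a", "b", "c"))); // prints [[a], [b], [c]]
    }
}
```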

Example 3: testCalculateSortedRank

import org.datavec.spark.transform.SparkTransformExecutor; // import the required package/class
@Test
public void testCalculateSortedRank() {

    List<List<Writable>> data = new ArrayList<>();
    data.add(Arrays.asList((Writable) new Text("0"), new DoubleWritable(0.0)));
    data.add(Arrays.asList((Writable) new Text("3"), new DoubleWritable(0.3)));
    data.add(Arrays.asList((Writable) new Text("2"), new DoubleWritable(0.2)));
    data.add(Arrays.asList((Writable) new Text("1"), new DoubleWritable(0.1)));

    JavaRDD<List<Writable>> rdd = sc.parallelize(data);

    Schema schema = new Schema.Builder().addColumnsString("TextCol").addColumnDouble("DoubleCol").build();

    TransformProcess tp = new TransformProcess.Builder(schema)
                    .calculateSortedRank("rank", "DoubleCol", new DoubleWritableComparator()).build();

    Schema outSchema = tp.getFinalSchema();
    assertEquals(3, outSchema.numColumns());
    assertEquals(Arrays.asList("TextCol", "DoubleCol", "rank"), outSchema.getColumnNames());
    assertEquals(Arrays.asList(ColumnType.String, ColumnType.Double, ColumnType.Long), outSchema.getColumnTypes());

    JavaRDD<List<Writable>> out = SparkTransformExecutor.execute(rdd, tp);

    List<List<Writable>> collected = out.collect();
    assertEquals(4, collected.size());
    for (int i = 0; i < 4; i++)
        assertEquals(3, collected.get(i).size());

    for (List<Writable> example : collected) {
        int exampleNum = example.get(0).toInt();
        int rank = example.get(2).toInt();
        assertEquals(exampleNum, rank);
    }
}
 
Developer: deeplearning4j | Project: DataVec | Lines: 35 | Source: TestCalculateSortedRank.java
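`calculateSortedRank` sorts the data by one column and appends the 0-based rank as a new long column, so each record's position after sorting becomes part of the record itself; that is why the test can compare the text column against the rank column. A plain-Java sketch of the idea, using illustrative `double[] {id, value}` rows instead of DataVec `Writable`s:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class SortedRankSketch {
    // Sort by the value column (index 1) and append the rank as a third column,
    // mirroring calculateSortedRank("rank", "DoubleCol", comparator) above.
    static List<double[]> withRank(List<double[]> records) {
        List<double[]> sorted = new ArrayList<>(records);
        sorted.sort(Comparator.comparingDouble(r -> r[1]));
        List<double[]> out = new ArrayList<>();
        for (int rank = 0; rank < sorted.size(); rank++) {
            double[] r = sorted.get(rank);
            out.add(new double[]{r[0], r[1], rank}); // {id, value, rank}
        }
        return out;
    }
}
```

With the same input as the test ({0, 0.0}, {3, 0.3}, {2, 0.2}, {1, 0.1}), each row's id ends up equal to its rank, which is the property the loop at the end of the test asserts.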

Example 4: testConvertToSequenceCompoundKey

import org.datavec.spark.transform.SparkTransformExecutor; // import the required package/class
@Test
public void testConvertToSequenceCompoundKey() {

    Schema s = new Schema.Builder().addColumnsString("key1", "key2").addColumnLong("time").build();

    List<List<Writable>> allExamples =
                    Arrays.asList(Arrays.<Writable>asList(new Text("k1a"), new Text("k2a"), new LongWritable(10)),
                                    Arrays.<Writable>asList(new Text("k1b"), new Text("k2b"), new LongWritable(10)),
                                    Arrays.<Writable>asList(new Text("k1a"), new Text("k2a"),
                                                    new LongWritable(-10)),
                                    Arrays.<Writable>asList(new Text("k1b"), new Text("k2b"), new LongWritable(5)),
                                    Arrays.<Writable>asList(new Text("k1a"), new Text("k2a"), new LongWritable(0)));

    TransformProcess tp = new TransformProcess.Builder(s)
                    .convertToSequence(Arrays.asList("key1", "key2"), new NumericalColumnComparator("time"))
                    .build();

    JavaRDD<List<Writable>> rdd = sc.parallelize(allExamples);

    List<List<List<Writable>>> out = SparkTransformExecutor.executeToSequence(rdd, tp).collect();

    assertEquals(2, out.size());
    List<List<Writable>> seq0;
    List<List<Writable>> seq1;
    if (out.get(0).size() == 3) {
        seq0 = out.get(0);
        seq1 = out.get(1);
    } else {
        seq0 = out.get(1);
        seq1 = out.get(0);
    }

    List<List<Writable>> expSeq0 = Arrays.asList(
                    Arrays.<Writable>asList(new Text("k1a"), new Text("k2a"), new LongWritable(-10)),
                    Arrays.<Writable>asList(new Text("k1a"), new Text("k2a"), new LongWritable(0)),
                    Arrays.<Writable>asList(new Text("k1a"), new Text("k2a"), new LongWritable(10)));

    List<List<Writable>> expSeq1 = Arrays.asList(
                    Arrays.<Writable>asList(new Text("k1b"), new Text("k2b"), new LongWritable(5)),
                    Arrays.<Writable>asList(new Text("k1b"), new Text("k2b"), new LongWritable(10)));

    assertEquals(expSeq0, seq0);
    assertEquals(expSeq1, seq1);
}
 
Developer: deeplearning4j | Project: DataVec | Lines: 45 | Source: TestConvertToSequence.java
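With grouping keys and a comparator, `convertToSequence` groups records by the compound key ("key1", "key2") and sorts each resulting sequence by the comparator column ("time" here), which is why the test expects two sequences in ascending time order. A hypothetical plain-Java stand-in, with a toy `Row` record instead of DataVec `Writable`s:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class GroupToSequenceSketch {
    // Toy stand-in for a DataVec record with a two-part key and a time column.
    record Row(String key1, String key2, long time) {}

    // Group rows by the compound key, then sort each group by time,
    // mirroring convertToSequence(keys, new NumericalColumnComparator("time")).
    static Map<String, List<Row>> toSequences(List<Row> rows) {
        Map<String, List<Row>> groups = new LinkedHashMap<>();
        for (Row r : rows) {
            groups.computeIfAbsent(r.key1() + "|" + r.key2(), k -> new ArrayList<>()).add(r);
        }
        for (List<Row> seq : groups.values()) {
            seq.sort(Comparator.comparingLong(Row::time));
        }
        return groups;
    }
}
```

Fed the same five rows as the test, this yields a three-row sequence for (k1a, k2a) sorted as -10, 0, 10 and a two-row sequence for (k1b, k2b) sorted as 5, 10.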


Note: the org.datavec.spark.transform.SparkTransformExecutor examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from community open-source projects; copyright remains with the original authors, and any distribution or use should follow the corresponding project's license. Do not reproduce without permission.