

Java Keys Class Code Examples

This article collects typical usage examples of the Java class org.apache.beam.sdk.transforms.Keys. If you are wondering what the Keys class does, how to use it, or what real-world examples look like, the curated class examples below may help.


The Keys class belongs to the org.apache.beam.sdk.transforms package. Five code examples of the Keys class are shown below, sorted by popularity by default.
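Conceptually, `Keys.<K>create()` turns a `PCollection<KV<K, V>>` into a `PCollection<K>` by keeping only the key of each element (duplicates included). As a plain-Java analogy of that per-element behavior, without the Beam SDK, a sketch might look like this (the class name `KeysSketch` and helper `keys` are illustrative, not part of Beam):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class KeysSketch {
    // Plain-Java analogy for Beam's Keys.<K>create(): map each
    // key-value pair to its key, preserving duplicates and order.
    static <K, V> List<K> keys(List<Map.Entry<K, V>> pairs) {
        return pairs.stream()
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = List.of(
                new SimpleEntry<>("a", 1),
                new SimpleEntry<>("b", 2),
                new SimpleEntry<>("a", 3));
        System.out.println(keys(pairs)); // prints [a, b, a]
    }
}
```

Unlike this sequential sketch, the real Beam transform runs element-wise and in parallel across the pipeline's workers, but the input-to-output mapping is the same.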

Example 1: testTfIdf

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
/** Test that the example runs. */
@Test
@Category(ValidatesRunner.class)
public void testTfIdf() throws Exception {

  pipeline.getCoderRegistry().registerCoderForClass(URI.class, StringDelegateCoder.of(URI.class));

  PCollection<KV<String, KV<URI, Double>>> wordToUriAndTfIdf = pipeline
      .apply(Create.of(
          KV.of(new URI("x"), "a b c d"),
          KV.of(new URI("y"), "a b c"),
          KV.of(new URI("z"), "a m n")))
      .apply(new TfIdf.ComputeTfIdf());

  PCollection<String> words = wordToUriAndTfIdf
      .apply(Keys.<String>create())
      .apply(Distinct.<String>create());

  PAssert.that(words).containsInAnyOrder(Arrays.asList("a", "m", "n", "b", "c", "d"));

  pipeline.run().waitUntilFinish();
}
 
Developer: apache, Project: beam, Source: TfIdfTest.java

Example 2: traverseMultipleTimesThrows

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
@Test
public void traverseMultipleTimesThrows() {
  p.apply(
          Create.of(KV.of(1, (Void) null), KV.of(2, (Void) null), KV.of(3, (Void) null))
              .withCoder(KvCoder.of(VarIntCoder.of(), VoidCoder.of())))
      .apply(GroupByKey.<Integer, Void>create())
      .apply(Keys.<Integer>create());

  p.traverseTopologically(visitor);
  thrown.expect(IllegalStateException.class);
  thrown.expectMessage("already been finalized");
  thrown.expectMessage(KeyedPValueTrackingVisitor.class.getSimpleName());
  p.traverseTopologically(visitor);
}
 
Developer: apache, Project: beam, Source: KeyedPValueTrackingVisitorTest.java

Example 3: testBasic

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
/**
 * Basic use of the LazyAvroCoder with the default schema supplier.
 */
@Test
public void testBasic() {
    // Create a PCollection of simple records, and assign it to be encoded with a LazyAvroCoder.
    PCollection<IndexedRecord> a = p.apply("a", RowGeneratorIO.read().withSchema(SampleSchemas.recordSimple()));
    a.setCoder(LazyAvroCoder.of());

    // Construct a job that looks like this (a and c are collections of IndexedRecords):
    //
    // a ----> b ----> c ----> d
    // |
    // \-> b2

    // Trigger transformations that require the data to be shuffled, then run the pipeline.
    PCollection<KV<IndexedRecord, Long>> b = a.apply("b", Count.<IndexedRecord>perElement());
    PCollection<IndexedRecord> c = b.apply("c", Keys.<IndexedRecord>create());
    c.setCoder(LazyAvroCoder.of());
    PCollection<KV<IndexedRecord, Long>> d = c.apply("d", Count.<IndexedRecord>perElement());

    PCollection<KV<IndexedRecord, Long>> b2 = a.apply("b2", Count.<IndexedRecord>perElement());

    p.run().waitUntilFinish();

    // No exception should have occurred.

    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), hasSize(2));
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(),
            contains(SampleSchemas.recordSimple(), SampleSchemas.recordSimple()));

    // Check that the reset cleans the supplier.
    LazyAvroCoder.resetSchemaSupplier();
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), emptyIterable());
}
 
Developer: Talend, Project: components, Source: LazyAvroCoderTest.java

Example 4: testBasicReuse

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
/**
 * Exactly the same test as {@link #testBasic()} but reusing the LazyAvroCoder.
 */
@Test
public void testBasicReuse() {
    LazyAvroCoder lac = LazyAvroCoder.of();

    // Create a PCollection of simple records, and assign it to be encoded with a LazyAvroCoder.
    PCollection<IndexedRecord> a = p.apply("a", RowGeneratorIO.read().withSchema(SampleSchemas.recordSimple()));
    a.setCoder(lac);

    // Construct a job that looks like this (a and c are collections of IndexedRecords):
    //
    // a ----> b ----> c ----> d
    // |
    // \-> b2

    // Trigger transformations that require the data to be shuffled, then run the pipeline.
    PCollection<KV<IndexedRecord, Long>> b = a.apply("b", Count.<IndexedRecord>perElement());
    PCollection<IndexedRecord> c = b.apply("c", Keys.<IndexedRecord>create());
    c.setCoder(lac);
    PCollection<KV<IndexedRecord, Long>> d = c.apply("d", Count.<IndexedRecord>perElement());

    PCollection<KV<IndexedRecord, Long>> b2 = a.apply("b2", Count.<IndexedRecord>perElement());

    p.run().waitUntilFinish();

    // No exception should have occurred.

    // Only one schema was registered.
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), hasSize(1));
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), contains(SampleSchemas.recordSimple()));
}
 
Developer: Talend, Project: components, Source: LazyAvroCoderTest.java

Example 5: expand

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
@Override
public PCollection<T> expand(PCollection<T> input) {
  return input
      .apply("Break fusion mapper", ParDo.of(new DummyMapFn<T>()))
      .apply(GroupByKey.<T, Integer>create())
      .apply(Keys.<T>create());
}
 
Developer: googlegenomics, Project: dataflow-java, Source: BreakFusionTransform.java


Note: The org.apache.beam.sdk.transforms.Keys class examples in this article were compiled from open-source code hosted on platforms such as GitHub and MSDocs. The code snippets are selected from open-source projects contributed by their respective authors; copyright remains with the original authors. Refer to each project's license before using or redistributing this code, and do not reproduce this article without permission.