

Java Keys Class Code Examples

This article compiles typical usage examples of the Java class Keys from org.apache.beam.sdk.transforms. If you are wondering what the Keys class does, how to use it, or where to find examples of it, the curated class code examples below may help.


The Keys class belongs to the org.apache.beam.sdk.transforms package. Five code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
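Before the examples, here is a minimal, self-contained sketch of how Keys is typically used (the pipeline and values below are illustrative and are not taken from the projects that follow): Keys.<K>create() turns a PCollection<KV<K, V>> into a PCollection<K> containing only the keys, duplicates included.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.Keys;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class KeysSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

    // Build a keyed collection, then keep only the keys.
    PCollection<String> keys =
        p.apply(Create.of(KV.of("a", 1), KV.of("b", 2), KV.of("a", 3)))
         .apply(Keys.<String>create());

    // Keys does not deduplicate; "a" appears twice in the output.
    PAssert.that(keys).containsInAnyOrder("a", "b", "a");

    p.run().waitUntilFinish();
  }
}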

Example 1: testTfIdf

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
/** Test that the example runs. */
@Test
@Category(ValidatesRunner.class)
public void testTfIdf() throws Exception {

  pipeline.getCoderRegistry().registerCoderForClass(URI.class, StringDelegateCoder.of(URI.class));

  PCollection<KV<String, KV<URI, Double>>> wordToUriAndTfIdf = pipeline
      .apply(Create.of(
          KV.of(new URI("x"), "a b c d"),
          KV.of(new URI("y"), "a b c"),
          KV.of(new URI("z"), "a m n")))
      .apply(new TfIdf.ComputeTfIdf());

  PCollection<String> words = wordToUriAndTfIdf
      .apply(Keys.<String>create())
      .apply(Distinct.<String>create());

  PAssert.that(words).containsInAnyOrder(Arrays.asList("a", "m", "n", "b", "c", "d"));

  pipeline.run().waitUntilFinish();
}
 
Developer ID: apache, Project: beam, Lines of code: 23, Source file: TfIdfTest.java

Example 2: traverseMultipleTimesThrows

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
@Test
public void traverseMultipleTimesThrows() {
  p.apply(
          Create.of(KV.of(1, (Void) null), KV.of(2, (Void) null), KV.of(3, (Void) null))
              .withCoder(KvCoder.of(VarIntCoder.of(), VoidCoder.of())))
      .apply(GroupByKey.<Integer, Void>create())
      .apply(Keys.<Integer>create());

  p.traverseTopologically(visitor);
  thrown.expect(IllegalStateException.class);
  thrown.expectMessage("already been finalized");
  thrown.expectMessage(KeyedPValueTrackingVisitor.class.getSimpleName());
  p.traverseTopologically(visitor);
}
 
Developer ID: apache, Project: beam, Lines of code: 15, Source file: KeyedPValueTrackingVisitorTest.java

Example 3: testBasic

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
/**
 * Basic use of the LazyAvroCoder with the default schema supplier.
 */
@Test
public void testBasic() {
    // Create a PCollection of simple records, and assign it to be encoded with a LazyAvroCoder.
    PCollection<IndexedRecord> a = p.apply("a", RowGeneratorIO.read().withSchema(SampleSchemas.recordSimple()));
    a.setCoder(LazyAvroCoder.of());

    // Construct a job that looks like this (a and c are collections of IndexedRecords):
    //
    // a ----> b ----> c ----> d
    // |
    // \-> b2

    // Trigger a transformation that requires the data to be shuffled, and run the pipeline.
    PCollection<KV<IndexedRecord, Long>> b = a.apply("b", Count.<IndexedRecord> perElement());
    PCollection<IndexedRecord> c = b.apply("c", Keys.<IndexedRecord> create());
    c.setCoder(LazyAvroCoder.of());
    PCollection<KV<IndexedRecord, Long>> d = c.apply("d", Count.<IndexedRecord> perElement());

    PCollection<KV<IndexedRecord, Long>> b2 = a.apply("b2", Count.<IndexedRecord> perElement());

    p.run().waitUntilFinish();

    // No exception should have occurred.

    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), hasSize(2));
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(),
            contains(SampleSchemas.recordSimple(), SampleSchemas.recordSimple()));

    // Check that the reset cleans the supplier.
    LazyAvroCoder.resetSchemaSupplier();
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), emptyIterable());
}
 
Developer ID: Talend, Project: components, Lines of code: 36, Source file: LazyAvroCoderTest.java

Example 4: testBasicReuse

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
/**
 * Exactly the same test as {@link #testBasic()} but reusing the LazyAvroCoder.
 */
@Test
public void testBasicReuse() {
    LazyAvroCoder lac = LazyAvroCoder.of();

    // Create a PCollection of simple records, and assign it to be encoded with a LazyAvroCoder.
    PCollection<IndexedRecord> a = p.apply("a", RowGeneratorIO.read().withSchema(SampleSchemas.recordSimple()));
    a.setCoder(lac);

    // Construct a job that looks like this (a and c are collections of IndexedRecords):
    //
    // a ----> b ----> c ----> d
    // |
    // \-> b2

    // Trigger a transformation that requires the data to be shuffled, and run the pipeline.
    PCollection<KV<IndexedRecord, Long>> b = a.apply("b", Count.<IndexedRecord> perElement());
    PCollection<IndexedRecord> c = b.apply("c", Keys.<IndexedRecord> create());
    c.setCoder(lac);
    PCollection<KV<IndexedRecord, Long>> d = c.apply("d", Count.<IndexedRecord> perElement());

    PCollection<KV<IndexedRecord, Long>> b2 = a.apply("b2", Count.<IndexedRecord> perElement());

    p.run().waitUntilFinish();

    // No exception should have occurred.

    // Only one schema was registered.
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), hasSize(1));
    assertThat(LazyAvroCoder.StaticSchemaHolderSupplier.getSchemas(), contains(SampleSchemas.recordSimple()));
}
 
Developer ID: Talend, Project: components, Lines of code: 34, Source file: LazyAvroCoderTest.java

Example 5: expand

import org.apache.beam.sdk.transforms.Keys; // import the required package/class
@Override
public PCollection<T> expand(PCollection<T> input) {
  return input
      .apply("Break fusion mapper", ParDo.of(new DummyMapFn<T>()))
      .apply(GroupByKey.<T, Integer>create())
      .apply(Keys.<T>create());
}
 
Developer ID: googlegenomics, Project: dataflow-java, Lines of code: 8, Source file: BreakFusionTransform.java
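Example 5 shows a common fusion-breaking pattern: pair every element with a key, force a shuffle with GroupByKey, then recover the original elements with Keys. The DummyMapFn used above is not shown in the snippet; the following is only a hypothetical sketch of what such a DoFn might look like, assuming it attaches an arbitrary integer key to each element (the real class in dataflow-java may differ):

import java.util.concurrent.ThreadLocalRandom;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

// Hypothetical reconstruction: pairs each element with a random integer key so
// that the downstream GroupByKey forces a shuffle, after which Keys.<T>create()
// yields the original elements again.
class DummyMapFn<T> extends DoFn<T, KV<T, Integer>> {
  @ProcessElement
  public void processElement(ProcessContext c) {
    c.output(KV.of(c.element(), ThreadLocalRandom.current().nextInt()));
  }
}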


Note: the org.apache.beam.sdk.transforms.Keys class examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by many developers; copyright of the source code belongs to the original authors. Please follow the corresponding project's license when distributing or using the code, and do not reproduce this article without permission.