

Java Job Class Code Examples

This article collects typical usage examples of the Java class com.google.api.services.dataflow.model.Job. If you are wondering what the Job class is for and how to use it, the curated code examples below may help.


The Job class belongs to the com.google.api.services.dataflow.model package. Fifteen code examples of the class are shown below, sorted by popularity.

Example 1: translate

import com.google.api.services.dataflow.model.Job; // import the required package/class
/**
 * Translates a {@link Pipeline} into a {@code JobSpecification}.
 */
public JobSpecification translate(
    Pipeline pipeline,
    DataflowRunner runner,
    List<DataflowPackage> packages) {

  // Capture the sdkComponents for look up during step translations
  SdkComponents sdkComponents = SdkComponents.create();
  RunnerApi.Pipeline pipelineProto = PipelineTranslation.toProto(pipeline, sdkComponents);

  LOG.debug("Portable pipeline proto:\n{}", TextFormat.printToString(pipelineProto));

  Translator translator = new Translator(pipeline, runner, sdkComponents);
  Job result = translator.translate(packages);
  return new JobSpecification(
      result, pipelineProto, Collections.unmodifiableMap(translator.stepNames));
}
 
Developer: apache, Project: beam, Lines: 20, Source: DataflowPipelineTranslator.java

Example 2: getStateWithRetries

import com.google.api.services.dataflow.model.Job; // import the required package/class
/**
 * Attempts to get the state. Uses exponential backoff on failure up to the maximum number
 * of passed in attempts.
 *
 * @param attempts The {@link BackOff} used to control the number and timing of attempts.
 * @param sleeper Object used to do the sleeps between attempts.
 * @return The state of the job or State.UNKNOWN in case of failure.
 */
@VisibleForTesting
State getStateWithRetries(BackOff attempts, Sleeper sleeper) {
  if (terminalState != null) {
    return terminalState;
  }
  try {
    Job job = getJobWithRetries(attempts, sleeper);
    return MonitoringUtil.toState(job.getCurrentState());
  } catch (IOException exn) {
    // The only IOException that getJobWithRetries is permitted to throw is the final
    // IOException that caused the retries to fail. Other exceptions are wrapped in
    // unchecked exceptions and will propagate.
    return State.UNKNOWN;
  }
}
 
Developer: apache, Project: beam, Lines: 24, Source: DataflowPipelineJob.java

Example 3: getJobWithRetries

import com.google.api.services.dataflow.model.Job; // import the required package/class
/**
 * Attempts to get the underlying {@link Job}. Uses exponential backoff on failure up to the
 * maximum number of passed in attempts.
 *
 * @param backoff the {@link BackOff} used to control retries.
 * @param sleeper Object used to do the sleeps between attempts.
 * @return The underlying {@link Job} object.
 * @throws IOException When the maximum number of retries is exhausted, the last exception is
 * thrown.
 */
private Job getJobWithRetries(BackOff backoff, Sleeper sleeper) throws IOException {
  // Retry loop ends in return or throw
  while (true) {
    try {
      Job job = dataflowClient.getJob(jobId);
      State currentState = MonitoringUtil.toState(job.getCurrentState());
      if (currentState.isTerminal()) {
        terminalState = currentState;
        replacedByJob = new DataflowPipelineJob(
            dataflowClient, job.getReplacedByJobId(), dataflowOptions, transformStepNames);
      }
      return job;
    } catch (IOException exn) {
      LOG.warn("There were problems getting current job status: {}.", exn.getMessage());
      LOG.debug("Exception information:", exn);

      if (!nextBackOff(sleeper, backoff)) {
        throw exn;
      }
    }
  }
}
 
Developer: apache, Project: beam, Lines: 33, Source: DataflowPipelineJob.java
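The core pattern in getJobWithRetries, retry in a loop, sleep an exponentially growing interval between failures, and rethrow the last exception once attempts are exhausted, can be sketched in plain Java without the Beam or Dataflow dependencies. Class and method names here are hypothetical, chosen only for the illustration:

```java
import java.util.concurrent.Callable;

public class BackoffRetryDemo {

  /**
   * Calls the supplier, retrying with exponentially growing sleeps, up to
   * maxAttempts. Rethrows the last exception once attempts are exhausted,
   * mirroring the contract of getJobWithRetries above.
   */
  static <T> T retryWithBackoff(Callable<T> call, int maxAttempts, long initialSleepMillis)
      throws Exception {
    long sleepMillis = initialSleepMillis;
    Exception last = null;
    for (int attempt = 1; attempt <= maxAttempts; attempt++) {
      try {
        return call.call();
      } catch (Exception exn) {
        last = exn;
        if (attempt == maxAttempts) {
          break; // no attempts left; fall through to the rethrow
        }
        Thread.sleep(sleepMillis);
        sleepMillis *= 2; // double the interval between attempts
      }
    }
    throw last;
  }

  public static void main(String[] args) throws Exception {
    int[] calls = {0};
    // Fails twice, then succeeds on the third attempt.
    String result = retryWithBackoff(() -> {
      if (++calls[0] < 3) {
        throw new RuntimeException("transient failure " + calls[0]);
      }
      return "ok";
    }, 5, 1);
    System.out.println(result + " after " + calls[0] + " attempts");
  }
}
```

The real Beam code delegates the sleep/interval bookkeeping to its BackOff and Sleeper abstractions instead of hard-coding them, which is what makes the retry logic testable with a fake clock.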

Example 4: testEmptyMetricUpdates

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testEmptyMetricUpdates() throws IOException {
  Job modelJob = new Job();
  modelJob.setCurrentState(State.RUNNING.toString());

  DataflowPipelineJob job = mock(DataflowPipelineJob.class);
  DataflowPipelineOptions options = mock(DataflowPipelineOptions.class);
  when(options.isStreaming()).thenReturn(false);
  when(job.getDataflowOptions()).thenReturn(options);
  when(job.getState()).thenReturn(State.RUNNING);
  job.jobId = JOB_ID;

  JobMetrics jobMetrics = new JobMetrics();
  jobMetrics.setMetrics(null /* this is how the APIs represent empty metrics */);
  DataflowClient dataflowClient = mock(DataflowClient.class);
  when(dataflowClient.getJobMetrics(JOB_ID)).thenReturn(jobMetrics);

  DataflowMetrics dataflowMetrics = new DataflowMetrics(job, dataflowClient);
  MetricQueryResults result = dataflowMetrics.queryMetrics();
  assertThat(ImmutableList.copyOf(result.counters()), is(empty()));
  assertThat(ImmutableList.copyOf(result.distributions()), is(empty()));
}
 
Developer: apache, Project: beam, Lines: 23, Source: DataflowMetricsTest.java

Example 5: testCachingMetricUpdates

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testCachingMetricUpdates() throws IOException {
  Job modelJob = new Job();
  modelJob.setCurrentState(State.RUNNING.toString());

  DataflowPipelineJob job = mock(DataflowPipelineJob.class);
  DataflowPipelineOptions options = mock(DataflowPipelineOptions.class);
  when(options.isStreaming()).thenReturn(false);
  when(job.getDataflowOptions()).thenReturn(options);
  when(job.getState()).thenReturn(State.DONE);
  job.jobId = JOB_ID;

  JobMetrics jobMetrics = new JobMetrics();
  jobMetrics.setMetrics(ImmutableList.<MetricUpdate>of());
  DataflowClient dataflowClient = mock(DataflowClient.class);
  when(dataflowClient.getJobMetrics(JOB_ID)).thenReturn(jobMetrics);

  DataflowMetrics dataflowMetrics = new DataflowMetrics(job, dataflowClient);
  verify(dataflowClient, times(0)).getJobMetrics(JOB_ID);
  dataflowMetrics.queryMetrics(null);
  verify(dataflowClient, times(1)).getJobMetrics(JOB_ID);
  dataflowMetrics.queryMetrics(null);
  verify(dataflowClient, times(1)).getJobMetrics(JOB_ID);
}
 
Developer: apache, Project: beam, Lines: 25, Source: DataflowMetricsTest.java
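Example 5 verifies that once the job is in a terminal state (State.DONE), the metrics service is queried only once and later calls are served from a cache. The underlying memoize-when-terminal pattern can be sketched in plain Java; all names below are hypothetical and independent of the Beam API:

```java
import java.util.function.Supplier;

public class TerminalCacheDemo {

  /**
   * Wraps a supplier so the delegate is invoked at most once after the job
   * reports a terminal state; while the job is still running, every call
   * re-fetches a fresh value.
   */
  static <T> Supplier<T> cacheWhenTerminal(Supplier<T> delegate, Supplier<Boolean> isTerminal) {
    return new Supplier<T>() {
      private T cached;
      private boolean haveCached;

      @Override
      public T get() {
        if (haveCached) {
          return cached; // terminal result already captured; skip the fetch
        }
        T value = delegate.get();
        if (isTerminal.get()) { // only cache once the job has finished
          cached = value;
          haveCached = true;
        }
        return value;
      }
    };
  }

  public static void main(String[] args) {
    int[] fetches = {0};
    Supplier<String> metrics =
        cacheWhenTerminal(() -> "metrics-" + (++fetches[0]), () -> true);
    metrics.get();
    metrics.get();
    System.out.println("fetches=" + fetches[0]); // delegate hit only once
  }
}
```

Caching only terminal results is the key design choice: metrics for a running job change over time, so they must be re-read, while a finished job's metrics are immutable.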

Example 6: buildMockDataflow

import com.google.api.services.dataflow.model.Job; // import the required package/class
private static Dataflow buildMockDataflow(
    ArgumentMatcher<Job> jobMatcher) throws IOException {
  Dataflow mockDataflowClient = mock(Dataflow.class);
  Dataflow.Projects mockProjects = mock(Dataflow.Projects.class);
  Dataflow.Projects.Jobs mockJobs = mock(Dataflow.Projects.Jobs.class);
  Dataflow.Projects.Jobs.Create mockRequest = mock(
      Dataflow.Projects.Jobs.Create.class);

  when(mockDataflowClient.projects()).thenReturn(mockProjects);
  when(mockProjects.jobs()).thenReturn(mockJobs);
  when(mockJobs.create(eq("someProject"), argThat(jobMatcher)))
      .thenReturn(mockRequest);

  Job resultJob = new Job();
  resultJob.setId("newid");
  when(mockRequest.execute()).thenReturn(resultJob);
  return mockDataflowClient;
}
 
Developer: apache, Project: beam, Lines: 19, Source: DataflowPipelineTranslatorTest.java

Example 7: testNetworkConfig

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testNetworkConfig() throws IOException {
  final String testNetwork = "test-network";

  DataflowPipelineOptions options = buildPipelineOptions();
  options.setNetwork(testNetwork);

  Pipeline p = buildPipeline(options);
  p.traverseTopologically(new RecordingPipelineVisitor());
  Job job =
      DataflowPipelineTranslator.fromOptions(options)
          .translate(
              p, DataflowRunner.fromOptions(options), Collections.<DataflowPackage>emptyList())
          .getJob();

  assertEquals(1, job.getEnvironment().getWorkerPools().size());
  assertEquals(testNetwork,
      job.getEnvironment().getWorkerPools().get(0).getNetwork());
}
 
Developer: apache, Project: beam, Lines: 20, Source: DataflowPipelineTranslatorTest.java

Example 8: testSubnetworkConfig

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testSubnetworkConfig() throws IOException {
  final String testSubnetwork = "regions/REGION/subnetworks/SUBNETWORK";

  DataflowPipelineOptions options = buildPipelineOptions();
  options.setSubnetwork(testSubnetwork);

  Pipeline p = buildPipeline(options);
  p.traverseTopologically(new RecordingPipelineVisitor());
  Job job =
      DataflowPipelineTranslator.fromOptions(options)
          .translate(
              p, DataflowRunner.fromOptions(options), Collections.<DataflowPackage>emptyList())
          .getJob();

  assertEquals(1, job.getEnvironment().getWorkerPools().size());
  assertEquals(testSubnetwork,
      job.getEnvironment().getWorkerPools().get(0).getSubnetwork());
}
 
Developer: apache, Project: beam, Lines: 20, Source: DataflowPipelineTranslatorTest.java

Example 9: testZoneConfig

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testZoneConfig() throws IOException {
  final String testZone = "test-zone-1";

  DataflowPipelineOptions options = buildPipelineOptions();
  options.setZone(testZone);

  Pipeline p = buildPipeline(options);
  p.traverseTopologically(new RecordingPipelineVisitor());
  Job job =
      DataflowPipelineTranslator.fromOptions(options)
          .translate(
              p, DataflowRunner.fromOptions(options), Collections.<DataflowPackage>emptyList())
          .getJob();

  assertEquals(1, job.getEnvironment().getWorkerPools().size());
  assertEquals(testZone,
      job.getEnvironment().getWorkerPools().get(0).getZone());
}
 
Developer: apache, Project: beam, Lines: 20, Source: DataflowPipelineTranslatorTest.java

Example 10: testWorkerMachineTypeConfig

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testWorkerMachineTypeConfig() throws IOException {
  final String testMachineType = "test-machine-type";

  DataflowPipelineOptions options = buildPipelineOptions();
  options.setWorkerMachineType(testMachineType);

  Pipeline p = buildPipeline(options);
  p.traverseTopologically(new RecordingPipelineVisitor());
  Job job =
      DataflowPipelineTranslator.fromOptions(options)
          .translate(
              p, DataflowRunner.fromOptions(options), Collections.<DataflowPackage>emptyList())
          .getJob();

  assertEquals(1, job.getEnvironment().getWorkerPools().size());

  WorkerPool workerPool = job.getEnvironment().getWorkerPools().get(0);
  assertEquals(testMachineType, workerPool.getMachineType());
}
 
Developer: apache, Project: beam, Lines: 21, Source: DataflowPipelineTranslatorTest.java

Example 11: testDiskSizeGbConfig

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testDiskSizeGbConfig() throws IOException {
  final Integer diskSizeGb = 1234;

  DataflowPipelineOptions options = buildPipelineOptions();
  options.setDiskSizeGb(diskSizeGb);

  Pipeline p = buildPipeline(options);
  p.traverseTopologically(new RecordingPipelineVisitor());
  Job job =
      DataflowPipelineTranslator.fromOptions(options)
          .translate(
              p, DataflowRunner.fromOptions(options), Collections.<DataflowPackage>emptyList())
          .getJob();

  assertEquals(1, job.getEnvironment().getWorkerPools().size());
  assertEquals(diskSizeGb,
      job.getEnvironment().getWorkerPools().get(0).getDiskSizeGb());
}
 
Developer: apache, Project: beam, Lines: 20, Source: DataflowPipelineTranslatorTest.java

Example 12: createPredefinedStep

import com.google.api.services.dataflow.model.Job; // import the required package/class
/**
 * Returns a Step for a {@link DoFn} by creating and translating a pipeline.
 */
private static Step createPredefinedStep() throws Exception {
  DataflowPipelineOptions options = buildPipelineOptions();
  DataflowPipelineTranslator translator = DataflowPipelineTranslator.fromOptions(options);
  Pipeline pipeline = Pipeline.create(options);
  String stepName = "DoFn1";
  pipeline.apply("ReadMyFile", TextIO.read().from("gs://bucket/in"))
      .apply(stepName, ParDo.of(new NoOpFn()))
      .apply("WriteMyFile", TextIO.write().to("gs://bucket/out"));
  DataflowRunner runner = DataflowRunner.fromOptions(options);
  runner.replaceTransforms(pipeline);
  Job job =
      translator
          .translate(
              pipeline,
              runner,
              Collections.<DataflowPackage>emptyList())
          .getJob();

  assertEquals(8, job.getSteps().size());
  Step step = job.getSteps().get(1);
  assertEquals(stepName, getString(step.getProperties(), PropertyNames.USER_NAME));
  assertAllStepOutputsHaveUniqueIds(job);
  return step;
}
 
Developer: apache, Project: beam, Lines: 28, Source: DataflowPipelineTranslatorTest.java

Example 13: assertAllStepOutputsHaveUniqueIds

import com.google.api.services.dataflow.model.Job; // import the required package/class
private static void assertAllStepOutputsHaveUniqueIds(Job job)
    throws Exception {
  List<Long> outputIds = new ArrayList<>();
  for (Step step : job.getSteps()) {
    List<Map<String, Object>> outputInfoList =
        (List<Map<String, Object>>) step.getProperties().get(PropertyNames.OUTPUT_INFO);
    if (outputInfoList != null) {
      for (Map<String, Object> outputInfo : outputInfoList) {
        outputIds.add(Long.parseLong(Structs.getString(outputInfo, PropertyNames.OUTPUT_NAME)));
      }
    }
  }
  Set<Long> uniqueOutputNames = new HashSet<>(outputIds);
  // List.removeAll would strip every occurrence, emptying the list even when
  // duplicates exist; remove one occurrence per unique id instead, so that
  // only the duplicated ids remain.
  for (Long unique : uniqueOutputNames) {
    outputIds.remove(unique);
  }
  assertTrue(String.format("Found duplicate output ids %s", outputIds),
      outputIds.isEmpty());
}
 
Developer: apache, Project: beam, Lines: 18, Source: DataflowPipelineTranslatorTest.java
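The duplicate-id check above can be isolated into a small reusable helper: track which ids have been seen with a HashSet and collect any id whose insertion fails. This is a standalone sketch with hypothetical names, not Beam code:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class DuplicateIdsDemo {

  /** Returns the ids that occur more than once, in first-duplicate order. */
  static List<Long> duplicates(List<Long> ids) {
    Set<Long> seen = new HashSet<>();
    Set<Long> dups = new LinkedHashSet<>();
    for (Long id : ids) {
      if (!seen.add(id)) { // Set.add returns false when the id was already present
        dups.add(id);
      }
    }
    return new ArrayList<>(dups);
  }

  public static void main(String[] args) {
    System.out.println(duplicates(Arrays.asList(1L, 2L, 2L, 3L, 1L))); // → [2, 1]
    System.out.println(duplicates(Arrays.asList(1L, 2L, 3L)));         // → []
  }
}
```

Reporting the offending ids, rather than just a boolean, makes an assertion failure immediately actionable, which is why the Beam test formats the leftovers into its failure message.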

Example 14: mockWaitToFinishInState

import com.google.api.services.dataflow.model.Job; // import the required package/class
public State mockWaitToFinishInState(State state) throws Exception {
  Dataflow.Projects.Locations.Jobs.Get statusRequest =
      mock(Dataflow.Projects.Locations.Jobs.Get.class);

  Job statusResponse = new Job();
  statusResponse.setCurrentState("JOB_STATE_" + state.name());
  if (state == State.UPDATED) {
    statusResponse.setReplacedByJobId(REPLACEMENT_JOB_ID);
  }

  when(mockJobs.get(eq(PROJECT_ID), eq(REGION_ID), eq(JOB_ID))).thenReturn(statusRequest);
  when(statusRequest.execute()).thenReturn(statusResponse);

  DataflowPipelineJob job =
      new DataflowPipelineJob(
          DataflowClient.create(options),
          JOB_ID,
          options,
          ImmutableMap.<AppliedPTransform<?, ?, ?>, String>of());

  return job.waitUntilFinish(Duration.standardMinutes(1), null, fastClock, fastClock);
}
 
Developer: apache, Project: beam, Lines: 23, Source: DataflowPipelineJobTest.java

Example 15: testCumulativeTimeOverflow

import com.google.api.services.dataflow.model.Job; // import the required package/class
@Test
public void testCumulativeTimeOverflow() throws Exception {
  Dataflow.Projects.Locations.Jobs.Get statusRequest =
      mock(Dataflow.Projects.Locations.Jobs.Get.class);

  Job statusResponse = new Job();
  statusResponse.setCurrentState("JOB_STATE_RUNNING");
  when(mockJobs.get(eq(PROJECT_ID), eq(REGION_ID), eq(JOB_ID))).thenReturn(statusRequest);
  when(statusRequest.execute()).thenReturn(statusResponse);

  FastNanoClockAndFuzzySleeper clock = new FastNanoClockAndFuzzySleeper();

  DataflowPipelineJob job =
      new DataflowPipelineJob(
          DataflowClient.create(options),
          JOB_ID,
          options,
          ImmutableMap.<AppliedPTransform<?, ?, ?>, String>of());
  long startTime = clock.nanoTime();
  State state = job.waitUntilFinish(Duration.millis(4), null, clock, clock);
  assertEquals(null, state);
  long timeDiff = TimeUnit.NANOSECONDS.toMillis(clock.nanoTime() - startTime);
  // Should only have slept for the 4 ms allowed.
  assertThat(timeDiff, lessThanOrEqualTo(4L));
}
 
Developer: apache, Project: beam, Lines: 26, Source: DataflowPipelineJobTest.java


Note: the com.google.api.services.dataflow.model.Job examples in this article were collected from open-source code and documentation platforms such as GitHub and MSDocs. The snippets come from contributed open-source projects, and copyright remains with the original authors; consult each project's License before distributing or using the code. Do not reproduce without permission.