1

Exceptions in a Google Cloud Dataflow pipeline from BigQuery to Cloud Bigtable

When running our Dataflow pipeline, we see these exceptions every so often. Is there anything we can do about them? We have a very simple flow that reads data from a BigQuery query and populates it into Bigtable.

Also, what happens to the data inside the pipeline? Is it reprocessed, or is it lost in transit to Bigtable?

CloudBigtableIO.initializeForWrite(p);
p.apply(BigQueryIO.Read.fromQuery(getQuery()))
    .apply(ParDo.of(new DoFn<TableRow, Mutation>() {
      @Override
      public void processElement(ProcessContext c) {
        Mutation output = convertDataToRow(c.element());
        // convertDataToRow returns null when BASM_AID is missing; emitting a
        // null element would fail downstream, so only output real mutations.
        if (output != null) {
          c.output(output);
        }
      }
    }))
    .apply(CloudBigtableIO.writeToTable(config));


private static Mutation convertDataToRow(TableRow element) {
    LOG.info("element: " + element);
    LOG.info("BASM_segment_id: " + element.get("BASM_segment_id"));
    if (element.get("BASM_AID") == null) {
      return null;
    }
    Put obj = new Put(getRowKey(element).getBytes())
        .addColumn(SEGMENT_FAMILY, SEGMENT_COLUMN_NAME,
            ((String) element.get("BAS_category")).getBytes());
    obj.addColumn(USER_FAMILY, "AID".getBytes(),
        ((String) element.get("BASM_AID")).getBytes());
    if (element.get("BASM_segment_id") != null) {
      obj.addColumn(SEGMENT_FAMILY, "segment_id".getBytes(),
          ((String) element.get("BASM_segment_id")).getBytes());
    }
    if (element.get("BAS_sub_category") != null) {
      obj.addColumn(SEGMENT_FAMILY, "sub_category".getBytes(),
          ((String) element.get("BAS_sub_category")).getBytes());
    }
    if (element.get("BAS_name") != null) {
      obj.addColumn(SEGMENT_FAMILY, "name".getBytes(),
          ((String) element.get("BAS_name")).getBytes());
    }
    if (element.get("BAS_description") != null) {
      obj.addColumn(SEGMENT_FAMILY, "description".getBytes(),
          ((String) element.get("BAS_description")).getBytes());
    }
    // Guard the Krux user id on its own field; the original code guarded it on
    // BAS_last_compute_day, which could throw a NullPointerException when
    // BASM_krux_user_id happened to be absent.
    if (element.get("BASM_krux_user_id") != null) {
      obj.addColumn(USER_FAMILY, "Krux_User_id".getBytes(),
          ((String) element.get("BASM_krux_user_id")).getBytes());
    }
    if (element.get("BAS_last_compute_day") != null) {
      obj.addColumn(SEGMENT_FAMILY, "last_compute_day".getBytes(),
          ((String) element.get("BAS_last_compute_day")).getBytes());
    }
    if (element.get("BAS_type") != null) {
      obj.addColumn(SEGMENT_FAMILY, "type".getBytes(),
          ((String) element.get("BAS_type")).getBytes());
    }
    if (element.get("BASM_REGID") != null) {
      obj.addColumn(USER_FAMILY, "REGID".getBytes(),
          ((String) element.get("BASM_REGID")).getBytes());
    }
    return obj;
}
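
The repeated null-check-then-addColumn pattern above can be collapsed into a small helper, which also makes it harder to guard a write on the wrong field. A minimal sketch; the helper name addIfPresent is ours, not part of any API:

    // Hypothetical helper: write a column only when the BigQuery field is present.
    private static void addIfPresent(Put put, byte[] family, String qualifier,
        TableRow row, String field) {
      Object value = row.get(field);
      if (value != null) {
        put.addColumn(family, qualifier.getBytes(), ((String) value).getBytes());
      }
    }

    // Usage inside convertDataToRow, for example:
    // addIfPresent(obj, SEGMENT_FAMILY, "segment_id", element, "BASM_segment_id");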

Here is the exception we are getting:

2016-08-22T21:47:33.469Z: Error: (84707221e08b977b): java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: StatusRuntimeException: 1 time,
    at com.google.cloud.dataflow.sdk.runners.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:162)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnContext.outputWindowedValue(DoFnRunnerBase.java:287)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnProcessContext.output(DoFnRunnerBase.java:449)
    at com.nytimes.adtech.dataflow.pipelines.BigTableSegmentData$2.processElement(BigTableSegmentData.java:70)
Caused by: com.google.cloud.dataflow.sdk.util.UserCodeException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: StatusRuntimeException: 1 time,
    at com.google.cloud.dataflow.sdk.util.UserCodeException.wrap(UserCodeException.java:35)
    at com.google.cloud.dataflow.sdk.util.UserCodeException.wrapIf(UserCodeException.java:40)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase.wrapUserCodeException(DoFnRunnerBase.java:368)
    at com.google.cloud.dataflow.sdk.util.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:51)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase.processElement(DoFnRunnerBase.java:138)
    at com.google.cloud.dataflow.sdk.runners.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:190)
    at com.google.cloud.dataflow.sdk.runners.worker.ForwardingParDoFn.processElement(ForwardingParDoFn.java:42)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerLoggingParDoFn.processElement(DataflowWorkerLoggingParDoFn.java:47)
    at com.google.cloud.dataflow.sdk.util.common.worker.ParDoOperation.process(ParDoOperation.java:53)
    at com.google.cloud.dataflow.sdk.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
    at com.google.cloud.dataflow.sdk.runners.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:160)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnContext.outputWindowedValue(DoFnRunnerBase.java:287)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnProcessContext.output(DoFnRunnerBase.java:449)
    at com.nytimes.adtech.dataflow.pipelines.BigTableSegmentData$2.processElement(BigTableSegmentData.java:70)
    at com.google.cloud.dataflow.sdk.util.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:49)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase.processElement(DoFnRunnerBase.java:138)
    at com.google.cloud.dataflow.sdk.runners.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:190)
    at com.google.cloud.dataflow.sdk.runners.worker.ForwardingParDoFn.processElement(ForwardingParDoFn.java:42)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerLoggingParDoFn.processElement(DataflowWorkerLoggingParDoFn.java:47)
    at com.google.cloud.dataflow.sdk.util.common.worker.ParDoOperation.process(ParDoOperation.java:53)
    at com.google.cloud.dataflow.sdk.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
    at com.google.cloud.dataflow.sdk.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:226)
    at com.google.cloud.dataflow.sdk.util.common.worker.ReadOperation.start(ReadOperation.java:167)
    at com.google.cloud.dataflow.sdk.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:71)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.executeWork(DataflowWorker.java:288)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.doWork(DataflowWorker.java:221)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:173)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.doWork(DataflowWorkerHarness.java:193)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:173)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:160)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: StatusRuntimeException: 1 time,
    at com.google.cloud.bigtable.hbase.BigtableBufferedMutator.handleExceptions(BigtableBufferedMutator.java:389)
    at com.google.cloud.bigtable.hbase.BigtableBufferedMutator.mutate(BigtableBufferedMutator.java:274)
    at com.google.cloud.bigtable.dataflow.CloudBigtableIO$CloudBigtableSingleTableBufferedWriteFn.processElement(CloudBigtableIO.java:966)


From the Dataflow console, the same exception repeats:

2016-08-23 (13:17:54) java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: org.apache.hadoop....
2016-08-23 (13:17:54) java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: org.apache.hadoop....
2016-08-23 (13:17:54) java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: org.apache.hadoop....
(the same truncated entry repeats dozens of times with the same timestamp)

Thanks in advance

+0

Which client version are you using? 0.9.1? –

+0

@LesVogel-GoogleDevRel Yes, we are using version 0.9.1 of bigtable-hbase-dataflow – Amandeep

+0

I've asked someone in engineering to comment; that should happen later today. –

Answer

1

We took this offline. The issue here is that you have too many Dataflow workers relative to the number of Cloud Bigtable nodes in your cluster. You need to change that ratio, either by reducing the number of Dataflow workers or by contacting our team to increase your Cloud Bigtable resources.

Bigtable performed admirably for the number of Cloud Bigtable nodes you have, but the load from Dataflow was too high for it to handle reliably.

You can check your usage on the "CPU Usage" graph in the Google Cloud console. Anything over 80% of capacity is likely to cause problems. If you get more Bigtable quota, you can increase the number of nodes before you run the Dataflow job and decrease it again once the job is done. For example, Scio does that.
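
For reference, the worker count can be capped through the pipeline options. A minimal sketch, assuming the Dataflow SDK 1.x used in the question; the value 5 is only an illustration and should be tuned against your Bigtable node count:

    import com.google.cloud.dataflow.sdk.Pipeline;
    import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
    import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options = PipelineOptionsFactory
        .fromArgs(args).withValidation()
        .as(DataflowPipelineOptions.class);
    // Cap the worker pool so the write load stays within what the
    // Bigtable cluster can absorb; 5 is an example value, not a recommendation.
    options.setNumWorkers(5);
    options.setMaxNumWorkers(5);
    Pipeline p = Pipeline.create(options);

The same can be done from the command line with --numWorkers=5 --maxNumWorkers=5.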

==

Regarding "What happens to the data inside the pipeline? Is it reprocessed, or is it lost in transit to Bigtable?":

Dataflow tries to send the data to Bigtable again. In those cases, Dataflow's retry mechanism will correct temporary problems.

Unfortunately, when the underlying problem is Cloud Bigtable overload, the retries compound it by sending even more traffic to Bigtable.
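
If you want to see exactly which writes failed once retries are exhausted, RetriesExhaustedWithDetailsException carries per-action details. A minimal sketch, assuming you write through the HBase BufferedMutator API directly (CloudBigtableIO surfaces this exception itself, as in the stack trace above); mutator and mutations are placeholders for your own writer and batch:

    import org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException;
    import org.apache.hadoop.hbase.util.Bytes;

    // Enclosing method is assumed to declare throws IOException for the other
    // failure modes of mutate()/flush().
    try {
      mutator.mutate(mutations);  // mutator: a BufferedMutator; mutations: List<Mutation>
      mutator.flush();
    } catch (RetriesExhaustedWithDetailsException e) {
      // One entry per failed action: the row, the server, and the root cause.
      for (int i = 0; i < e.getNumExceptions(); i++) {
        LOG.error("Failed row " + Bytes.toStringBinary(e.getRow(i).getRow())
            + " on " + e.getHostnamePort(i) + ": " + e.getCause(i));
      }
    }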

+0

Thanks @Solomon Duskis. I reduced the number of workers, and now I no longer see the exceptions I was seeing before, but obviously my job now takes more time to complete. – Amandeep

+0

Yes, reducing workers will do that. I took a quick look at your graph, and it looks like you could add a few more workers. Alternatively, you can ask us to increase your quota, add more Bigtable nodes before starting the Dataflow job, and reduce the count after it finishes. The more Bigtable nodes you have, the more throughput you can drive, and the faster your job will complete. –
