2017-07-03

Accessing Hive with JdbcIO in Apache Beam throws java.lang.NoClassDefFoundError: org/apache/avro/reflect/AvroSchema

I can access my MySQL tables using JdbcIO with AvroCoder. Now I am trying to load my Hive database using JdbcIO, but the following exception is thrown when connecting to Hive from Dataflow. Any help from Beam experts would be much appreciated.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/reflect/AvroSchema 
at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.recurse(AvroCoder.java:426) 
at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.check(AvroCoder.java:419) 
at org.apache.beam.sdk.coders.AvroCoder.<init>(AvroCoder.java:259) 
at org.apache.beam.sdk.coders.AvroCoder.of(AvroCoder.java:120) 
at com.google.cloud.bigquery.csv.loader.GoogleSQLPipeline.main(GoogleSQLPipeline.java:101) 
Caused by: java.lang.ClassNotFoundException: org.apache.avro.reflect.AvroSchema 
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335) 
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
... 5 more 

The following code snippet attempts to access Hive:

dataflowPipeline 
      .apply(JdbcIO.<Customer>read() 
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration 
          .create("org.apache.hive.jdbc.HiveDriver", "jdbc:hive2://<ip>/mydb") 
          .withUsername("username").withPassword("password")) 
        .withQuery(
          "select c_customer_id,c_first_name,c_last_name,c_preferred_cust_flag,c_birth_day,c_birth_month,c_birth_year,c_birth_country,c_customer_sk,c_current_cdemo_sk,c_current_hdemo_sk from customer") 
        .withRowMapper(new JdbcIO.RowMapper<Customer>() { 
         @Override 
         public Customer mapRow(ResultSet resultSet) throws Exception { 
          Customer customer = new Customer(); 
          // ... populate customer from resultSet (body truncated in the original post) 
          return customer; 
         } 
        }));
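The RowMapper above returns a Customer object, which the post does not show. AvroCoder serializes such beans via reflection and needs a no-argument constructor; a minimal hypothetical sketch, with field names inferred from the query's columns:

```java
// Hypothetical Customer POJO; AvroCoder maps fields via reflection,
// so a no-argument constructor and accessible fields are required.
class Customer {
    private String customerId;
    private String firstName;
    private String lastName;
    private Integer birthYear;

    public Customer() {}  // no-arg constructor needed for reflective instantiation

    public String getCustomerId() { return customerId; }
    public void setCustomerId(String customerId) { this.customerId = customerId; }
    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }
    public Integer getBirthYear() { return birthYear; }
    public void setBirthYear(Integer birthYear) { this.birthYear = birthYear; }
}
```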

POM dependencies:


<dependencies> 
    <dependency> 
     <groupId>com.google.cloud.dataflow</groupId> 
     <artifactId>google-cloud-dataflow-java-sdk-all</artifactId> 
     <version>2.0.0</version> 
    </dependency> 
    <!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-io-jdbc --> 
    <dependency> 
     <groupId>org.apache.beam</groupId> 
     <artifactId>beam-sdks-java-io-jdbc</artifactId> 
     <version>2.0.0</version> 
    </dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc --> 
    <dependency> 
     <groupId>org.apache.hive</groupId> 
     <artifactId>hive-jdbc</artifactId> 
     <version>1.2.1</version> 
    </dependency> 
    <dependency> 
     <groupId>jdk.tools</groupId> 
     <artifactId>jdk.tools</artifactId> 
     <version>1.8.0_131</version> 
     <scope>system</scope> 
     <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath> 
    </dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common --> 
    <dependency> 
     <groupId>org.apache.hadoop</groupId> 
     <artifactId>hadoop-common</artifactId> 
     <version>2.8.1</version> 
    </dependency> 

    <dependency> 
     <groupId>com.google.guava</groupId> 
     <artifactId>guava</artifactId> 
     <version>18.0</version> 
    </dependency> 
    <!-- slf4j API frontend binding with JUL backend --> 
    <dependency> 
     <groupId>org.slf4j</groupId> 
     <artifactId>slf4j-jdk14</artifactId> 
     <version>1.7.14</version> 
    </dependency> 
</dependencies> 

Answer


Adding a dependency on Avro 1.8.1 and manually adding the import org.apache.avro.reflect.AvroSchema; solved the problem. Likewise, a similar issue with com.google.protobuf.GeneratedMessageV3 can be resolved by manually importing that class.
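A sketch of the POM addition the answer describes, assuming Avro 1.8.1 from Maven Central (version taken from the answer); pinning Avro explicitly puts org.apache.avro.reflect.AvroSchema on the classpath for AvroCoder:

```xml
<!-- Explicit Avro dependency so AvroCoder can find org.apache.avro.reflect.AvroSchema -->
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.8.1</version>
</dependency>
```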
