Java web service contacting Hive - DataNucleus ClassLoaderResolver error

I have a Java web service (built with my company's proprietary technology) that handles requests and responses. While handling a request, it attempts to connect to Hive on Hadoop and execute a query. However, it fails as soon as I merely try to initialize the connection.

Here is the line of code that fails. I mostly used the code example from https://cwiki.apache.org/confluence/display/Hive/HiveClient:

String connString = "jdbc:hive://"; 
Connection con = DriverManager.getConnection(connString, "", ""); 
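For comparison: the bare URL "jdbc:hive://" runs Hive in embedded mode, which starts the metastore (and therefore DataNucleus) inside the web service's own JVM. Below is a minimal sketch of the same connection made against a standalone HiveServer instead, assuming one is listening on a hypothetical host "myhost" on the default port 10000:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Register the (pre-HiveServer2) Hive JDBC driver explicitly,
        // as the HiveClient wiki example does.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

        // Pointing the URL at a remote HiveServer keeps the metastore,
        // and the DataNucleus plugin machinery, out of this JVM.
        // "myhost" is a placeholder for a real HiveServer host.
        String connString = "jdbc:hive://myhost:10000/default";
        Connection con = DriverManager.getConnection(connString, "", "");
        try {
            Statement stmt = con.createStatement();
            ResultSet res = stmt.executeQuery("SHOW TABLES");
            while (res.next()) {
                System.out.println(res.getString(1));
            }
        } finally {
            con.close();
        }
    }
}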

Here is the stack trace:

javax.jdo.JDOFatalInternalException: Unexpected exception caught. 
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186) 
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803) 
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698) 
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246) 
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275) 
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208) 
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183) 
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70) 
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:407) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:359) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:504) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:266) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:228) 
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:131) 
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:121) 
    at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:76) 
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104) 
    at java.sql.DriverManager.getConnection(DriverManager.java:582) 
    at java.sql.DriverManager.getConnection(DriverManager.java:185) 
    at (...my package...).RemoteCtrbTest.kickOffRemoteTest(RemoteCtrbTest.java:52) 

NestedThrowablesStackTrace: 
java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) 
    at java.lang.reflect.Method.invoke(Method.java:597) 
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953) 
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159) 
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803) 
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698) 
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246) 
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275) 
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208) 
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183) 
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70) 
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:407) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:359) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:504) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:266) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:228) 
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:131) 
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:121) 
    at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:76) 
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104) 
    at java.sql.DriverManager.getConnection(DriverManager.java:582) 
    at java.sql.DriverManager.getConnection(DriverManager.java:185) 
    at (...my package...).RemoteCtrbTest.kickOffRemoteTest(RemoteCtrbTest.java:52) 

Caused by: org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "jdo" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification. 
    at org.datanucleus.OMFContext.getClassLoaderResolver(OMFContext.java:319) 
    at org.datanucleus.OMFContext.<init>(OMFContext.java:165) 
    at org.datanucleus.OMFContext.<init>(OMFContext.java:137) 
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseOMFContext(ObjectManagerFactoryImpl.java:132) 
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.initialiseProperties(JDOPersistenceManagerFactory.java:363) 
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:307) 
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:255) 
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182) 
    ... 35 more 

I found another question with a similar error message, but it was about Maven and didn't involve Hive (it used DataNucleus directly from code): Datanucleus, JDO and executable jar - how to do it?

I am using a hive-site.xml file to specify some properties for Hive and DataNucleus. The DataNucleus ones are below. I added the last two while trying to fix this problem; whatever value I give datanucleus.classLoaderResolverName just changes the name quoted in the error message.

<property> 
    <name>datanucleus.autoCreateSchema</name> 
    <value>false</value> 
</property> 

<property> 
    <name>datanucleus.fixedDatastore</name> 
    <value>true</value> 
</property> 

<property> 
    <name>datanucleus.classLoaderResolverName</name> 
    <value>jdo</value> 
</property> 

<property> 
    <name>javax.jdo.PersistenceManagerFactoryClass</name> 
    <value>org.datanucleus.jdo.JDOPersistenceManagerFactory</value> 
</property> 

The part I can't figure out is whether the service is somehow re-bundling the jars, as in the other StackOverflow question I linked above, and messing up the plugin.xml and/or Manifest.mf files in the process. I'm also not sure how the plugin file interacts with the hive-site.xml file.

Maybe the classpath needs the specific jars added rather than just a classpath directory. I am using the following DataNucleus jars:

* datanucleus-connectionpool-2.0.3.jar
* datanucleus-enhancer-2.0.3.jar
* datanucleus-rdbms-2.0.3.jar
* datanucleus-core-2.0.3.jar

Any input you can give to help would be much appreciated. I can provide more information if you need it, so please don't hesitate to ask.


I'm facing the same problem running a Shark 0.8.1 application on CDH5 with Hive 0.9 – vacuum

Answers


DataNucleus apparently uses an OSGi-based plugin mechanism. If you are not running it inside an OSGi container and are just using a standard Maven project, what is probably happening is that the plugin jars are present on the classpath but are not registered because of a manifest problem. You could try something like this:

<build> 
    <plugins> 
     <plugin> 
      <groupId>org.apache.maven.plugins</groupId> 
      <artifactId>maven-dependency-plugin</artifactId> 
      <version>2.4</version> 
      <executions> 
       <execution> 
        <id>copy-dependencies</id> 
        <phase>package</phase> 
        <goals> 
         <goal>copy-dependencies</goal> 
        </goals> 
        <configuration> 
         <outputDirectory>${project.build.directory}/jars</outputDirectory> 
         <overWriteReleases>false</overWriteReleases> 
         <overWriteSnapshots>false</overWriteSnapshots> 
         <overWriteIfNewer>true</overWriteIfNewer> 
        </configuration> 
       </execution> 
      </executions> 
     </plugin> 
    </plugins> 
</build> 

This was answered previously.
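As a follow-up: the reason copying the dependencies as separate jars helps is that every DataNucleus jar carries its own plugin.xml and MANIFEST.MF, and repackaging them into a single jar lets one overwrite another, which produces exactly this kind of "not found by the DataNucleus plugin mechanism" error. If the jars land in a jars/ directory next to the main artifact (matching the copy-dependencies configuration above), the main jar's manifest still has to reference them; a minimal sketch of doing that with maven-jar-plugin:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.4</version>
    <configuration>
        <archive>
            <manifest>
                <!-- Write a Class-Path entry into MANIFEST.MF listing
                     every dependency, prefixed with the directory that
                     copy-dependencies writes them into. -->
                <addClasspath>true</addClasspath>
                <classpathPrefix>jars/</classpathPrefix>
            </manifest>
        </archive>
    </configuration>
</plugin>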


If you are using the Spring Framework in this application built with your company's proprietary technology, you can take advantage of Spring-Hadoop support.

All you need to do is add the following configuration to your applicationContext:

<hdp:configuration> 
    fs.default.name=${fs.default.name.url} 
    mapred.job.tracker=${mapred.job.tracker.url} 
</hdp:configuration> 

<hdp:hive-client-factory host="${hadoop.hive.host.url}" port="10000" 
    xmlns="http://www.springframework.org/schema/hadoop" /> 

<hdp:hive-template /> 
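The ${...} placeholders above have to be resolved from somewhere, for example with a <context:property-placeholder location="classpath:hadoop.properties"/> declaration in the same applicationContext. A minimal sketch of such a properties file (all host names and ports below are hypothetical):

fs.default.name.url=hdfs://namenode-host:8020
mapred.job.tracker.url=namenode-host:8021
hadoop.hive.host.url=hive-host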

After that, autowire a HiveTemplate:

@Autowired 
HiveTemplate hiveTemplate; 

and then query Hive as shown below:

List<String> list = hiveTemplate.query(queryString, parameterMap);
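Putting the pieces together, a minimal sketch of a Spring bean that queries Hive through the template (the class name, table, and parameter names are all hypothetical):

import java.util.Collections;
import java.util.List;
import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.hadoop.hive.HiveTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class HiveQueryRepository {

    @Autowired
    private HiveTemplate hiveTemplate;

    // Runs a simple query against a hypothetical "requests" table.
    // HiveTemplate executes the query through the hive-client-factory
    // configured above and returns the result lines as strings.
    public List<String> countRequestsSince(String date) {
        // The arguments map mirrors the query(queryString, parameterMap)
        // call shown above; it is assumed here to surface in the script
        // as hiveconf variables.
        String queryString =
            "SELECT COUNT(*) FROM requests WHERE dt >= '${hiveconf:since}'";
        Map<String, String> parameterMap =
            Collections.singletonMap("since", date);
        return hiveTemplate.query(queryString, parameterMap);
    }
}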