
initMiniDFSCluster throws NoClassDefFoundError (hadoop client tests)

I am writing software that should store files in hadoop-hdfs, and of course I want to write test cases for that particular feature. Unfortunately, when I try to build() a MiniDFSCluster, I get the following:

16/10/07 16:16:33 INFO hdfs.MiniDFSCluster: starting cluster: numNameNodes=1, numDataNodes=2 
16/10/07 16:16:33 INFO hdfs.MiniDFSCluster: Shutting down the Mini HDFS Cluster 

java.lang.NoClassDefFoundError: org/apache/hadoop/net/StaticMapping 

    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:792) 
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:475) 
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:434) 
    at de.tuberlin.cit.storageassistant.ArchiveManagerTest.setUp(ArchiveManagerTest.java:33) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:497) 
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50) 
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) 
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47) 
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24) 
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27) 
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325) 
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78) 
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57) 
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290) 
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71) 
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288) 
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58) 
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268) 
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363) 
    at org.junit.runner.JUnitCore.run(JUnitCore.java:137) 
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69) 
    at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:234) 
    at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:74) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:497) 
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144) 
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.StaticMapping 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
    ... 31 more 


Process finished with exit code 255 

These are the Hadoop dependencies in my pom.xml:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <type>test-jar</type>
    <version>${hadoop.version}</version>
    <scope>test</scope>
</dependency>

Is this even the right way to test a tool that uses hadoop, and if so, how can I get it to work? I am using the classic example code for the test:

@org.junit.Before
public void setUp() throws Exception {
//  super.setUp();
    Configuration conf = new Configuration();
//  conf.set("fs.defaultFS", "hdfs://localhost:9000");
    File baseDir = new File("./target/hdfs/").getAbsoluteFile();
    FileUtil.fullyDelete(baseDir);
    conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath());
    dfsCluster = new MiniDFSCluster.Builder(conf)
            .checkExitOnShutdown(true)
            .numDataNodes(2)
            .format(true)
            .racks(null)
            .build();
    hdfsURI = "hdfs://localhost:" + dfsCluster.getNameNodePort() + "/";
}

@org.junit.After
public void tearDown() throws Exception {
    if (dfsCluster != null) {
        dfsCluster.shutdown();
    }
}

Any help or recommendations?

Answer


I faced the same problem and was able to solve it by adding the hadoop-common:tests dependency:

<dependency> 
    <groupId>org.apache.hadoop</groupId> 
    <artifactId>hadoop-common</artifactId> 
    <version>${hadoop.version}</version> 
    <classifier>tests</classifier> 
</dependency> 

The missing class (org.apache.hadoop.net.StaticMapping) ships in hadoop-common's test artifacts and is absent from the production hadoop-common jar, so the regular dependency alone is not enough.
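
Note that the tests classifier resolves the same -tests jar as the test-jar type the question already uses for hadoop-hdfs; adding a test scope as well keeps it off the production classpath. Once both test jars are on the test classpath, the mini cluster should start, and a small smoke test can confirm it end to end. Here is a minimal sketch (the test name and file path are illustrative, and it assumes the dfsCluster field from the question's setUp()):

import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import static org.junit.Assert.assertEquals;

// Hypothetical smoke test for the same class as setUp()/tearDown();
// it uses the dfsCluster field from the question.
@org.junit.Test
public void writesAndReadsBackAFile() throws Exception {
    FileSystem fs = dfsCluster.getFileSystem(); // client bound to the mini cluster
    Path file = new Path("/smoke-test.txt");
    try (FSDataOutputStream out = fs.create(file)) {
        out.writeUTF("hello mini-dfs");
    }
    try (FSDataInputStream in = fs.open(file)) {
        assertEquals("hello mini-dfs", in.readUTF());
    }
}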
