2014-11-06 54 views

I have looked through a lot of blog posts on this problem, but with no luck. I'm not sure what I'm doing wrong here. Can someone help me fix this? The job cannot find my classes: "No job jar file set. User classes may not be found."

My program is:

package hadoopbook; 

import java.io.IOException; 

import org.apache.hadoop.conf.Configuration; 
import org.apache.hadoop.fs.Path; 
import org.apache.hadoop.io.IntWritable; 
import org.apache.hadoop.io.LongWritable; 
import org.apache.hadoop.io.Text; 
import org.apache.hadoop.mapreduce.Job; 
import org.apache.hadoop.mapreduce.Mapper; 
import org.apache.hadoop.mapreduce.Reducer; 
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat; 
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat; 


public class WordCount {

    // Mapper: emits (word, 1) for every comma-separated token in a line
    public static class WcMapperDemo extends Mapper<LongWritable, Text, Text, IntWritable> {

        private final Text mapKey = new Text();
        private final IntWritable mapValue = new IntWritable(1);

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            String record = value.toString();
            String[] words = record.split(",");

            for (String word : words) {
                mapKey.set(word);
                context.write(mapKey, mapValue);
            }
        }
    }

    // Reducer: sums the counts emitted for each word
    public static class WcReducerDemo extends Reducer<Text, IntWritable, Text, IntWritable> {

        private final IntWritable redValue = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            redValue.set(sum);
            context.write(key, redValue);
        }
    }

    // Driver
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "Word Count Job");

        job.setJarByClass(WordCount.class);
        job.setMapperClass(WcMapperDemo.class);
        job.setReducerClass(WcReducerDemo.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
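For reference, the nested static classes above compile to class files named like "hadoopbook/WordCount$WcMapperDemo.class", and that exact entry must exist inside the jar passed to "hadoop jar" — the ClassNotFoundException in the log below complains about precisely this entry. A minimal, self-contained sketch (not part of the original question; it builds a throwaway jar rather than the real one) showing how to list a jar's entries with java.util.jar, which is the same check "jar tf WordCount.jar" performs:

```java
import java.io.FileOutputStream;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarCheck {
    public static void main(String[] args) throws Exception {
        String jarPath = "demo-wordcount.jar";

        // Build a throwaway jar containing the entry names a correct
        // build of the WordCount job would contain.
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jarPath))) {
            String[] expected = {
                "hadoopbook/WordCount.class",
                "hadoopbook/WordCount$WcMapperDemo.class",
                "hadoopbook/WordCount$WcReducerDemo.class"
            };
            for (String name : expected) {
                out.putNextEntry(new JarEntry(name));
                out.closeEntry();
            }
        }

        // List the jar back. A truncated or mis-built jar would throw
        // java.util.zip.ZipException here, matching the question's trace.
        try (JarFile jar = new JarFile(jarPath)) {
            for (Enumeration<JarEntry> e = jar.entries(); e.hasMoreElements();) {
                System.out.println(e.nextElement().getName());
            }
        }
    }
}
```

If the inner-class entries are missing from the listing of the real WordCount.jar, the jar was exported without the compiled classes and the tasks will fail exactly as shown in the log.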

And this is the error I get when executing the following command:

[[email protected] Desktop]$ sudo -u hdfs hadoop jar WordCount.jar hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output 

I do have the WordCount.jar file in my Desktop folder!

[[email protected] Desktop]$ sudo -u hdfs hadoop jar WordCount.jar hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output 
14/11/06 02:56:15 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 
14/11/06 02:56:15 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String). 
14/11/06 02:56:15 INFO input.FileInputFormat: Total input paths to process : 1 
14/11/06 02:56:16 INFO mapred.JobClient: Running job: job_201411040035_0017 
14/11/06 02:56:17 INFO mapred.JobClient: map 0% reduce 0% 
14/11/06 02:56:29 INFO mapred.JobClient: Task Id : attempt_201411040035_0017_m_000000_0, Status : FAILED 
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1617) 
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:631) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330) 
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:396) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408) 
    at org.apache.hadoop.mapred.Child.main(Child.java:262) 
Caused by: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found 
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1523) 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1615) 
    ... 8 more 

14/11/06 02:56:36 INFO mapred.JobClient: Task Id : attempt_201411040035_0017_m_000000_1, Status : FAILED 
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1617) 
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:631) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330) 
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:396) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408) 
    at org.apache.hadoop.mapred.Child.main(Child.java:262) 
Caused by: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found 
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1523) 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1615) 
    ... 8 more 

14/11/06 02:56:42 INFO mapred.JobClient: Task Id : attempt_201411040035_0017_m_000000_2, Status : FAILED 
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1617) 
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:631) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330) 
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:396) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408) 
    at org.apache.hadoop.mapred.Child.main(Child.java:262) 
Caused by: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found 
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1523) 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1615) 
    ... 8 more 

14/11/06 02:56:54 INFO mapred.JobClient: Job complete: job_201411040035_0017 
14/11/06 02:56:54 INFO mapred.JobClient: Counters: 7 
14/11/06 02:56:54 INFO mapred.JobClient: Job Counters 
14/11/06 02:56:54 INFO mapred.JobClient:  Failed map tasks=1 
14/11/06 02:56:54 INFO mapred.JobClient:  Launched map tasks=4 
14/11/06 02:56:54 INFO mapred.JobClient:  Data-local map tasks=4 
14/11/06 02:56:54 INFO mapred.JobClient:  Total time spent by all maps in occupied slots (ms)=33819 
14/11/06 02:56:54 INFO mapred.JobClient:  Total time spent by all reduces in occupied slots (ms)=0 
14/11/06 02:56:54 INFO mapred.JobClient:  Total time spent by all maps waiting after reserving slots (ms)=0 
14/11/06 02:56:54 INFO mapred.JobClient:  Total time spent by all reduces waiting after reserving slots (ms)=0 

I just tried this as well, but no luck! I have made sure my file is on the Desktop:

[[email protected] ~]$ sudo -u hdfs hadoop jar /home/cloudera/Desktop/WordCount.jar hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output 
Exception in thread "main" java.io.IOException: Error opening job jar: /home/cloudera/Desktop/WordCount.jar 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:135) 
Caused by: java.util.zip.ZipException: error in opening zip file 
    at java.util.zip.ZipFile.open(Native Method) 
    at java.util.zip.ZipFile.<init>(ZipFile.java:127) 
    at java.util.jar.JarFile.<init>(JarFile.java:135) 
    at java.util.jar.JarFile.<init>(JarFile.java:72) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:133) 
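A plausible cause of this ZipException (an editor's note, not from the original thread): "sudo -u hdfs" runs the command as the hdfs user, and home directories such as /home/cloudera are typically mode 700, so hdfs cannot open anything under Desktop — Hadoop then surfaces the unreadable file as "error opening zip file". A small sketch of the effect and the usual workaround (the paths mirror the question's and are assumed):

```shell
# Simulate a mode-700 home directory: other users cannot enter it,
# so a jar inside it is unreadable to them.
mkdir -p /tmp/demo_home && chmod 700 /tmp/demo_home
echo 'jar-bytes' > /tmp/demo_home/WordCount.jar
ls -ld /tmp/demo_home   # drwx------ : only the owner can traverse

# Usual workaround: stage the jar somewhere world-readable before
# invoking "sudo -u hdfs hadoop jar ...".
cp /tmp/demo_home/WordCount.jar /tmp/WordCount.jar
chmod a+r /tmp/WordCount.jar
ls -l /tmp/WordCount.jar
```

The same permission problem would also explain the copyFromLocal "No such file or directory" failure mentioned in the comments below, since hdfs cannot see the source file at all.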

Answer

Try running it with the following command:

>hadoop jar <<absolute path to WordCount.jar>> hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output 

Hi Vijay! Just tried that, but no luck! It gives me another error. I have added the error details to my question above. Can you help me? – 2014-11-07 04:46:04


Maybe the following links will be useful: http://stackoverflow.com/questions/19093327/error-while-trying-to-run-jar-in-hadoop and http://stackoverflow.com/questions/17788865/exception-in-thread-main-java-io-ioexception-error-opening-job-jar-ex-jar-in – 2014-11-07 04:54:02


Tried that, Vijay! No luck! I also cannot copy the jar to HDFS: running "sudo -u hdfs hadoop fs -copyFromLocal /home/cloudera/Desktop/WordCount.jar /user/cloudera/Programs/WordCount.jar" fails with "copyFromLocal: '/home/cloudera/Desktop/WordCount.jar': No such file or directory" – 2014-11-07 06:45:40
