
My code produces the warning mapred.JobClient: No job jar file set. User classes may not be found.

import java.io.IOException; 
import java.util.*; 

import org.apache.hadoop.fs.Path; 
import org.apache.hadoop.conf.*; 
import org.apache.hadoop.io.*; 
import org.apache.hadoop.mapreduce.*; 
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat; 
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat; 
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat; 
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat; 

public class word_count_new { 

    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> { 
        private final static IntWritable one = new IntWritable(1); 
        private Text word = new Text(); 

        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException { 
            String line = value.toString(); 
            StringTokenizer tokenizer = new StringTokenizer(line); 
            while (tokenizer.hasMoreTokens()) { 
                word.set(tokenizer.nextToken()); 
                context.write(word, one); 
            } 
        } 
    } 

    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> { 

        public void reduce(Text key, Iterable<IntWritable> values, Context context) 
                throws IOException, InterruptedException { 
            int sum = 0; 
            for (IntWritable val : values) { 
                sum += val.get(); 
            } 
            context.write(key, new IntWritable(sum)); 
        } 
    } 

    public static void main(String[] args) throws Exception { 
        Configuration conf = new Configuration(); 

        Job job = new Job(conf, "wordcount"); 
        // job.setJarByClass(word_count_new.class); 
        // conf.setJar(word_count_new.jar); 
        job.setOutputKeyClass(Text.class); 
        job.setOutputValueClass(IntWritable.class); 

        job.setMapperClass(Map.class); 
        job.setReducerClass(Reduce.class); 

        job.setInputFormatClass(TextInputFormat.class); 
        job.setOutputFormatClass(TextOutputFormat.class); 

        FileInputFormat.addInputPath(job, new Path(args[0])); 
        FileOutputFormat.setOutputPath(job, new Path(args[1])); 
        job.setJarByClass(word_count_new.class); 
        job.waitForCompletion(true); 
    } 
} 
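For what it's worth, the commented-out conf.setJar(word_count_new.jar) line above would not compile as written: Configuration has no setJar method, and word_count_new.jar is not a Java expression. The warning itself points at JobConf(Class) or JobConf#setJar(String) as the ways the job jar gets attached. A minimal sketch of setting the jar path explicitly with the new-API Job (this assumes word_count_new.jar sits in the directory the job is launched from; JobConf#setJar simply stores the path under the mapred.jar property):

    Configuration conf = new Configuration(); 
    // equivalent to JobConf#setJar("word_count_new.jar"): the path is stored 
    // under the "mapred.jar" property, which the job client reads on submit 
    conf.set("mapred.jar", "word_count_new.jar");   // assumed to be in the working directory 

    Job job = new Job(conf, "wordcount"); 
    // ... rest of the job setup as in the driver above 

Note that setJarByClass(word_count_new.class), which the driver already calls, can only record a jar path when the class it is given was itself loaded from a jar; if the class is picked up from loose .class files on the classpath instead, no jar is recorded and this warning appears.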

Below are the class files and the jar:

-rw-r----- 1 ps993w hyhdev 2236 Apr 3 13:56 word_count_new.java 
-rw-r----- 1 ps993w hyhdev 1870 Apr 3 13:58 word_count_new$Map.class 
-rw-r----- 1 ps993w hyhdev 1638 Apr 3 13:58 word_count_new$Reduce.class 
-rw-r----- 1 ps993w hyhdev 1510 Apr 3 13:58 word_count_new.class 
-rw-r----- 1 ps993w hyhdev 2990 Apr 3 13:58 word_count_new.jar 

and the error:

[[email protected] ~]$ hadoop jar word_count_new.jar word_count_new /user/ps993w/indata/input_line.dat /user/ps993w/wordcount/ 
14/04/03 15:53:13 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 
14/04/03 15:53:13 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String). 
14/04/03 15:53:13 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 105404 for ps993w on 130.4.240.48:8020 
14/04/03 15:53:13 INFO security.TokenCache: Got dt for hdfs://hltd410.hydc.sbc.com:8020/user/ps993w/.staging/job_201402241341_9518;uri=130.4.240.48:8020;t.service=130.4.240.48:8020 
14/04/03 15:53:13 INFO input.FileInputFormat: Total input paths to process : 1 
14/04/03 15:53:13 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library 
14/04/03 15:53:13 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3] 
14/04/03 15:53:13 WARN snappy.LoadSnappy: Snappy native library is available 
14/04/03 15:53:13 INFO util.NativeCodeLoader: Loaded the native-hadoop library 
14/04/03 15:53:13 INFO snappy.LoadSnappy: Snappy native library loaded 
14/04/03 15:53:13 INFO mapred.JobClient: Running job: job_201402241341_9518 
14/04/03 15:53:14 INFO mapred.JobClient: map 0% reduce 0% 
14/04/03 15:53:24 INFO mapred.JobClient: Task Id : attempt_201402241341_9518_m_000000_0, Status : FAILED 
java.lang.RuntimeException: java.lang.ClassNotFoundException: word_count_new$Map 
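
Aside: the first warning in the log ("Use GenericOptionsParser for parsing the arguments. Applications should implement Tool") refers to the standard Tool/ToolRunner driver pattern. A minimal sketch of that pattern for this job; the class name WordCountDriver is only illustrative, and it reuses the Map and Reduce classes posted above:

    import org.apache.hadoop.conf.Configuration; 
    import org.apache.hadoop.conf.Configured; 
    import org.apache.hadoop.fs.Path; 
    import org.apache.hadoop.io.IntWritable; 
    import org.apache.hadoop.io.Text; 
    import org.apache.hadoop.mapreduce.Job; 
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat; 
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat; 
    import org.apache.hadoop.util.Tool; 
    import org.apache.hadoop.util.ToolRunner; 

    public class WordCountDriver extends Configured implements Tool { 

        @Override 
        public int run(String[] args) throws Exception { 
            // getConf() already holds anything parsed from -D, -libjars, etc. 
            Job job = new Job(getConf(), "wordcount"); 
            job.setJarByClass(WordCountDriver.class); 

            job.setMapperClass(word_count_new.Map.class); 
            job.setReducerClass(word_count_new.Reduce.class); 
            job.setOutputKeyClass(Text.class); 
            job.setOutputValueClass(IntWritable.class); 

            FileInputFormat.addInputPath(job, new Path(args[0])); 
            FileOutputFormat.setOutputPath(job, new Path(args[1])); 

            return job.waitForCompletion(true) ? 0 : 1; 
        } 

        public static void main(String[] args) throws Exception { 
            System.exit(ToolRunner.run(new Configuration(), new WordCountDriver(), args)); 
        } 
    } 

This does not by itself change how the job jar is located, but it does make the generic -D and -libjars options work as documented.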

Please advise.
