I'm getting this exception when trying to run my first program on Hadoop. (I'm using the new Hadoop API on version 0.20.2.) Searching online, it looks like most people hit this problem when they don't set the MapperClass and ReducerClass in the configuration logic. But I checked, and my code looks fine. I'd appreciate it if someone could help me. Getting an exception in the WordCount program in Hadoop:
java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, received org.apache.hadoop.io.LongWritable
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:871)
package com.test.wc;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    public void Map(LongWritable key, Text value, Context ctx) throws IOException, InterruptedException {
        String line = value.toString();
        for (String word : line.split("\\W+")) {
            if (word.length() > 0) {
                ctx.write(new Text(word), new IntWritable(1));
            }
        }
    }
}
package com.test.wc;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    public void reduce(Text key, Iterable<IntWritable> values, Context ctx) throws IOException, InterruptedException {
        int wordCount = 0;
        for (IntWritable value : values) {
            wordCount += value.get();
        }
        ctx.write(key, new IntWritable(wordCount));
    }
}
package com.test.wc;

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountJob {

    public static void main(String args[]) throws IOException, InterruptedException, ClassNotFoundException {
        if (args.length != 2) {
            System.out.println("invalid usage");
            System.exit(-1);
        }

        Job job = new Job();
        job.setJarByClass(WordCountJob.class);
        job.setJobName("WordCountJob");

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setMapperClass(WordCountMapper.class);
        job.setReducerClass(WordCountReducer.class);
        //job.setCombinerClass(WordCountReducer.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
Have you tried using the @Override annotation? Your 'map()' method has a capital 'M', which probably causes the default 'map()' to be used instead of your version. – Quetzalcoatl
@Quetzalcoatl's comment is the problem you're hitting — the default map method is an identity function and will output the same input key/value pairs. Change your map method name to lowercase and add an '@Override' annotation to the method. –
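The mechanism behind this bug can be demonstrated in plain Java, with no Hadoop needed: a method whose name differs only in case ('Map' vs 'map') is a brand-new method, not an override, so the framework keeps dispatching to the parent's identity implementation. The classes below ('BaseMapper', 'BrokenMapper', 'FixedMapper') are illustrative stand-ins, not Hadoop classes.

```java
// Simplified stand-in for Hadoop's Mapper: run() dispatches to map().
class BaseMapper {
    protected String map(String value) {
        return value; // default is an identity function, like Hadoop's Mapper.map()
    }

    public String run(String value) {
        return map(value); // always calls map(), never a method named Map()
    }
}

// Capital-M "Map" is a NEW method, not an override of map(),
// so run() still invokes the identity default above.
class BrokenMapper extends BaseMapper {
    public String Map(String value) {
        return value.toUpperCase();
    }
}

// Lowercase map() with @Override actually replaces the default.
// @Override makes the compiler reject the capital-M mistake outright.
class FixedMapper extends BaseMapper {
    @Override
    protected String map(String value) {
        return value.toUpperCase();
    }
}

public class OverridePitfall {
    public static void main(String[] args) {
        System.out.println(new BrokenMapper().run("word")); // identity ran: prints "word"
        System.out.println(new FixedMapper().run("word"));  // override ran: prints "WORD"
    }
}
```

This is exactly why the exception complains about receiving a LongWritable key: the inherited identity 'map()' passes the input key (a LongWritable offset) straight through, while the job declared Text as the map output key class. Annotating the method with '@Override' turns this silent logic error into a compile-time error.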