
MaxTemperature example of MapReduce on Hadoop

I want to run the MaxTemperature example with MapReduce, but I cannot find MaxTemperature.jar among the Hadoop MapReduce examples. Can someone help me find the jar file, or show me a way to run this program and see what the output looks like?


Possible duplicate of http://stackoverflow.com/questions/19064300/mapreduce-java-program-to-calaculate-max-temperature-not-starting-to-run-it-is-r — there you have all the code you need to build the jar yourself. You will not get this jar with the Hadoop installation. – SMA 2014-11-02 12:05:29

Answer

Try this program:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Temp {

    // Mapper: emits (year, temperature) for every input line.
    // It expects fixed-width lines: the year in columns 0-3 and a
    // two-digit temperature starting at column 6.
    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            String year = line.substring(0, 4);
            int temp = Integer.parseInt(line.substring(6, 8));
            context.write(new Text(year), new IntWritable(temp));
        }
    }

    // Reducer (also used as combiner): keeps the maximum temperature per year.
    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int maxValue = Integer.MIN_VALUE;
            for (IntWritable value : values) {
                maxValue = Math.max(maxValue, value.get());
            }
            context.write(key, new IntWritable(maxValue));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "Temp");
        job.setJarByClass(Temp.class);

        job.setMapperClass(Map.class);
        job.setCombinerClass(Reduce.class);
        job.setReducerClass(Reduce.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
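
The mapper parses fixed character positions, so this job only works on input shaped like the sample below: the year in the first four characters and a two-digit temperature starting at offset 6 (a simplified format, not the full NCDC record layout). A small test file, say sample.txt, could look like:

1950  22
1950  34
1951  15
1951  41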

Build the jar file and run:

hadoop jar Temp.jar Temp /hdfs_inputFile /hdfs_outputDir 
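
If you are compiling from the command line rather than an IDE, something along these lines should work on a Hadoop 2.x install where the hadoop command is on your PATH (the file and HDFS path names here are only placeholders):

mkdir -p classes 
javac -classpath "$(hadoop classpath)" -d classes Temp.java 
jar cvf Temp.jar -C classes . 

hdfs dfs -put sample.txt /hdfs_inputFile 
hadoop jar Temp.jar Temp /hdfs_inputFile /hdfs_outputDir 
hdfs dfs -cat /hdfs_outputDir/part-r-00000 

The hadoop classpath command prints the jars needed on the compile classpath, and the job's result normally ends up in part-r-00000 under the output directory, which must not already exist when the job starts.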

Sorry, I already have the source code, but I want to know what classpath I should set to compile this program and build the jar – geetha 2014-11-05 04:44:39


Are you using an IDE, or writing the program in Notepad? – 2014-11-05 04:55:03


The program is in Notepad – geetha 2014-11-08 00:48:32