2016-03-07

I don't understand what the cleanup method in Hadoop actually does, or how it works. I have the following Map-Reduce code to compute the maximum, minimum, and mean of a set of numbers. What does the cleanup(Context) method do?

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class Statistics
{
    public static class Map extends Mapper<LongWritable, Text, Text, Text>
    {
        // Running statistics accumulated across all map() calls of this task
        private double min, max, linear_sum, quadratic_sum;
        private long count;

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
        {
            /* code to calculate min, max, and mean from among a bunch of numbers */
        }

        @Override
        public void cleanup(Context context) throws IOException, InterruptedException
        {
            Text key_min = new Text("min");
            Text value_min = new Text(String.valueOf(min));
            context.write(key_min, value_min);

            Text key_max = new Text("max");
            Text value_max = new Text(String.valueOf(max));
            context.write(key_max, value_max);

            Text key_avg = new Text("avg");
            Text value_avg = new Text(linear_sum + "," + count);
            context.write(key_avg, value_avg);

            Text key_stddev = new Text("stddev");
            Text value_stddev = new Text(linear_sum + "," + count + "," + quadratic_sum);
            context.write(key_stddev, value_stddev);
        }
    }

    public static class Reduce extends Reducer<Text, Text, Text, Text>
    {
        @Override
        public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException
        {
            /* code to further find min, max and mean from among the outputs of different mappers */
        }
    }

    public static void main(String[] args) throws Exception
    {
        /* driver program */
    }
}

So what exactly does the cleanup(Context context) method do? My assumption is that it collects the output (key, value) pairs from the mappers and passes them on to the reducer. On other sites I've read that the order of execution in MapReduce is: setup -> map -> cleanup, and then setup -> reduce -> cleanup. Why doesn't this program use the setup method?
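To illustrate that lifecycle, here is a minimal sketch with no Hadoop dependency that mimics the structure of Hadoop's Mapper.run(): setup() is called once before any input, map() once per record, and cleanup() once after the last record, which is why the code above can emit its aggregated (min/max/avg) pairs there. The class name SketchMapper and the sample numbers are illustrative, not part of the Hadoop API.

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the Mapper task lifecycle: setup -> map (per record) -> cleanup.
public class SketchMapper {
    // Running statistics, analogous to min/max/linear_sum/count in the question
    long min;
    long max;
    long linearSum;
    long count;

    // Called once before any map() call; a natural place to initialize state
    void setup() {
        min = Long.MAX_VALUE;
        max = Long.MIN_VALUE;
        linearSum = 0;
        count = 0;
    }

    // Called once per input record, updating the running statistics
    void map(long value) {
        min = Math.min(min, value);
        max = Math.max(max, value);
        linearSum += value;
        count++;
    }

    // Called once after the last map(); in Hadoop this is where the question's
    // code performs its context.write() calls for the aggregated results
    void cleanup() {
        System.out.println("min=" + min + " max=" + max
                + " avg=" + (double) linearSum / count);
    }

    // Mirrors how Hadoop's Mapper.run() drives the three phases
    void run(List<Long> records) {
        setup();
        for (long v : records) {
            map(v);
        }
        cleanup();
    }

    public static void main(String[] args) {
        new SketchMapper().run(Arrays.asList(4L, 7L, 1L, 9L));
    }
}
```

Note that the program in the question happens not to override setup() because its fields can be initialized at declaration time; overriding setup() is only necessary when initialization needs the Context (e.g. to read job configuration).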
