2012-07-07

ClassCastException in Hadoop

I am using Hadoop 0.18.3 and I am running into the following error:

java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable

I have defined my mapper as:

public class HadoopMapper extends MapReduceBase implements Mapper<Text, DoubleWritable, Text, DoubleWritable> {
    // The Karmasphere Studio Workflow Log displays logging from Apache Commons Logging, for example:
    // private static final Log LOG = LogFactory.getLog("HadoopMapper");

    @Override
    public void map(Text key, DoubleWritable value, OutputCollector<Text, DoubleWritable> output, Reporter reporter)
            throws IOException {
        Random generator = new Random();
        final int iter = 100000;

        for (int i = 0; i < iter; i++) {
            double x = generator.nextDouble();
            double y = generator.nextDouble();
            double z = x * x + y * y;

            if (z <= 1) {
                output.collect(new Text("VALUE"), new DoubleWritable(1));
            } else {
                output.collect(new Text("VALUE"), new DoubleWritable(0));
            }
        }
    }
}

and my reducer class as:

public class HadoopReducer extends MapReduceBase implements Reducer<Text, DoubleWritable, Text, DoubleWritable> {
    // The Karmasphere Studio Workflow Log displays logging from Apache Commons Logging, for example:
    // private static final Log LOG = LogFactory.getLog("HadoopReducer");

    @Override
    public void reduce(Text key, Iterator<DoubleWritable> values, OutputCollector<Text, DoubleWritable> output, Reporter reporter)
            throws IOException {
        double inside = 0;
        double outside = 0;

        while (values.hasNext()) {
            if (values.next().get() == 1)
                inside++;
            else
                outside++;
        }

        double pi = (4 * inside) / (inside + outside);
        output.collect(new Text("pi"), new DoubleWritable(pi));
    }
}

and I set up the job conf as:

public static void initJobConf(JobConf conf) {
    // Generating code using Karmasphere Protocol for Hadoop 0.18
    // CG_GLOBAL

    // CG_INPUT_HIDDEN
    conf.setInputFormat(KeyValueTextInputFormat.class);

    // CG_MAPPER_HIDDEN
    conf.setMapperClass(HadoopMapper.class);
    // CG_MAPPER

    // CG_PARTITIONER_HIDDEN
    conf.setPartitionerClass(org.apache.hadoop.mapred.lib.HashPartitioner.class);
    // CG_PARTITIONER

    // CG_COMPARATOR_HIDDEN
    conf.setOutputKeyComparatorClass(org.apache.hadoop.io.Text.Comparator.class);
    // CG_COMPARATOR

    // CG_COMBINER_HIDDEN

    // CG_REDUCER_HIDDEN
    conf.setReducerClass(HadoopReducer.class);
    // CG_REDUCER
    conf.setNumReduceTasks(1);

    // CG_OUTPUT_HIDDEN
    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(DoubleWritable.class);
    // CG_OUTPUT

    // Others
}

I cannot find an InputFormat that matches my types for conf.setInputFormat(KeyValueTextInputFormat.class), so how should I handle this? Can I subclass an InputFormat? Could you help me with an example? Thanks.


So, did that do the trick? – Razvan 2012-07-08 13:11:27

Answer


KeyValueTextInputFormat expects a Text key and a Text value separated by a SEPARATOR_CHARACTER (a tab by default). You are trying to cast that Text value to DoubleWritable, which is not possible. So declare the mapper as

Mapper<Text, Text, Text, DoubleWritable>

and convert the Text value to a double yourself inside the map method. In short: modify the mapper.
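To make that concrete, here is a minimal, self-contained sketch of the corrected map logic. The `MonteCarloMapperSketch` and `Collector` names are stand-ins invented for this snippet so it compiles without Hadoop on the classpath; in the real job the signature stays `map(Text key, Text value, OutputCollector<Text, DoubleWritable> output, Reporter reporter)`. Note that this particular mapper never reads its input value, so changing the declared value type to Text is all the fix requires; if the value were actually needed as a number, `Double.parseDouble(value.toString())` would convert it.

```java
import java.util.Random;

// Plain-Java stand-ins for the Hadoop 0.18 types, so the sketch runs on its own.
class MonteCarloMapperSketch {
    interface Collector {                 // stand-in for OutputCollector<Text, DoubleWritable>
        void collect(String key, double value);
    }

    // Corrected map(): the input value arrives as Text (a String here), not DoubleWritable.
    // The mapper ignores it and samples random points in the unit square.
    static void map(String key, String value, Collector output) {
        Random generator = new Random();
        final int iter = 100000;
        for (int i = 0; i < iter; i++) {
            double x = generator.nextDouble();
            double y = generator.nextDouble();
            double z = x * x + y * y;
            // Emit 1 if the point falls inside the quarter circle, 0 otherwise.
            output.collect("VALUE", z <= 1 ? 1.0 : 0.0);
        }
    }
}
```

Roughly pi/4 (about 78.5%) of the emitted values should be 1, which is the fraction the reducer turns into the pi estimate.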


You should also handle the output format in a similar way! – Razvan 2012-07-08 00:01:41
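Following up on that comment, here is a sketch of the corresponding conf lines (old `org.apache.hadoop.mapred` API from 0.18). It assumes the mapper has already been re-declared as `Mapper<Text, Text, Text, DoubleWritable>` as the answer suggests; this is a configuration fragment, not a complete program.

```java
// KeyValueTextInputFormat splits each input line at the first tab
// into a Text key and a Text value.
conf.setInputFormat(KeyValueTextInputFormat.class);

// TextOutputFormat writes plain "key <tab> value" lines, which matches
// the (Text, DoubleWritable) pairs the reducer emits.
conf.setOutputFormat(TextOutputFormat.class);
conf.setOutputKeyClass(Text.class);
conf.setOutputValueClass(DoubleWritable.class);
```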