
Spark Java - why can't Tuple2 be the parameter of the function in map?

I am new to Spark. I wrote the Java code below:

JavaPairRDD<Detection, Detection> allCombinations = currentLevelNodes.cartesian(nextLevelNodes); 

allCombinations.map(new Function<Tuple2<Detection, Detection>, Segment>(){
    public Segment call(Tuple2<Detection, Detection> combination){
        Segment segment = new Segment();
        Detection a = combination._1();
        Detection b = combination._2();
        segment.distance = Math.sqrt(Math.pow((a.x)-(b.x), 2) + Math.pow((a.y)-(b.y), 2));

        return segment;
    }
});


The IDE (Eclipse Neon) shows the following messages:

Multiple markers at this line 
- The method map(Function<Tuple2<Detection,Detection>,R>) in the type 
AbstractJavaRDDLike<Tuple2<Detection,Detection>,JavaPairRDD<Detection,Detection>> is not applicable for the arguments (new 
Function<Tuple2<Detection,Detection>,Segment>(){}) 
- The type new Function<Tuple2<Detection,Detection>,Segment>(){} must implement the inherited abstract method 
Function<Tuple2<Detection,Detection>,Segment>.apply(Tuple2<Detection,Detection>) 


How can I fix this problem?


The error says you are implementing call(), but should be implementing apply(). Using a lambda will make your life easier – Jack


Also, are you sure you can use Tuple2 rather than JavaPairRDD as the parameter of the map function? – Jack


I tried changing the function name "call" to "apply", but it still shows the first message. – lucienlo
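
For context, the second compiler message points at the likely root cause: the map() of a JavaPairRDD expects an org.apache.spark.api.java.function.Function, whose abstract method is call(), while the demand to implement apply() suggests the anonymous class is being compiled against java.util.function.Function (a wrong import), so renaming the method alone cannot fix the mismatch. A minimal sketch with the Spark import, reusing the Detection and Segment types from the question:

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;  // Spark's Function, not java.util.function.Function
import scala.Tuple2;

// ...

JavaRDD<Segment> segments = allCombinations.map(
        new Function<Tuple2<Detection, Detection>, Segment>() {
            @Override
            public Segment call(Tuple2<Detection, Detection> combination) {
                // Spark's Function declares call(), so this override now matches.
                Segment segment = new Segment();
                Detection a = combination._1();
                Detection b = combination._2();
                segment.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
                return segment;
            }
        });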

Answer


I used a lambda expression and solved the problem.

Here is my sample code:

JavaPairRDD<Detection,Detection> allCombinations = currentLevelNodes.cartesian(nextLevelNodes); 

JavaRDD<Segment> segmentRDD = allCombinations.map(tuple -> new Segment(tuple._1, tuple._2)); 


And I added this constructor:

public Segment(Detection a, Detection b){ 
    this.distance = Math.sqrt(Math.pow(a.x-b.x, 2)+Math.pow(a.y-b.y, 2)); 
} 


And it works.
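
One more point worth noting: with Spark's default Java serialization, objects stored in an RDD, such as Detection and Segment here, generally have to be serializable, otherwise the job fails at runtime with a NotSerializableException once they are shipped between executors. A minimal sketch of the two classes as assumed above; only the x, y and distance fields and the (Detection, Detection) constructor come from the original post, the rest is an assumption:

import java.io.Serializable;

// Minimal sketch; in a real project each class would live in its own file.
class Detection implements Serializable {
    public double x;
    public double y;
}

class Segment implements Serializable {
    public double distance;

    public Segment() { }

    public Segment(Detection a, Detection b) {
        // Euclidean distance between the two detections.
        this.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
    }
}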


Hope this answer helps people like me – lucienlo
