Spark Java - why can't Tuple2 be the parameter of the function passed to map?

I am new to Spark. I wrote the following Java code:
JavaPairRDD<Detection, Detection> allCombinations = currentLevelNodes.cartesian(nextLevelNodes);
allCombinations.map(new Function<Tuple2<Detection, Detection>, Segment>() {
    public Segment call(Tuple2<Detection, Detection> combination) {
        Segment segment = new Segment();
        Detection a = combination._1();
        Detection b = combination._2();
        segment.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
        return segment;
    }
});
My IDE (Eclipse Neon) shows the following message:
Multiple markers at this line
- The method map(Function<Tuple2<Detection,Detection>,R>) in the type
AbstractJavaRDDLike<Tuple2<Detection,Detection>,JavaPairRDD<Detection,Detection>> is not applicable for the arguments (new
Function<Tuple2<Detection,Detection>,Segment>(){})
- The type new Function<Tuple2<Detection,Detection>,Segment>(){} must implement the inherited abstract method
Function<Tuple2<Detection,Detection>,Segment>.apply(Tuple2<Detection,Detection>)
How can I fix this problem?
The error says you are implementing call(), but you should implement apply(). Using a lambda will make your life easier. – Jack

Also, are you sure you can use Tuple2, rather than JavaPairRDD, as the parameter of the map function? – Jack
I tried renaming the method from "call" to "apply", but it still shows the first message. – lucienlo
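One likely reading of the error: the compiler is demanding apply(), which is the method declared by interfaces such as java.util.function.Function or Guava's com.google.common.base.Function, whereas Spark's org.apache.spark.api.java.function.Function declares call(). That suggests the wrong Function was imported in the file above. Independently of the Spark wiring, the mapper body can be checked in isolation; the Detection and Segment classes below are minimal stand-ins that assume only the x, y, and distance fields visible in the question's snippet:

```java
// Stand-in for the Detection class in the question; only x and y are assumed.
class Detection {
    double x, y;
    Detection(double x, double y) { this.x = x; this.y = y; }
}

// Stand-in for the Segment class; only the distance field is assumed.
class Segment {
    double distance;
}

class SegmentBuilder {
    // Same formula as the call() body in the question:
    // Euclidean distance between the two detections of a pair.
    static Segment fromPair(Detection a, Detection b) {
        Segment segment = new Segment();
        segment.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
        return segment;
    }
}
```

With the correct Spark import in place, the anonymous class (or a lambda, as Jack suggests) would simply delegate to logic like this inside call().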