2017-03-03

Play WSClient NoSuchMethodError with Spark 2.2 snapshot

I'm using the Play WsClient to send requests to a Spray server endpoint that fronts a Spark driver. The call in question is here:

    def serializeDataset(requestUrl: String, recipe: Recipe): Future[(Option[String], String, Int)] = {
      ws.url(requestUrl).post(Json.toJson(recipe)).map { response =>
        val code = (response.json \ "code").as[Int]
        code match {
          case OK => ((response.json \ "uuid").asOpt[String], (response.json \ "schema").as[String], code)
          case _  => ((response.json \ "message").asOpt[String], "", code)
        }
      }
    }

At execution time, I get this error:

Caused by: java.lang.NoSuchMethodError: io.netty.util.internal.PlatformDependent.newAtomicIntegerFieldUpdater(Ljava/lang/Class;Ljava/lang/String;)Ljava/util/concurrent/atomic/AtomicIntegerFieldUpdater; 
    at org.asynchttpclient.netty.NettyResponseFuture.<clinit>(NettyResponseFuture.java:52) 
    at org.asynchttpclient.netty.request.NettyRequestSender.newNettyResponseFuture(NettyRequestSender.java:311) 
    at org.asynchttpclient.netty.request.NettyRequestSender.newNettyRequestAndResponseFuture(NettyRequestSender.java:193) 
    at org.asynchttpclient.netty.request.NettyRequestSender.sendRequestWithCertainForceConnect(NettyRequestSender.java:129) 
    at org.asynchttpclient.netty.request.NettyRequestSender.sendRequest(NettyRequestSender.java:107) 
    at org.asynchttpclient.DefaultAsyncHttpClient.execute(DefaultAsyncHttpClient.java:216) 
    at org.asynchttpclient.DefaultAsyncHttpClient.executeRequest(DefaultAsyncHttpClient.java:184) 
    at play.api.libs.ws.ahc.AhcWSClient.executeRequest(AhcWS.scala:45) 
    at play.api.libs.ws.ahc.AhcWSRequest$.execute(AhcWS.scala:90) 
    at play.api.libs.ws.ahc.AhcWSRequest$$anon$2.execute(AhcWS.scala:166) 
    at play.api.libs.ws.ahc.AhcWSRequest.execute(AhcWS.scala:168) 
    at play.api.libs.ws.WSRequest$class.post(WS.scala:510) 
    at play.api.libs.ws.ahc.AhcWSRequest.post(AhcWS.scala:107) 
    at webservices.DataFrameService.serializeDataset(DataFrameService.scala:36) 

It looks like the WSClient is picking up a version of Netty that does not include the relevant method.
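One way to confirm which Netty jar is actually being loaded is to ask the JVM where a class came from. The sketch below is a hypothetical helper (`WhichJar` is not part of the project); in the real app you would query `"io.netty.util.internal.PlatformDependent"`, but the demo uses `scala.Option` so it runs anywhere:

```scala
object WhichJar {
  // Returns the jar (or directory) that a class was loaded from, if known.
  def locationOf(className: String): String = {
    val cls = Class.forName(className)
    Option(cls.getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
      .getOrElse("(bootstrap classloader)")
  }

  def main(args: Array[String]): Unit = {
    // In the failing application, substitute the Netty class from the stack trace:
    //   WhichJar.locationOf("io.netty.util.internal.PlatformDependent")
    println(WhichJar.locationOf("scala.Option"))
  }
}
```

If the printed path points at an old Netty 3.x or mismatched 4.x jar pulled in transitively, that would explain the `NoSuchMethodError`.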

This problem occurs when I compile the application against the 2.2-SNAPSHOT version of Spark, but it does not occur when compiling against version 2.1. I don't know why this change would make a difference. The Spark driver is a separate project in my sbt build.

I suspect this has to do with how the application and its dependencies are packaged. Here is what I have already tried in sbt to rectify it:

  1. Added an explicit dependency on ("io.netty" % "netty-all" % "4.0.43.Final")
  2. Added exclusion statements to the Spark imports, like so:

    "org.apache.spark" %% "spark-sql" % sparkV exclude("org.jboss.netty","netty") exclude("io.netty","netty") 
    
    "org.apache.spark" %% "spark-core" % sparkV exclude("org.jboss.netty","netty") exclude("io.netty","netty") 
    
    "org.apache.spark" %% "spark-mllib" % sparkV exclude("org.scalamacros", "quasiquotes") exclude("org.jboss.netty","netty") exclude("io.netty","netty") 
    
    "org.apache.spark" %% "spark-hive" % sparkV exclude("org.scalamacros", "quasiquotes") exclude("org.jboss.netty","netty") exclude("io.netty","netty") 
    
  3. Changed the order in which the play-ws module is added to the project dependencies (moved it to the end, then to the beginning)
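Besides exclusions, sbt can also force one version to win resolution outright with `dependencyOverrides`. A minimal sketch for the build file (the exact Netty artifacts Spark 2.2 pulls in, and the version number, are assumptions to verify against your own dependency report):

```scala
// build.sbt — force a single Netty version across the whole build so that
// async-http-client finds the PlatformDependent method it was compiled against.
dependencyOverrides += "io.netty" % "netty-all" % "4.0.43.Final"

// Spark may also bring in the split Netty 4 artifacts; overriding those
// individually may be needed as well:
dependencyOverrides ++= Seq(
  "io.netty" % "netty-handler" % "4.0.43.Final",
  "io.netty" % "netty-common"  % "4.0.43.Final"
)
```

Unlike `exclude`, an override keeps the dependency but pins its version, which avoids accidentally removing a jar that Spark itself needs at runtime.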

Any help is much appreciated.

Answer

On further review, I found that there was a dependency on the Spark libraries within the Play project itself. I removed it, and it seems to work.
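To spot a stray dependency like this, sbt's built-in eviction report (and, with the sbt-dependency-graph plugin, a full tree) shows which project drags in which Netty version. A hypothetical session (`myPlayApp` is a placeholder project name):

```shell
# Built into sbt: lists library versions that were evicted during resolution
sbt "myPlayApp/evicted"

# With the sbt-dependency-graph plugin enabled, show the full tree and
# filter for the conflicting library:
sbt "myPlayApp/dependencyTree" | grep -i netty
```

Seeing both a Netty 3.x and a Netty 4.x entry, or two conflicting 4.x versions, under the Play project would point straight at the offending transitive dependency.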