2015-12-22

I am new to Sparkling Water. I have been trying to develop a project with it in IntelliJ, but couldn't, and I can't find many resources about this on the internet. So could anyone please explain how to develop a simple project using H2O and Scala in IntelliJ? How do I set up H2O Sparkling Water with Scala in IntelliJ?

I tried this code:

import org.apache.spark.h2o._
import org.apache.spark.{h2o, SparkConf, SparkContext}
import water.fvec._

object test {
  def main(args: Array[String]) {

    val conf = new SparkConf().setMaster("local[*]").setAppName("testing")
    val sc = new SparkContext(conf)

    // Load the CSV data from the classpath
    val source = getClass.getResource("data.txt")
    val distF = sc.textFile(source.getFile)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._
    val table1 = distF.map(_.split(",")).map(p => Person(p(0), p(1), p(2), p(3), p(4), p(5), p(6))).toDF()

    // Start H2O on top of Spark and convert the DataFrame to an H2O-backed RDD
    val h2oContext = new H2OContext(sc).start()
    import h2oContext._

    val mydf2: h2o.RDD[Person] = h2oContext.createH2ORDD(table1)
    println("Count of mydf2================>>>>>>>>" + mydf2.count())
  }
}

case class Person(Country: String, ASN: String, Time_Stamp: String, Metric_A: String, Co_Server: String, Bytes: String, Send_Time: String)

For this I got an error. The relevant part of the generated log is:

15/12/24 03:45:53 WARN TaskSetManager: Lost task 1.0 in stage 5.0 (TID 17, localhost): java.lang.IllegalArgumentException: argument type mismatch 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.spark.rdd.H2ORDD$$anon$1.next(H2ORDD.scala:106) 
    at org.apache.spark.rdd.H2ORDD$$anon$1.next(H2ORDD.scala:64) 
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555) 
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121) 
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121) 
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848) 
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848) 
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) 
    at org.apache.spark.scheduler.Task.run(Task.scala:88) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 

15/12/24 03:45:53 ERROR TaskSetManager: Task 1 in stage 5.0 failed 1 times; aborting job 
15/12/24 03:45:53 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool 
15/12/24 03:45:53 INFO TaskSetManager: Lost task 0.0 in stage 5.0 (TID 16) on executor localhost: java.lang.IllegalArgumentException (argument type mismatch) [duplicate 1] 
15/12/24 03:45:53 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool 
15/12/24 03:45:53 INFO TaskSchedulerImpl: Cancelling stage 5 
15/12/24 03:45:53 INFO DAGScheduler: ResultStage 5 (count at test.scala:32) failed in 0.038 s 
15/12/24 03:45:53 INFO DAGScheduler: Job 5 failed: count at test.scala:32, took 0.050463 s 
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 5.0 failed 1 times, most recent failure: Lost task 1.0 in stage 5.0 (TID 17, localhost): java.lang.IllegalArgumentException: argument type mismatch 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.spark.rdd.H2ORDD$$anon$1.next(H2ORDD.scala:106) 
    at org.apache.spark.rdd.H2ORDD$$anon$1.next(H2ORDD.scala:64) 
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555) 
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121) 
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121) 
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848) 
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848) 
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) 
    at org.apache.spark.scheduler.Task.run(Task.scala:88) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 

Driver stacktrace: 
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270) 
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) 
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) 
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697) 
    at scala.Option.foreach(Option.scala:236) 
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697) 
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496) 
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458) 
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447) 
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) 
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567) 
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1822) 
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1835) 
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1848) 
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1919) 
    at org.apache.spark.rdd.RDD.count(RDD.scala:1121) 
    at test$.main(test.scala:32) 
    at test.main(test.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144) 
Caused by: java.lang.IllegalArgumentException: argument type mismatch 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.spark.rdd.H2ORDD$$anon$1.next(H2ORDD.scala:106) 
    at org.apache.spark.rdd.H2ORDD$$anon$1.next(H2ORDD.scala:64) 
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555) 
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121) 
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121) 
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848) 
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848) 
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) 
    at org.apache.spark.scheduler.Task.run(Task.scala:88) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 

Please let me know what I have to do, where my code went wrong, and what changes are needed.

Answers


Also look at https://github.com/h2oai/h2o-droplets/tree/master/sparkling-water-droplet

It provides skeleton code for a simple Sparkling Water project. Also look at these lines: https://github.com/h2oai/h2o-droplets/blob/master/sparkling-water-droplet/build.gradle#L34-L43 which let you configure the H2O and Spark dependencies.

I would recommend using the latest version of Sparkling Water, 1.5.9.

Regarding opening the project in IDEA: just open build.gradle in IDEA and follow the Gradle project import wizard.

One more update: the droplet now also contains an SBT definition: https://github.com/h2oai/h2o-droplets/blob/master/sparkling-water-droplet/build.sbt


To add to Michal's second point, make sure you have the Scala plugin installed for IntelliJ. Then do the following: 'git clone https://github.com/h2oai/sparkling-water.git'; 'cd sparkling-water'; './gradlew idea'; 'open sparkling-water.ipr' –


Thanks Michal. Can you tell me whether this error arises from the conversion of the RDD to an H2O RDD, or whether it is just due to the configured dependencies? –


Could you provide a bit more information? Which Sparkling Water and Spark versions are you using? In my experience it is better to use DataFrames rather than strongly typed RDDs. – Michal
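Michal's DataFrame suggestion can be sketched roughly as follows. This is a minimal sketch assuming the Sparkling Water 1.x API, where H2OContext exposes an `asH2OFrame` conversion for DataFrames; the exact method name and signature may differ across versions, and `sc` / `table1` are taken from the question's code:

```scala
import org.apache.spark.h2o._

// Start H2O on top of the existing SparkContext from the question.
val h2oContext = new H2OContext(sc).start()
import h2oContext._

// Let H2O derive column types from the DataFrame schema instead of
// forcing each row through a strongly typed case-class constructor.
val frame: H2OFrame = h2oContext.asH2OFrame(table1)
println("Rows: " + frame.numRows())
```

Because no reflective case-class construction happens in this direction, the "argument type mismatch" path in H2ORDD is avoided entirely.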


First, create a Scala project in IntelliJ. Then you have to set up the dependencies inside the build.sbt file. Specifically: depending on your Spark version and H2O version, you can search Maven Central to check that the two packages are compatible, and download accordingly:

name := "Your Project Name"

version := "1.0-SNAPSHOT"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.5.1",
  "org.scalaz" %% "scalaz-core" % "7.1.5",
  "javax.servlet" % "javax.servlet-api" % "3.0.1",
  "junit" % "junit" % "4.12",
  "ai.h2o" % "sparkling-water-core_2.10" % "1.4.8"
)

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

You may not need the javax.servlet package in your case.

Also, for the assembly plugin, you must declare the following in the project/plugins.sbt file:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0") 

Then open the SBT tab (on the right side of the IntelliJ window) and press the refresh button (top left).

Finally, to confirm that everything works, execute step 4 from the following link: http://h2o-release.s3.amazonaws.com/sparkling-water/rel-1.4/9/index.html

Hope the above helps you.


Thanks for the help. I tried it with the code (question updated) and got an error (check the log in the question). Please let me know where it goes wrong. –


This mostly happens due to a type mismatch between the input data and the case class.
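To illustrate, here is a minimal, Spark-free sketch of how that exception arises. The `Row` class and `demo` helper are hypothetical, invented for this example; the point is that H2ORDD rebuilds each case-class instance via reflection, and invoking a constructor with an argument of the wrong type produces exactly the exception seen in the log:

```scala
// A stand-in for the question's Person case class: it declares a String field.
case class Row(value: String)

object MismatchDemo {
  // Reflectively construct Row, passing a Double where a String is expected.
  // This mirrors what H2ORDD does when the H2O column type (numeric) does not
  // match the case-class field type (String).
  def demo(): String = {
    val ctor = classOf[Row].getConstructors.head
    try {
      ctor.newInstance(java.lang.Double.valueOf(1.0))
      "no exception"
    } catch {
      case e: IllegalArgumentException => e.getMessage
    }
  }

  def main(args: Array[String]): Unit =
    println(demo()) // prints "argument type mismatch"
}
```

So the fix is to make the case-class field types line up with how H2O parsed each column (or to parse the columns as strings explicitly).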


This is not an answer; I think it should be in the comments section. – surajsn


I felt the original question was ambiguous, so I wrote the answer according to the log: 'java.lang.IllegalArgumentException: argument type mismatch' –