Spark and HBase client version compatibility

I am trying to write a Spark batch job. I want to package it as a jar and run it with spark-submit. My program works perfectly in the spark-shell, but I get the following error when I try to run it with spark-submit:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less; 
    at HBaseBulkload$.saveAsHFile(ThereInLocationGivenTimeInterval.scala:103) 
    at HBaseBulkload$.toHBaseBulk(ThereInLocationGivenTimeInterval.scala:178) 
    at ThereInLocationGivenTimeInterval$.main(ThereInLocationGivenTimeInterval.scala:241) 
    at ThereInLocationGivenTimeInterval.main(ThereInLocationGivenTimeInterval.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
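As far as I can tell, NoSuchMethodError: scala.Predef$.$conforms() usually signals a Scala binary-version mismatch: $conforms is the Scala 2.11 name for what 2.10 calls conforms, so bytecode compiled against one Scala version is running on the other. A quick way to check which Scala version the cluster's Spark runtime actually uses is from the spark-shell:

// Run inside spark-shell on the cluster; the reported version should
// match the scalaVersion in build.sbt (here: 2.10.x).
println(scala.util.Properties.versionString)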

According to this answer, the problem originates from a version incompatibility. I also found this, but my Spark version is 1.6.0. Here is my build.sbt file:

name := "HbaseBulkLoad" 

version := "1.0" 

scalaVersion := "2.10.5" 

resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/" 

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" 
//libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.9.0" 
//libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.9.0" 
//libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.9.0" 

libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.1.2" 
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.1.2" 
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.1.2" 

The code snippet with the imports that cause the error is the following:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

// HBaseBulkLoad imports 
import java.util.UUID 

import org.apache.hadoop.conf.Configuration 
import org.apache.hadoop.fs.permission.FsPermission 
import org.apache.hadoop.fs.{Path, FileSystem} 
import org.apache.hadoop.hbase.{KeyValue, TableName} 
import org.apache.hadoop.hbase.client._ 
import org.apache.hadoop.hbase.io.ImmutableBytesWritable 
import org.apache.hadoop.hbase.mapreduce.{HFileOutputFormat2, LoadIncrementalHFiles} 
import org.apache.hadoop.hbase.util.Bytes 
import org.apache.hadoop.mapreduce.Job 
import org.apache.hadoop.mapreduce.lib.partition.TotalOrderPartitioner 
import org.apache.spark.rdd.RDD 
import org.apache.spark.Partitioner 
import org.apache.spark.storage.StorageLevel 

import scala.collection.JavaConversions._ 
import scala.reflect.ClassTag 

// HBase admin imports
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.HColumnDescriptor
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.client.HTable
import java.util.Calendar

val now = Calendar.getInstance.getTimeInMillis
//val filteredRdd = myRdd.filter(...
// Pair each row key with a (column family -> (qualifier, (value, timestamp))) map
val resultRdd = filteredRdd.map { row =>
  (row(0).asInstanceOf[String].getBytes(),
   scala.collection.immutable.Map("batchResults" ->
     Array(("batchResult1", ("true", now)))))
}
println(resultRdd.count)
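For illustration, here is a simplified sketch of how pairs shaped like resultRdd can be written to HBase as Puts through TableOutputFormat. This is not the saveAsHFile bulk-load path that actually fails; the table name "myTable" and the pre-existing "batchResults" column family are assumptions:

// Minimal sketch (assumes a table "myTable" with column family "batchResults"
// already exists); consumes resultRdd from the snippet above.
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapreduce.Job

val hconf = HBaseConfiguration.create()
hconf.set(TableOutputFormat.OUTPUT_TABLE, "myTable")
val job = Job.getInstance(hconf)
job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])

resultRdd.map { case (rowKey, families) =>
  val put = new Put(rowKey)
  // Expand every (qualifier, (value, timestamp)) cell into the Put
  for ((family, cells) <- families; (qualifier, (value, ts)) <- cells)
    put.addColumn(Bytes.toBytes(family), Bytes.toBytes(qualifier), ts, Bytes.toBytes(value))
  (new ImmutableBytesWritable(rowKey), put)
}.saveAsNewAPIHadoopDataset(job.getConfiguration)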

Answer


The working build.sbt file is the following:

name := "HbaseBulkLoad" 

version := "1.0" 

scalaVersion := "2.10.5" 

resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/" 

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0-cdh5.9.0" 
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.9.0" 
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.9.0" 
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.9.0" 

If you are using Cloudera, you can find the jars and their corresponding versions in the following directory:

/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/jars
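For example, the exact HBase versions shipped with the parcel can be read off with a quick listing (a throwaway snippet for the Scala REPL; a plain directory listing works just as well):

// List the HBase jars in the CDH parcel to read off their exact versions.
new java.io.File("/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/jars")
  .listFiles()
  .filter(_.getName.startsWith("hbase-"))
  .foreach(f => println(f.getName))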