2017-06-27

I am having a problem with the Scala IDE in my Spark project: I cannot connect to the HiveContext. The build fails with the error

object hive is not a member of package org.apache.spark.sql 

The code statement is

val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc) 

The versions used in the project are:

Scala - 2.11.8

Java - 1.8

Spark - 2.1.0

The pom.xml, for reference, is

<pluginRepositories> 
    <pluginRepository> 
     <id>scala-tools.org</id> 
     <name>Scala-tools Maven2 Repository</name> 
     <url>http://scala-tools.org/repo-releases</url> 
    </pluginRepository> 
</pluginRepositories> 
<dependencies> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-core_2.11</artifactId> 
     <version>2.1.0</version> 
     <scope>provided</scope> 
    </dependency> 

    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-sql_2.11</artifactId> 
     <version>2.1.0</version> 
     <scope>provided</scope> 
    </dependency> 

    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-hive_2.10</artifactId> 
     <version>2.1.0</version> 
    </dependency> 
</dependencies> 
<build> 
    <plugins> 
     <!-- mixed scala/java compile --> 
     <plugin> 
      <groupId>org.scala-tools</groupId> 
      <artifactId>maven-scala-plugin</artifactId> 
      <executions> 
       <execution> 
        <id>compile</id> 
        <goals> 
         <goal>compile</goal> 
        </goals> 
        <phase>compile</phase> 
       </execution> 
       <execution> 
        <id>test-compile</id> 
        <goals> 
         <goal>testCompile</goal> 
        </goals> 
        <phase>test-compile</phase> 
       </execution> 
       <execution> 
        <phase>process-resources</phase> 
        <goals> 
         <goal>compile</goal> 
        </goals> 
       </execution> 
      </executions> 
     </plugin> 
     <plugin> 
      <artifactId>maven-compiler-plugin</artifactId> 
      <configuration> 
       <source>1.8</source> 
       <target>1.8</target> 
      </configuration> 
     </plugin> 
     <!-- for fatjar --> 
     <plugin> 
      <groupId>org.apache.maven.plugins</groupId> 
      <artifactId>maven-assembly-plugin</artifactId> 
      <version>2.4</version> 
      <configuration> 
       <descriptorRefs> 
        <descriptorRef>jar-with-dependencies</descriptorRef> 
       </descriptorRefs> 
      </configuration> 
      <executions> 
       <execution> 
        <id>assemble-all</id> 
        <phase>package</phase> 
        <goals> 
         <goal>single</goal> 
        </goals> 
       </execution> 
      </executions> 
     </plugin> 
     <plugin> 
      <groupId>org.apache.maven.plugins</groupId> 
      <artifactId>maven-jar-plugin</artifactId> 
      <!-- <configuration> <archive> <manifest> <addClasspath>true</addClasspath> 
       <mainClass>fully.qualified.MainClass</mainClass> </manifest> </archive> </configuration> --> 
     </plugin> 
    </plugins> 
    <pluginManagement> 
     <plugins> 
      <!--This plugin's configuration is used to store Eclipse m2e settings 
       only. It has no influence on the Maven build itself. --> 
      <plugin> 
       <groupId>org.eclipse.m2e</groupId> 
       <artifactId>lifecycle-mapping</artifactId> 
       <version>1.0.0</version> 
       <configuration> 
        <lifecycleMappingMetadata> 
         <pluginExecutions> 
          <pluginExecution> 
           <pluginExecutionFilter> 
            <groupId>org.scala-tools</groupId> 
            <artifactId>maven-scala-plugin</artifactId> 
            <versionRange>[2.15.2,)</versionRange> 
            <goals> 
             <goal>compile</goal> 
             <goal>testCompile</goal> 
            </goals> 
           </pluginExecutionFilter> 
           <action> 
            <execute /> 
           </action> 
          </pluginExecution> 
         </pluginExecutions> 
        </lifecycleMappingMetadata> 
       </configuration> 
      </plugin> 
     </plugins> 
    </pluginManagement> 
</build> 



Not a solution, but 'HiveContext' is deprecated. You can use 'SparkSession.builder.enableHiveSupport' instead. Also, why not 'spark-hive_2.11'? – philantrovert
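
For context, here is a minimal sketch of the SparkSession approach the comment refers to. The app name is a placeholder, and it assumes a spark-hive artifact matching your Scala version is on the classpath:

import org.apache.spark.sql.SparkSession

// Spark 2.x entry point; replaces the deprecated HiveContext.
// enableHiveSupport() requires the spark-hive dependency at runtime.
val spark = SparkSession.builder()
  .appName("HiveExample")  // placeholder name
  .enableHiveSupport()
  .getOrCreate()

// Queries against Hive go through the session directly.
spark.sql("SHOW TABLES").show()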


@philantrovert Thanks for the reply. Even 'SparkSession' is not available and throws the same error. I think there is a dependency problem; I also tried spark-hive_2.11, but that did not resolve it. –


How are you running the jar? – philantrovert

Answer


This seems to have been resolved after running mvn install a few times. Thanks.
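
For readers hitting the same error: note that the pom above mixes Scala versions (spark-core and spark-sql are _2.11 artifacts while spark-hive is _2.10), which is a common reason for org.apache.spark.sql.hive failing to resolve. Below is a minimal sketch of the original statement once the artifacts are aligned, assuming an existing SparkContext named sc as in the question:

import org.apache.spark.sql.hive.HiveContext

// Assumes spark-hive_2.11 (matching the other _2.11 artifacts) is on the
// classpath; sc is the existing SparkContext from the question.
val hiveContext = new HiveContext(sc)
hiveContext.sql("SHOW DATABASES").show()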