2017-05-18

I am trying to import StreamingContext, but I received this error:

error: object StreamingContext is not a member of package org.apache.spark.streaming import org.apache.spark.streaming.StreamingContext

My code is:

package com.sundogsoftware.sparkstreaming 

import org.apache.spark.streaming.StreamingContext 
import collection.mutable._ 
import org.apache.kafka.common.serialization.StringDeserializer 
import org.apache.spark 
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe 
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent 
import org.apache.spark.streaming.kafka010._ 
... 

I am using IntelliJ with Maven, and my POM is:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-streaming_2.11</artifactId> 
    <version>2.1.1</version> 
    <name>${project.artifactId}</name> 
    <description>My wonderfull scala app</description> 
    <inceptionYear>2015</inceptionYear> 
    <licenses> 
    <license> 
     <name>My License</name> 
     <url>http://....</url> 
     <distribution>repo</distribution> 
    </license> 
    </licenses> 

    <properties> 
    <maven.compiler.source>1.8</maven.compiler.source> 
    <maven.compiler.target>1.8</maven.compiler.target> 
    <encoding>UTF-8</encoding> 
    <scala.version>2.11.8</scala.version> 
    <scala.compat.version>2.11</scala.compat.version> 
    </properties> 

    <dependencies> 
    <dependency> 
     <groupId>org.scala-lang</groupId> 
     <artifactId>scala-library</artifactId> 
     <version>${scala.version}</version> 
    </dependency> 

    <!-- Test --> 
    <dependency> 
     <groupId>junit</groupId> 
     <artifactId>junit</artifactId> 
     <version>4.12</version> 
     <scope>test</scope> 
    </dependency> 
     <dependency> 
      <groupId>org.specs2</groupId> 
      <artifactId>specs2-junit_${scala.compat.version}</artifactId> 
      <version>2.4.16</version> 
      <scope>test</scope> 
     </dependency> 
    <dependency> 
     <groupId>org.specs2</groupId> 
     <artifactId>specs2-core_${scala.compat.version}</artifactId> 
     <version>2.4.16</version> 
     <scope>test</scope> 
    </dependency> 
    <dependency> 
     <groupId>org.scalatest</groupId> 
     <artifactId>scalatest_${scala.compat.version}</artifactId> 
     <version>2.2.4</version> 
     <scope>test</scope> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.kafka</groupId> 
     <artifactId>kafka_2.11</artifactId> 
     <version>0.10.2.1</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming_2.11</artifactId> 
     <version>2.1.1</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming-kafka_2.11</artifactId> 
     <version>1.4.0</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming-kafka-assembly_2.11</artifactId> 
     <version>1.5.2</version> 
    </dependency> 
     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-core_2.11</artifactId> 
      <version>2.0.1</version> 
     </dependency> 

     <dependency> 
      <groupId>org.apache.kafka</groupId> 
      <artifactId>kafka-clients</artifactId> 
      <version>0.10.0.0</version> 
     </dependency> 


     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-streaming-kafka-0-10_2.11</artifactId> 
      <version>2.0.0</version> 
     </dependency> 

    </dependencies> 

    <build> 
    <sourceDirectory>src/main/scala</sourceDirectory> 
    <testSourceDirectory>src/test/scala</testSourceDirectory> 
    <plugins> 
     <plugin> 
     <!-- see http://davidb.github.com/scala-maven-plugin --> 
     <groupId>net.alchim31.maven</groupId> 
     <artifactId>scala-maven-plugin</artifactId> 
     <version>3.2.0</version> 
     <executions> 
      <execution> 
      <goals> 
       <goal>compile</goal> 
       <goal>testCompile</goal> 
      </goals> 
      <configuration> 
       <args> 
       <!-- <arg>-make:transitive</arg> --> 
       <arg>-dependencyfile</arg> 
       <arg>${project.build.directory}/.scala_dependencies</arg> 
      </args> 
      </configuration> 
     </execution> 
     </executions> 
    </plugin> 
    <plugin> 
     <groupId>org.apache.maven.plugins</groupId> 
     <artifactId>maven-surefire-plugin</artifactId> 
     <version>2.18.1</version> 
     <configuration> 
     <useFile>false</useFile> 
     <disableXmlReport>true</disableXmlReport> 
     <!-- If you have classpath issue like NoDefClassError,... --> 
      <!-- useManifestOnlyJar>false</useManifestOnlyJar --> 
      <includes> 
      <include>**/*Test.*</include> 
      <include>**/*Suite.*</include> 
      </includes> 
     </configuration> 
     </plugin> 
    </plugins> 
    </build> 
</project> 

I use Gradle, but try checking your project model and make sure the jar is listed under your Dependencies tab. – Jeremy

Answer

I solved my problem: it was caused by my project's groupId, artifactId, and version being identical to those of one of my dependencies.

<groupId>org.apache.spark</groupId> 
<artifactId>spark-streaming_2.11</artifactId> 
<version>2.1.1</version> 

<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-streaming_2.11</artifactId> 
    <version>2.1.1</version> 
</dependency> 

I changed the first one (the project's own coordinates).
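For example, giving the project its own distinct coordinates removes the clash with the `spark-streaming_2.11` dependency. The values below are only a sketch (the groupId is borrowed from the package name in the question; any coordinates that do not collide with a dependency will do):

```xml
<!-- Hypothetical project coordinates: chosen so they do not match
     any <dependency> groupId/artifactId/version in this POM. -->
<groupId>com.sundogsoftware</groupId>
<artifactId>sparkstreaming-app</artifactId>
<version>1.0.0</version>
```

With distinct coordinates, Maven no longer treats the project itself as satisfying the `org.apache.spark:spark-streaming_2.11:2.1.1` dependency, so the real Spark Streaming jar is resolved and the `StreamingContext` import compiles.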
