
My application is a combination of Hadoop and a REST service built with the Spring framework. My goal is to serve the results of a Hive table on request. However, when I run the application, after the MapReduce and Hive jobs complete I get the following error: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.data.hadoop.hive.HiveOperations]

java.lang.reflect.InvocationTargetException 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53) 
     at java.lang.Thread.run(Thread.java:744) 
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'logsRepository': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire method: public void hello.logsRepository.logsRepositoryC(org.springframework.data.hadoop.hive.HiveOperations); nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.data.hadoop.hive.HiveOperations] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {} 
     at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:292) 
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1185) 
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:537) 
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475) 
     at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:304) 
     at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:228) 
     at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:300) 
     at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:195) 
     at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:700) 
     at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:760) 
     at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482) 
     at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:120) 
     at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:616) 
     at org.springframework.boot.SpringApplication.run(SpringApplication.java:306) 
     at org.springframework.boot.SpringApplication.run(SpringApplication.java:877) 
     at org.springframework.boot.SpringApplication.run(SpringApplication.java:866) 
     at hello.Application.main(Application.java:63) 
     ... 6 more 
Caused by: org.springframework.beans.factory.BeanCreationException: Could not autowire method: public void hello.logsRepository.logsRepositoryC(org.springframework.data.hadoop.hive.HiveOperations); nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.data.hadoop.hive.HiveOperations] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {} 
     at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredMethodElement.inject(AutowiredAnnotationBeanPostProcessor.java:596) 
     at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:87) 
     at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:289) 
     ... 22 more 
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.data.hadoop.hive.HiveOperations] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {} 
     at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoSuchBeanDefinitionException(DefaultListableBeanFactory.java:1100) 
     at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:960) 
     at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:855) 
     at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredMethodElement.inject(AutowiredAnnotationBeanPostProcessor.java:553) 
     ... 24 more 

My code:

Application.java

@ComponentScan 
@EnableAutoConfiguration 
public class Application { 

    private static final Log log = LogFactory.getLog(Application.class); 

    private static AbstractApplicationContext ctx; 

    public static void main(String[] args) { 

     ctx = new ClassPathXmlApplicationContext("META-INF/spring/hadoop-context.xml"); 
     // shut down the context cleanly along with the VM 
     ctx.registerShutdownHook(); 

     HiveTemplate template = ctx.getBean(HiveTemplate.class); 
     log.info(template.query("show tables;")); 

     logsRepository repository = ctx.getBean(logsRepository.class); 
     repository.processLogFile("/home/hduser/yarn/hive_data"); 

     log.info("Count of password entries = " + repository.count()); 

     SpringApplication.run(Application.class, args); 
    } 

} 

logsRepository.java

@Repository 
public class logsRepository implements logsRepo { 

    private String tableName = "getlogs"; 


    private HiveOperations hiveOperations; 

    @Autowired 
    public void logsRepositoryC (HiveOperations hiveOperations) { 
     this.hiveOperations = hiveOperations; 
    } 

    @Override 
    public Long count() { 
     return hiveOperations.queryForLong("select count(*) from " + tableName); 
    } 

    @Override 
    public void processLogFile(String inputFile) { 
     // run the external HQL script, passing the input file path as a script parameter 
     Map<String, String> parameters = new HashMap<String, String>(); 
     parameters.put("inputFile", inputFile); 
     hiveOperations.query("classpath:logs-analysis.hql", parameters); 
    } 
} 
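
The logsRepo interface itself is not shown in the question; judging from the two overridden methods, it presumably looks like this minimal sketch:

public interface logsRepo { 

    Long count(); 

    void processLogFile(String inputFile); 
} 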

hadoop-context.xml

<?xml version="1.0" encoding="UTF-8"?> 
<beans xmlns="http://www.springframework.org/schema/beans" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xmlns:hdp="http://www.springframework.org/schema/hadoop" 
    xmlns:context="http://www.springframework.org/schema/context" 
    xmlns:batch="http://www.springframework.org/schema/batch" 
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd 
     http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd 
     http://www.springframework.org/schema/hadoop http://www.springframework.org/schema/hadoop/spring-hadoop.xsd 
     http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch.xsd"> 

     <context:property-placeholder location="hadoop.properties,hive.properties" /> 

     <context:component-scan base-package="hello" /> 

     <hdp:configuration> 
      fs.defaultFS=${hd.fs} 
      yarn.resourcemanager.address=${hd.rm} 
      mapreduce.framework.name=yarn 
      mapred.job.tracker=${hd.jt} 
     </hdp:configuration> 

     <hdp:script id="setupScript" location="copy-files.groovy"> 
      <hdp:property name="localSourceFile" value="${localSourceFile}"/> 
      <hdp:property name="inputDir" value="${inputDir}"/> 
      <hdp:property name="outputDir" value="${outputDir}"/> 
     </hdp:script> 

     <hdp:script id="setupfile" location="copy-to-local.groovy"> 
      <hdp:property name="outputDir" value="${outputDir}"/> 
     </hdp:script> 

     <hdp:job id="getlogsJob" 
      input-path="${inputDir}" 
      output-path="${outputDir}" 
      libs="${LIB_DIR}/gs-rest-service-0.1.0.jar" 
      mapper="hello.GetLogs.Map" 
      reducer="hello.GetLogs.Reduce" /> 

     <hdp:hive-client-factory host="${hive.host}" port="${hive.port}"/> 

     <hdp:hive-template id="hiveTemplate" hive-client-factory-ref="hiveClientFactory" />  

     <hdp:hive-runner id="hiveRunner" hive-client-factory-ref="hiveClientFactory" run-at-startup="false" pre-action="hdfsScript"> 
      <hdp:script location="logs-analysis.hql"> 
      </hdp:script> 
     </hdp:hive-runner> 

     <hdp:script id="hdfsScript" language="groovy" location="set-hive-permissions.groovy"/> 

     <hdp:job-runner id="runner" run-at-startup="true" pre-action="setupScript,hdfsScript" post-action="setupfile,hiveRunner" job-ref="getlogsJob" /> 
</beans> 
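
As configured, <hdp:hive-template id="hiveTemplate" ...> should publish a HiveTemplate bean (HiveTemplate implements HiveOperations), and <hdp:hive-client-factory> is referenced through its default bean id, hiveClientFactory. A minimal sanity check, assuming this file sits on the classpath under META-INF/spring/ (ContextSanityCheck is a hypothetical class, not part of the original project), could confirm the bean is actually published:

import org.springframework.context.support.AbstractApplicationContext; 
import org.springframework.context.support.ClassPathXmlApplicationContext; 
import org.springframework.data.hadoop.hive.HiveOperations; 

public class ContextSanityCheck { 

    public static void main(String[] args) { 
        AbstractApplicationContext ctx = 
                new ClassPathXmlApplicationContext("META-INF/spring/hadoop-context.xml"); 
        // expect a map containing the hiveTemplate bean if registration worked 
        System.out.println(ctx.getBeansOfType(HiveOperations.class)); 
        ctx.close(); 
    } 
} 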

pom.xml

<?xml version="1.0" encoding="UTF-8"?> 
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 

    <groupId>org.springframework</groupId> 
    <artifactId>gs-rest-service</artifactId> 
    <version>0.1.0</version> 

    <parent> 
     <groupId>org.springframework.boot</groupId> 
     <artifactId>spring-boot-starter-parent</artifactId> 
     <version>1.0.0.RC3</version> 
    </parent> 

    <profiles> 
     <profile> 
      <id>hadoop22</id> 
      <properties> 
       <spring.hadoop.version>2.0.0.M5-hadoop22</spring.hadoop.version> 
       <hadoop.version>2.2.0</hadoop.version> 
       <hive.version>0.10.0</hive.version> 
       <hadoop.version.generic>2.0.0-cdh4.1.3</hadoop.version.generic> 
       <hadoop.version.mr1>2.0.0-mr1-cdh4.1.3</hadoop.version.mr1> 
       <hadoop.examples>hadoop-mapreduce-examples</hadoop.examples> 
       <mapreduce.framework>yarn</mapreduce.framework> 
      </properties> 
      <dependencies> 
       <dependency> 
        <groupId>org.apache.hadoop</groupId> 
        <artifactId>hadoop-mapreduce-client-jobclient</artifactId> 
        <version>${hadoop.version}</version> 
        <scope>runtime</scope> 
       </dependency> 
      </dependencies> 
     </profile> 
     <profile> 
      <id>phd1</id> 
      <properties> 
       <spring.hadoop.version>1.0.2.RELEASE-phd1</spring.hadoop.version> 
       <hadoop.version>2.0.5-alpha-gphd-2.1.0.0</hadoop.version> 
       <hadoop.examples>hadoop-mapreduce-examples</hadoop.examples> 
       <mapreduce.framework>yarn</mapreduce.framework> 
      </properties> 
      <dependencies> 
       <dependency> 
        <groupId>org.apache.hadoop</groupId> 
        <artifactId>hadoop-mapreduce-client-jobclient</artifactId> 
        <version>${hadoop.version}</version> 
        <scope>runtime</scope> 
       </dependency> 
      </dependencies> 
      <build> 
       <plugins> 
        <plugin> 
         <groupId>org.apache.maven.plugins</groupId> 
         <artifactId>maven-antrun-plugin</artifactId> 
         <executions> 
          <execution> 
           <id>config</id> 
           <phase>package</phase> 
           <configuration> 
            <tasks> 
             <copy todir="target/appassembler/data"> 
              <fileset dir="data"/> 
             </copy> 
             <copy todir="target/appassembler/etc"> 
              <fileset dir="etc/phd1"/> 
             </copy> 
            </tasks> 
           </configuration> 
           <goals> 
            <goal>run</goal> 
           </goals> 
          </execution> 
         </executions> 
        </plugin> 
       </plugins> 
      </build> 
     </profile> 
    </profiles> 

    <dependencies> 
     <dependency> 
      <groupId>jdk.tools</groupId> 
      <artifactId>jdk.tools</artifactId> 
      <version>${java.version}</version> 
      <scope>system</scope> 
      <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath> 
     </dependency> 
     <dependency> 
      <groupId>org.codehaus.groovy</groupId> 
      <artifactId>groovy-all</artifactId> 
      <version>2.2.1</version> 
      <scope>runtime</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.springframework.boot</groupId> 
      <artifactId>spring-boot-starter-web</artifactId> 
     </dependency> 
     <dependency> 
      <groupId>com.fasterxml.jackson.core</groupId> 
      <artifactId>jackson-databind</artifactId> 
     </dependency> 
     <dependency> 
      <groupId>org.springframework.data</groupId> 
      <artifactId>spring-data-hadoop</artifactId> 
      <version>2.0.0.M5</version> 
     </dependency> 
     <dependency> 
      <groupId>org.springframework</groupId> 
      <artifactId>spring-jdbc</artifactId> 
      <!-- <version>${spring.version}</version>--> 
     </dependency> 

     <dependency> 
      <groupId>org.springframework</groupId> 
      <artifactId>spring-test</artifactId> 
      <!-- <version>${spring.version}</version>--> 
     </dependency> 

     <dependency> 
      <groupId>org.springframework</groupId> 
      <artifactId>spring-tx</artifactId> 
      <!-- <version>${spring.version}</version>--> 
     </dependency> 
     <!--<dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-core</artifactId> 
      <version>2.2.0</version> 
     </dependency>--> 

     <dependency> 
      <groupId>org.springframework.data</groupId> 
      <artifactId>spring-data-hadoop-core</artifactId> 
      <version>2.0.0.M5</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-metastore</artifactId> 
      <version>0.10.0</version> 
      <scope>provided</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-service</artifactId> 
      <version>0.10.0</version> 
      <scope>provided</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-common</artifactId> 
      <version>0.10.0</version> 
      <scope>runtime</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-builtins</artifactId> 
      <version>0.10.0</version> 
      <scope>runtime</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-jdbc</artifactId> 
      <version>0.10.0</version> 
      <scope>runtime</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-shims</artifactId> 
      <version>0.10.0</version> 
      <scope>runtime</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-serde</artifactId> 
      <version>0.10.0</version> 
      <scope>runtime</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hive</groupId> 
      <artifactId>hive-contrib</artifactId> 
      <version>0.10.0</version> 
     </dependency> 
    </dependencies> 

    <properties> 
     <java.version>1.7</java.version> 
     <start-class>hello.Application</start-class> 
    </properties> 

    <build> 
     <plugins> 
      <plugin> 
       <artifactId>maven-compiler-plugin</artifactId> 
       <!-- <version>2.3.2</version> --> 
      </plugin> 
      <plugin> 
       <groupId>org.springframework.boot</groupId> 
       <artifactId>spring-boot-maven-plugin</artifactId> 
      </plugin> 
      <plugin> 
       <groupId>org.codehaus.mojo</groupId> 
       <artifactId>appassembler-maven-plugin</artifactId> 
       <version>1.2.2</version> 
       <configuration> 
        <repositoryLayout>flat</repositoryLayout> 
        <configurationSourceDirectory>src/main/config</configurationSourceDirectory> 
        <copyConfigurationDirectory>true</copyConfigurationDirectory> 
        <extraJvmArguments>-Dmr.fw=${mapreduce.framework}</extraJvmArguments> 
        <programs> 
         <program> 
          <mainClass>hello.Application</mainClass> 

         </program> 
        </programs> 
       </configuration> 
       <executions> 
        <execution> 
         <id>package</id> 
         <goals> 
          <goal>assemble</goal> 
         </goals> 
        </execution> 
       </executions> 
      </plugin> 
      <plugin> 
       <groupId>org.apache.maven.plugins</groupId> 
       <artifactId>maven-antrun-plugin</artifactId> 
       <executions> 
        <execution> 
         <id>config</id> 
         <phase>package</phase> 
         <configuration> 
          <!-- <tasks> 
           <copy todir="target/appassembler/data"> 
            <fileset dir="data"/> 
           </copy> 
          </tasks>--> 
         </configuration> 
         <goals> 
          <goal>run</goal> 
         </goals> 
        </execution> 
       </executions> 
      </plugin> 
     </plugins> 
    </build> 

    <repositories> 
     <repository> 
      <id>spring-snapshots</id> 
      <url>http://repo.spring.io/libs-snapshot</url> 
      <snapshots><enabled>true</enabled></snapshots> 
     </repository> 
    </repositories> 
    <pluginRepositories> 
     <pluginRepository> 
      <id>spring-snapshots</id> 
      <url>http://repo.spring.io/libs-snapshot</url> 
      <snapshots><enabled>true</enabled></snapshots> 
     </pluginRepository> 
    </pluginRepositories> 
</project> 

Is your base-package in hadoop-context.xml correct? Take a look at this explanation of the annotations: http://stackoverflow.com/questions/7414794/difference-between-contextannotation-config-vs-contextcomponent-scan – ZaoTaoBao


Yes, it is correct @ZaoTaoBao – Alina


What jar files do you have on the build path? If you are using Maven, post your pom.xml – Rembo

Answers


I found that spring-boot-autoconfigure was missing, so I added the following dependency to the pom.xml:

<dependency> 
    <groupId>org.springframework.boot</groupId> 
    <artifactId>spring-boot-autoconfigure</artifactId> 
    <version>1.0.0.RC3</version> 
</dependency> 
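
(Since the project already inherits from spring-boot-starter-parent, the explicit <version> tag can presumably be omitted and left to the parent's dependency management.)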

It seems to me that your Spring Boot application is not loading your hadoop-context.xml file. Try adding a @Configuration class that imports that file:

@Configuration 
@ImportResource("classpath:META-INF/spring/hadoop-context.xml") 
public class AppConfiguration { 
} 
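
With such a configuration class in place, the main method can be reduced to a single Boot context that contains both the XML-defined hiveTemplate and the component-scanned logsRepository. A sketch of how Application.java might then look, assuming the rest of the setup stays unchanged:

@ComponentScan 
@EnableAutoConfiguration 
public class Application { 

    private static final Log log = LogFactory.getLog(Application.class); 

    public static void main(String[] args) { 
        // one context: Boot auto-configuration plus the beans imported from the XML file 
        ConfigurableApplicationContext ctx = SpringApplication.run(Application.class, args); 

        logsRepository repository = ctx.getBean(logsRepository.class); 
        repository.processLogFile("/home/hduser/yarn/hive_data"); 
        log.info("Count of password entries = " + repository.count()); 
    } 
} 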