
I am trying to create a "step" that merges many small files into one, so that I can separate them by day. The problem is that when I try to run it, it won't let me: the step fails with an exitCode. (Tags: amazon-emr, hadoop, s3distcp)

The command that runs fine for me is:

hadoop distcp s3n://buket-name/output-files-hive/* s3n://buket-name/files-hive/test 

However, as soon as I add the "--groupBy" or "--srcPattern" option to the command, it doesn't let me do anything.
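For reference, a minimal sketch of how I understand these options are passed when S3DistCp is invoked as s3-dist-cp (the paths follow the ones above; the --srcPattern regex is only an illustration, not my real pattern):

s3-dist-cp --src=s3n://buket-name/output-files-hive/ --dest=s3n://buket-name/files-hive/test/ --srcPattern=.*part-.*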

After creating the "step" in the Amazon EMR console, it gives me an error every time about the file I specified.

Command:

aws emr add-steps --cluster-id j-XXXXXXX --steps Name="S3DistCp step",Jar="command-runner.jar",Args=["spark-submit","--src=s3n://buket-name/output-files-hive/output-files-hive/*","--dest=s3n://buket-name/output-files-hive/files-hive/test/"] 

Error:

2016-07-13T15:06:27.677Z INFO Ensure step 3 jar file command-runner.jar 
2016-07-13T15:06:27.678Z INFO StepRunner: Created Runner for step 3 
INFO startExec 'hadoop jar /var/lib/aws/emr/step-runner/hadoop-jars/command-runner.jar spark-submit --src=s3n://buket-name/output-files-hive/* --dest=s3n://buket-name/files-hive/test/' 
INFO Environment: 
    TERM=linux 
    CONSOLETYPE=serial 
    SHLVL=5 
    JAVA_HOME=/etc/alternatives/jre 
    HADOOP_IDENT_STRING=hadoop 
    LANGSH_SOURCED=1 
    XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt 
    HADOOP_ROOT_LOGGER=INFO,DRFA 
    AWS_CLOUDWATCH_HOME=/opt/aws/apitools/mon 
    UPSTART_JOB=rc 
    MAIL=/var/spool/mail/hadoop 
    EC2_AMITOOL_HOME=/opt/aws/amitools/ec2 
    PWD=/ 
    HOSTNAME=ip-172-31-21-173 
    LESS_TERMCAP_se=[0m 
    LOGNAME=hadoop 
    UPSTART_INSTANCE= 
    AWS_PATH=/opt/aws 
    LESS_TERMCAP_mb=[01;31m 
    _=/etc/alternatives/jre/bin/java 
    LESS_TERMCAP_me=[0m 
    NLSPATH=/usr/dt/lib/nls/msg/%L/%N.cat 
    LESS_TERMCAP_md=[01;38;5;208m 
    runlevel=3 
    AWS_AUTO_SCALING_HOME=/opt/aws/apitools/as 
    UPSTART_EVENTS=runlevel 
    HISTSIZE=1000 
    previous=N 
    HADOOP_LOGFILE=syslog 
    PATH=/sbin:/usr/sbin:/bin:/usr/bin:/usr/local/sbin:/opt/aws/bin 
    EC2_HOME=/opt/aws/apitools/ec2 
    HADOOP_LOG_DIR=/mnt/var/log/hadoop/steps/s-2SKUUYYPQ4KKK 
    LESS_TERMCAP_ue=[0m 
    AWS_ELB_HOME=/opt/aws/apitools/elb 
    RUNLEVEL=3 
    USER=hadoop 
    HADOOP_CLIENT_OPTS=-Djava.io.tmpdir=/mnt/var/lib/hadoop/steps/s-2SKUUYYPQ4KKK/tmp 
    PREVLEVEL=N 
    HOME=/home/hadoop 
    HISTCONTROL=ignoredups 
    LESSOPEN=||/usr/bin/lesspipe.sh %s 
    AWS_DEFAULT_REGION=eu-west-1 
    LANG=en_US.UTF-8 
    LESS_TERMCAP_us=[04;38;5;111m 
INFO redirectOutput to /mnt/var/log/hadoop/steps/s-2SKUUYYPQ4KKK/stdout 
INFO redirectError to /mnt/var/log/hadoop/steps/s-2SKUUYYPQ4KKK/stderr 
INFO Working dir /mnt/var/lib/hadoop/steps/s-2SKUUYYPQ4KKK 
INFO ProcessRunner started child process 7836 : 
hadoop 7836 2229 0 15:06 ?  00:00:00 bash /usr/lib/hadoop/bin/hadoop jar /var/lib/aws/emr/step-runner/hadoop-jars/command-runner.jar spark-submit --src=s3n://buket-name/output-files-hive/* --dest=s3n://buket-name/files-hive/test/ 
2016-07-13T15:06:31.724Z INFO HadoopJarStepRunner.Runner: startRun() called for s-2SKUUYYPQ4KKK Child Pid: 7836 
INFO Synchronously wait child process to complete : hadoop jar /var/lib/aws/emr/step-runner/hadoop-... 
INFO waitProcessCompletion ended with exit code 1 : hadoop jar /var/lib/aws/emr/step-runner/hadoop-... 
INFO total process run time: 2 seconds 
2016-07-13T15:06:31.991Z INFO Step created jobs: 
2016-07-13T15:06:31.992Z WARN Step failed with exitCode 1 and took 2 seconds 

Answer


In the newer versions of Amazon EMR it is no longer necessary to include the S3DistCp .jar file: the step is run through command-runner.jar and the S3DistCp arguments are defined as follows.

aws emr add-steps --cluster-id j-XXXXXX --steps Name="S3DistCp step V3",Jar="command-runner.jar",Args=["s3-dist-cp","--src=s3n://buket-name/output-files-hive/","--dest=s3n://buket-name/files-hive/test/"]