
I want to avoid having to retype all of the commands to run a simple MapReduce job every time I want to test my mapper and reducer files, so I wrote this script. I'm new to shell scripting, so I'd like to know whether these Hadoop commands will run from within a shell script, and whether the script needs any changes.

echo "textFile:  $1"
echo "mapper:    $2"
echo "reducer:   $3"
echo "inputDir:  $4"
echo "outputDir: $5"



hdfs dfs -copyFromLocal /home/hduser/"$2" # copies mapper.py file from argument to hdfs dir 
hdfs dfs -copyFromLocal /home/hduser/"$3" # copies reducer.py file from argument to hdfs dir 


hdfs dfs -test -d ~/"$5" #checks to see if hadoop output dir exists 
if [ $? == 0 ]; then 
    hdfs dfs -rm -r ~/"$5" 
else 
    echo "Output file doesn't exist and will be created when hadoop runs" 
fi 



hdfs dfs -test -d ~/"$4" #checks to see if hadoop input dir exists 
if [ $? == 0 ]; then 
    hdfs dfs -rm -r ~/"$4" 
    echo "Hadoop input dir alread exists deleting it now and creating a new one..." 
    hdfs dfs -mkdir ~/"$4" # makes an input dir for text file to be put in 

else 
    echo "Input file doesn't exist will be created now" 
    hdfs dfs -mkdir ~/"$4" # makes an input dir for text file to be put in 
fi 



hdfs dfs -copyFromLocal /home/hduser/"$1" ~/"$4" # sends textfile from local to hdfs folder 

# runs the hadoop mapreduce program with given parameters 
hadoop jar /usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.6.2.jar \
    -input /home/hduser/"$4"/* -output /home/hduser/"$5" \
    -file /home/hduser/"$2" -mapper /home/hduser/"$2" \
    -file /home/hduser/"$3" -reducer /home/hduser/"$3"

Use http://www.shellcheck.net/ to check your shell script. – iamauser

Answer


Yes. It will run, provided the appropriate arguments are passed.
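
Following the shellcheck suggestion above, a minimal sketch of the same steps written a little more defensively might look like the following. The HDFS and local paths (/user/hduser, /home/hduser) are assumptions about the cluster layout, not something the original script guarantees:

#!/bin/bash
# Usage: ./run_streaming.sh <textFile> <mapper> <reducer> <inputDir> <outputDir>
set -e

if [ "$#" -ne 5 ]; then
    echo "Usage: $0 textFile mapper reducer inputDir outputDir" >&2
    exit 1
fi

HDFS_HOME=/user/hduser    # assumed HDFS home directory
LOCAL_HOME=/home/hduser   # assumed local home directory

# remove any previous output dir so the streaming job does not fail
if hdfs dfs -test -d "$HDFS_HOME/$5"; then
    hdfs dfs -rm -r "$HDFS_HOME/$5"
fi

# recreate the input dir and upload the text file
if hdfs dfs -test -d "$HDFS_HOME/$4"; then
    hdfs dfs -rm -r "$HDFS_HOME/$4"
fi
hdfs dfs -mkdir -p "$HDFS_HOME/$4"
hdfs dfs -copyFromLocal "$LOCAL_HOME/$1" "$HDFS_HOME/$4"

# run the streaming job; the mapper and reducer are shipped to the task
# nodes with -file, so -mapper/-reducer only need the bare file names
hadoop jar /usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.6.2.jar \
    -input "$HDFS_HOME/$4/*" -output "$HDFS_HOME/$5" \
    -file "$LOCAL_HOME/$2" -mapper "$2" \
    -file "$LOCAL_HOME/$3" -reducer "$3"

The main differences from the original: the exit status of hdfs dfs -test is used directly in the if instead of checking $?, numeric comparisons use -eq rather than ==, every copy has an explicit HDFS destination, and set -e stops the script on the first failed command.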