2012-04-18

I have written a shell script that automates: 1) starting the Hadoop services (namenode, datanode, jobtracker, tasktracker, secondary namenode), 2) dropping all tables from Hive, and 3) importing all tables from SQL Server into Hive again. I am facing some problems running this shell script from a Java program.

I am calling this shell script from Java. Below are the shell script and the Java code.

Shell script code:

export HADOOP_HOME=/home/hadoop/hadoop-0.20.2-cdh3u2/ 
export HIVE_HOME=/home/hadoop/hive-0.7.1/ 
export SQOOP_HOME=/home/hadoop/sqoop-1.3.0-cdh3u1/ 
export MSSQL_CONNECTOR_HOME=/home/hadoop/sqoop-sqlserver-1.0 
export HBASE_HOME=/home/hadoop/hbase-0.90.1-cdh3u0 
export ZOOKEEPER_HOME=/home/hadoop/zookeeper-3.3.1+10 
export SQOOP_CONF_DIR=/home/hadoop/sqoop-1.3.0-cdh3u1/conf/ 

/home/hadoop/hadoop-0.20.2-cdh3u2/bin/start-all.sh 
/home/hadoop/hadoop-0.20.2-cdh3u2/bin/hadoop fs -rmr /user/hadoop/* 

/home/hadoop/hive-0.7.1/bin/hive -e 'show tables' > TablesToDelete.txt 
while read line1 
do 
    echo 'drop table '$line1 
    /home/hadoop/hive-0.7.1/bin/hive -e 'drop table '$line1 
done < TablesToDelete.txt 

while read line 
do 
    echo $line" ------------------------------" 
    /home/hadoop/sqoop-1.3.0-cdh3u1/bin/sqoop-import --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=HadoopTest' --table $line --hive-table $line --create-hive-table --hive-import -m 1 --hive-drop-import-delims --hive-home /home/hadoop/hive-0.7.1 --verbose 
done < /home/hadoop/sqoop-1.3.0-cdh3u1/bin/tables.txt 

Java code:

public class ImportTables 
{ 
    public static void main(String args[]) 
    { 
        PrintWriter pw = null; 
        try 
        { 
            Formatter formatter = new Formatter(); 
            String LogFile = "Log-" + formatter.format("%1$tm%1$td-%1$tH%1$tM%1$tS", new Date()); 
            File f = new File("/home/hadoop/" + LogFile); 
            pw = new PrintWriter(f); 

            String cmd = "/home/hadoop/sqoop-1.3.0-cdh3u1/bin/TablesToImport.sh"; // the command to execute in the Unix shell 

            // create a process for the shell 
            ProcessBuilder pb = new ProcessBuilder("bash", "-c", cmd); 
            pb.redirectErrorStream(true); // use this to capture messages sent to stderr 
            Process shell = pb.start(); 
            InputStream shellIn = shell.getInputStream(); // this captures the output from the command 
            int shellExitStatus = shell.waitFor(); 
            // wait for the shell to finish and get the return code 
            // at this point you can process the output issued by the command 

            // for instance, this reads the output and writes it to System.out: 
            int c; 
            while ((c = shellIn.read()) != -1) 
            { 
                System.out.write(c); 
            } 

            // close the stream 
            shellIn.close(); 
        } 
        catch (Exception e) 
        { 
            e.printStackTrace(); 
            e.printStackTrace(pw); 
            pw.flush(); 
            System.exit(1); 
        } 
    } 
} 

But when I run the program I see nothing on the console, and the program stays in running mode. If I put only the following code in the shell script:

/home/hadoop/hive-0.7.1/bin/hive -e 'show tables' > TablesToDelete.txt 
while read line1 
do 
    echo 'drop table '$line1 
    /home/hadoop/hive-0.7.1/bin/hive -e 'drop table '$line1 
done < TablesToDelete.txt 

then the output comes as:

Cannot find hadoop installation: $HADOOP_HOME must be set or hadoop must be in path 

What is wrong with my program/script? Where and how should I set HADOOP_HOME and all the paths in my script?

Answer


The call to waitFor is a blocking call, as the name suggests: it halts further execution until the process has finished. But because your code is also the sink for the process's stdout, the whole thing deadlocks. Simply move the waitFor to after the point where you have processed the script's output.
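A minimal sketch of that reordering, with the real TablesToImport.sh path replaced by an `echo` stand-in so the snippet is runnable anywhere:

```java
import java.io.InputStream;

public class RunScript
{
    public static void main(String[] args) throws Exception
    {
        // Stand-in command; substitute the real script path here.
        ProcessBuilder pb = new ProcessBuilder("bash", "-c", "echo done");
        pb.redirectErrorStream(true); // merge stderr into stdout

        Process shell = pb.start();
        InputStream shellIn = shell.getInputStream();

        // Drain the output FIRST, so the child can never block on a full pipe...
        int c;
        StringBuilder out = new StringBuilder();
        while ((c = shellIn.read()) != -1)
        {
            out.append((char) c);
        }
        shellIn.close();

        // ...and only then wait for the exit status.
        int exit = shell.waitFor();
        System.out.print(out);
        System.out.println("exit=" + exit);
    }
}
```

With a long-running script such as yours, reading the stream before waiting also means the Sqoop and Hive log lines appear on the console as they are produced, instead of never appearing at all.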
