2014-09-29
0

I am using CentOS 6 with Cloudera CDH 4.7. When I try to browse the file system from a browser via http://xxx.xxx.xxx:50070, I get the error given below: Cloudera Hadoop 500 error

HTTP ERROR 500 
Problem accessing /nn_browsedfscontent.jsp. Reason: 
    Cannot issue delegation token. Name node is in safe mode. 
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode.. 
Caused by: 
org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot issue delegation token. Name node is in safe mode. 
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode.. 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:5450) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:392) 
     at org.apache.hadoop.hdfs.server.namenode.NamenodeJspHelper$1.run(NamenodeJspHelper.java:435) 
     at org.apache.hadoop.hdfs.server.namenode.NamenodeJspHelper$1.run(NamenodeJspHelper.java:432) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:416) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438) 
     at org.apache.hadoop.hdfs.server.namenode.NamenodeJspHelper.getDelegationToken(NamenodeJspHelper.java:431) 
     at org.apache.hadoop.hdfs.server.namenode.NamenodeJspHelper.redirectToRandomDataNode(NamenodeJspHelper.java:462) 
     at org.apache.hadoop.hdfs.server.namenode.nn_005fbrowsedfscontent_jsp._jspService(nn_005fbrowsedfscontent_jsp.java:70) 
     at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:98) 
     at javax.servlet.http.HttpServlet.service(HttpServlet.java:820) 
     at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511) 
     at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1221) 
     at org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter.doFilter(StaticUserWebFilter.java:109) 
     at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212) 
     at org.apache.hadoop.http.HttpServer$QuotingInputFilter.doFilter(HttpServer.java:1069) 
     at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212) 
     at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) 
     at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212) 
     at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) 
     at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212) 
     at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399) 
     at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) 
     at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182) 
     at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766) 
     at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450) 
     at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230) 
     at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) 
     at org.mortbay.jetty.Server.handle(Server.java:326) 
     at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542) 
     at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928) 
     at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549) 
     at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212) 
     at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404) 
     at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410) 
     at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582) 

I tried to leave safe mode with the statement given below: "sudo -u hdfs hadoop dfsadmin -safemode leave", but that did not work either.

Please help me get past this obstacle.

+1

Check whether your NameNode has run out of main memory. – Shekhar 2014-09-29 05:41:38
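To act on this comment, here is a minimal Linux-only sketch for checking free memory on the NameNode host (reading /proc/meminfo is an assumption about the OS; Hadoop itself defines no such check for this error):

```shell
#!/bin/sh
# Rough check for the memory pressure the comment describes.
# MemFree in /proc/meminfo is reported in kB on Linux.
free_mem_kb=$(awk '/^MemFree:/ {print $2}' /proc/meminfo)
echo "Free memory on this host: $((free_mem_kb / 1024)) MB"
```

Note, though, that the stack trace above complains about low *resources* in the sense of disk space on the NameNode's storage volumes, not RAM.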

+0

Try restarting your NameNode and the other services. Then, if possible, format the NameNode and try again – 2014-09-29 05:55:27

Answers

0

Your NameNode is in safe mode.

You need to leave it:

hadoop dfsadmin -safemode leave 

Here is an explanation.
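One caveat the error message itself spells out: the NN will go straight back into safe mode unless disk space is freed first, so the order of operations matters. A dry-run sketch of that sequence (the `run` stub only prints each command; remove the wrapper to execute them for real, and the hdfs user/CDH4 command names are assumptions from the thread):

```shell
#!/bin/sh
# Dry-run stub: prints each command instead of executing it.
# Delete this wrapper (or change 'echo' to "$@") to run for real.
run() { echo "+ $*"; }

run sudo -u hdfs hadoop dfsadmin -safemode get    # confirm the NN is in safe mode
run df -h                                         # check local disk space on the NN host
run sudo -u hdfs hadoop fs -expunge               # example cleanup: empty the HDFS trash
run sudo -u hdfs hadoop dfsadmin -safemode leave  # only after space has been freed
```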

+0

After performing the following steps: 1) restarted the DataNode, NameNode, and Secondary NameNode; 2) formatted the NameNode; 3) started HDFS with 'for x in `cd /etc/init.d ; ls hadoop-hdfs-*` ; do sudo service $x start ; done', I then ran "sudo -u hdfs hadoop fs -mkdir /tmp". It gave me the following error: "mkdir: Cannot create directory /tmp. Name node is in safe mode". – user3292373 2014-10-01 03:41:07

+0

Just do "hadoop dfsadmin -safemode leave". No need to restart anything – 2014-10-01 05:15:29

0

Make sure your storage is not full. Delete some files to free up space; otherwise the NameNode will go back into safe mode again.
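The threshold behind this answer is dfs.namenode.resource.du.reserved: when free space on a NameNode storage volume drops below it (100 MB by default), the NN enters resource-low safe mode. A minimal sketch of the same comparison, assuming the NN metadata lives on the root filesystem (adjust the mount point to match your dfs.name.dir):

```shell
#!/bin/sh
# Reserved headroom the NameNode requires per storage volume:
# dfs.namenode.resource.du.reserved, 100 MB by default (expressed here in KB).
reserved_kb=$((100 * 1024))

# Free space on the volume holding the NN metadata ('/' assumed here).
free_kb=$(df -Pk / | awk 'NR==2 {print $4}')

if [ "$free_kb" -lt "$reserved_kb" ]; then
  echo "LOW: free up space before turning safe mode off"
else
  echo "OK: enough headroom for the NameNode"
fi
```

Only once this reports enough headroom is it safe to run "hadoop dfsadmin -safemode leave" without the NN immediately dropping back into safe mode.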