
I am using Cloudera Hadoop and have just added a node to an existing cluster, but I cannot start the HDFS role on that node. The command "Start" for service "hdfs" fails, and this is the exception I get:

FATAL org.apache.hadoop.hdfs.server.datanode.DataNode 
Exception in secureMain 

java.lang.InternalError 
at sun.security.ec.SunEC.initialize(Native Method) 
at sun.security.ec.SunEC.access$000(SunEC.java:49) 
at sun.security.ec.SunEC$1.run(SunEC.java:61) 
at sun.security.ec.SunEC$1.run(SunEC.java:58) 
at java.security.AccessController.doPrivileged(Native Method) 
at sun.security.ec.SunEC.<clinit>(SunEC.java:58) 
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
at java.lang.Class.newInstance(Class.java:383) 
at sun.security.jca.ProviderConfig$2.run(ProviderConfig.java:221) 
at sun.security.jca.ProviderConfig$2.run(ProviderConfig.java:206) 
at java.security.AccessController.doPrivileged(Native Method) 
at sun.security.jca.ProviderConfig.doLoadProvider(ProviderConfig.java:206) 
at sun.security.jca.ProviderConfig.getProvider(ProviderConfig.java:187) 
at sun.security.jca.ProviderList.getProvider(ProviderList.java:233) 
at sun.security.jca.ProviderList$ServiceList.tryGet(ProviderList.java:434) 
at sun.security.jca.ProviderList$ServiceList.access$200(ProviderList.java:376) 
at sun.security.jca.ProviderList$ServiceList$1.hasNext(ProviderList.java:486) 
at javax.crypto.KeyGenerator.nextSpi(KeyGenerator.java:339) 
at javax.crypto.KeyGenerator.<init>(KeyGenerator.java:169) 
at javax.crypto.KeyGenerator.getInstance(KeyGenerator.java:224) 
at org.apache.hadoop.security.token.SecretManager.<init>(SecretManager.java:143) 
at org.apache.hadoop.hdfs.security.token.block.BlockPoolTokenSecretManager.<init>(BlockPoolTokenSecretManager.java:36) 
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1076) 
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:411) 
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2301) 
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2188) 
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2235) 
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2411) 
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2435) 

If you know a solution, please help.


'sudo chmod 755 /path/to/new/Datanode/' - try granting permissions on the new DataNode directory –


It's already there! –

Answer


Just downgrade the OpenJDK version; that worked for me!
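
In case it helps others, here is a minimal sketch of how one might check and downgrade the JDK on the affected host. It assumes a RHEL/CentOS node managed by Cloudera Manager; the package names and service name below are examples and may differ on your system.

# Check which JDK the DataNode host is currently using
java -version
readlink -f "$(which java)"

# Downgrade the OpenJDK packages to the previous build available in the repository
sudo yum downgrade java-1.7.0-openjdk java-1.7.0-openjdk-headless

# If several JDKs are installed, point the system at the older one instead
sudo alternatives --config java

# Restart the Cloudera Manager agent so the role picks up the new JVM,
# then retry starting the HDFS role from Cloudera Manager
sudo service cloudera-scm-agent restart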
