I have a question about Hadoop on AWS accessing S3: Hadoop cannot access S3. This is my configuration:
<property>
  <name>fs.default.name</name>
  <value>s3n://testhadoophiveserver</value>
</property>
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>(I have filled this in)</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>(I have filled this in)</value>
</property>
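For comparison, this is a sketch of the alternative layout I understand is more common: keep HDFS as the default filesystem and only add the S3 credentials, so the bucket is addressed with explicit s3n:// URIs instead. The NameNode address hadoopmaster:9000 here is a placeholder, not taken from my actual setup:

<!-- Sketch only: HDFS stays the default filesystem; hadoopmaster:9000 is a placeholder NameNode address -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://hadoopmaster:9000</value>
</property>
<!-- S3 credentials unchanged, so paths like s3n://testhadoophiveserver/... can still be used explicitly -->
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>(filled in)</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>(filled in)</value>
</property>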
So when I run start-all.sh, I get an error like this:
hadoopmaster: Exception in thread "main" java.net.UnknownHostException: unknown host: testhadoophiveserver
hadoopmaster: at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:195)
hadoopmaster: at org.apache.hadoop.ipc.Client.getConnection(Client.java:850)
hadoopmaster: at org.apache.hadoop.ipc.Client.call(Client.java:720)
hadoopmaster: at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
hadoopmaster: at $Proxy4.getProtocolVersion(Unknown Source)
hadoopmaster: at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
hadoopmaster: at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:346)
hadoopmaster: at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:383)
hadoopmaster: at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:314)
However, if I use HDFS, it works fine. Right now I just can't use the S3 filesystem. Can anyone help me?
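In case it is useful: the check I would expect to pass once S3 access works is simply listing the bucket directly (bucket name as above):

hadoop fs -ls s3n://testhadoophiveserver/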