2016-11-17 39 views

Spark: reading Avro from S3 fails for partitioned data. It works when I point at a specific file:

val filePath= "s3n://bucket_name/f1/f2/avro/dt=2016-10-19/hr=19/000000"   
val df = spark.read.avro(filePath) 

But it fails if I point at a folder to read the date-partitioned data:

val filePath = "s3n://bucket_name/f1/f2/avro/dt=2016-10-19/"

I get this error:

Exception in thread "main" org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/f1%2Ff2%2Favro%2Fdt%3D2016-10-19' - ResponseCode=403, ResponseMessage=Forbidden 
at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.handleServiceException(Jets3tNativeFileSystemStore.java:245) 
at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:119) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:498) 
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186) 
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
at org.apache.hadoop.fs.s3native.$Proxy7.retrieveMetadata(Unknown Source) 
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.getFileStatus(NativeS3FileSystem.java:414) 
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397) 
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:374) 
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:364) 
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241) 
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241) 
at scala.collection.immutable.List.foreach(List.scala:381) 
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241) 
at scala.collection.immutable.List.flatMap(List.scala:344) 
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:364) 
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149) 
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:132) 
at com.databricks.spark.avro.package$AvroDataFrameReader$$anonfun$avro$2.apply(package.scala:34) 
at com.databricks.spark.avro.package$AvroDataFrameReader$$anonfun$avro$2.apply(package.scala:34) 
at BasicS3Avro$.main(BasicS3Avro.scala:55) 
at BasicS3Avro.main(BasicS3Avro.scala) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:498) 
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
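
Note that in the exception message the object key appears with its path separators percent-encoded (`/f1%2Ff2%2Favro%2Fdt%3D2016-10-19`), which is how the jets3t client formed the HEAD request. As a small illustration (not part of the original code), standard URL encoding produces exactly that form:

```scala
import java.net.URLEncoder

// The directory key from the failing request, encoded the way it
// appears in the S3Exception message above: '/' -> %2F, '=' -> %3D.
val key = "f1/f2/avro/dt=2016-10-19"
val encoded = URLEncoder.encode(key, "UTF-8")
println(encoded) // f1%2Ff2%2Favro%2Fdt%3D2016-10-19
```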

Am I missing something?


It looks like an authentication error; the service is responding with 403. Perhaps you should check the policy attached to the bucket. – devsprint


But the same credentials work when I access a specific file. Also, I can list the folder's contents with the aws command line. – JNish


OK, let me try to reproduce it locally... – devsprint

Answer


What does the updated, maintained s3a client report?
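
For reference, a minimal sketch of reading the same partitioned layout through the s3a connector instead of the legacy s3n one. This assumes the `hadoop-aws` and matching AWS SDK jars are on the classpath, that `spark` is the SparkSession from the question, and that credentials arrive via the standard `fs.s3a.access.key`/`fs.s3a.secret.key` properties (the environment-variable names here are illustrative):

```scala
import com.databricks.spark.avro._

// Sketch: switch the read from s3n:// to s3a://.
// Assumes hadoop-aws + AWS SDK jars are on the classpath.
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
hadoopConf.set("fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))

// Point at the partition directory; s3a performs the directory listing.
val df = spark.read.avro("s3a://bucket_name/f1/f2/avro/dt=2016-10-19/")
df.printSchema()
```

Running this requires a live Spark session and S3 credentials, so treat it as a starting point rather than a verified fix.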


I can reproduce the problem with Spark 2.0.0. Sorry, could you rephrase your question? @Steve Loughran – JNish