Below is my Flume configuration. Flume publishes the data to HDFS, but the written file has a character problem when read back.
a1.sources = r1
a1.sinks = k1
a1.channels = c1
a1.sources.r1.type = http
a1.sources.r1.port = 5140
a1.sources.r1.channels = c1
a1.sources.r1.handler = org.apache.flume.source.http.JSONHandler
a1.sources.r1.handler.nickname = random props
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
a1.sinks.k1.channel = c1
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://10.0.40.18:9160/flume-test
a1.sinks.k1.hdfs.filePrefix = events-
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute
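For context, the JSONHandler configured on the HTTP source above accepts a JSON array of event objects, each with a headers map and a body string, so a test event could be posted to the agent roughly like this (the hostname and event body are illustrative, not taken from the question):

curl -X POST http://localhost:5140 \
  -H 'Content-Type: application/json' \
  -d '[{"headers": {}, "body": "test event body"}]'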
There are no errors in the Flume log file, but there is a problem when reading the file with the hadoop command:
hadoop fs -cat hdfs://10.0.40.18:9160/flume-test/even1393415633931
The Flume log message says the HDFS file created is "hdfs://10.0.40.18:9160/flume-test/even1393415633931".
Any help is appreciated.
Please share the log message and clarify the problem; it is not clear from your question. – Jasper
The write to HDFS succeeds. The only problem is that the hadoop fs -cat command does not display readable characters; it looks like an encoding problem or setting. – user2775185
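A likely explanation, though not confirmed by anything in the question: the Flume HDFS sink writes SequenceFiles by default (hdfs.fileType = SequenceFile), so hadoop fs -cat shows the binary SequenceFile framing around each event rather than plain text. A minimal sketch of the extra sink settings that would write the raw event body as text, assuming the same agent and sink names as in the configuration above:

# Write the raw event body instead of the default binary SequenceFile
a1.sinks.k1.hdfs.fileType = DataStream
# writeFormat only applies to SequenceFile records, but Text is a safe value here
a1.sinks.k1.hdfs.writeFormat = Text

One quick check: if the first bytes of the existing file are "SEQ", it was written as a SequenceFile and the garbled cat output is expected.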