I am using the following configuration to push data from a log file to HDFS, but the data is never transferred from the channel to HDFS (nothing appears in the Hadoop logs either):
agent.channels.memory-channel.type = memory
agent.channels.memory-channel.capacity = 5000
agent.sources.tail-source.type = exec
agent.sources.tail-source.command = tail -F /home/training/Downloads/log.txt
agent.sources.tail-source.channels = memory-channel
agent.sinks.log-sink.channel = memory-channel
agent.sinks.log-sink.type = logger
agent.sinks.hdfs-sink.channel = memory-channel
agent.sinks.hdfs-sink.type = hdfs
agent.sinks.hdfs-sink.batchSize = 10
agent.sinks.hdfs-sink.hdfs.path = hdfs://localhost:8020/user/flume/data/log.txt
agent.sinks.hdfs-sink.hdfs.fileType = DataStream
agent.sinks.hdfs-sink.hdfs.writeFormat = Text
agent.channels = memory-channel
agent.sources = tail-source
agent.sinks = log-sink hdfs-sink
I am not getting any error messages, but I still cannot find any output in HDFS. When I interrupt the agent, I can see a sink interruption exception along with some of the data from the log file. I am running the agent with the following command:

flume-ng agent --conf /etc/flume-ng/conf/ --conf-file /etc/flume-ng/conf/flume.conf -Dflume.root.logger=DEBUG,console -n agent
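For reference, a likely fix (an assumption based on the config above, not verified against this cluster): the HDFS sink's path property should name a directory rather than a file, the HDFS sink's batch size is spelled `hdfs.batchSize`, and two sinks draining one channel each take only a share of the events, so the HDFS sink sees just part of the stream. A minimal sketch using a separate channel per sink:

```properties
# Hypothetical corrected layout: one channel per sink, so the
# logger sink does not consume events meant for the HDFS sink.
agent.channels = mem-log mem-hdfs
agent.sources = tail-source
agent.sinks = log-sink hdfs-sink

agent.channels.mem-log.type = memory
agent.channels.mem-hdfs.type = memory
agent.channels.mem-hdfs.capacity = 5000

agent.sources.tail-source.type = exec
agent.sources.tail-source.command = tail -F /home/training/Downloads/log.txt
# With the default replicating selector, each event is copied to both channels.
agent.sources.tail-source.channels = mem-log mem-hdfs

agent.sinks.log-sink.type = logger
agent.sinks.log-sink.channel = mem-log

agent.sinks.hdfs-sink.type = hdfs
agent.sinks.hdfs-sink.channel = mem-hdfs
# hdfs.path must be a directory; Flume creates FlumeData.* files inside it.
agent.sinks.hdfs-sink.hdfs.path = hdfs://localhost:8020/user/flume/data/
agent.sinks.hdfs-sink.hdfs.fileType = DataStream
agent.sinks.hdfs-sink.hdfs.writeFormat = Text
agent.sinks.hdfs-sink.hdfs.batchSize = 10
# Roll files quickly while testing so output becomes visible sooner.
agent.sinks.hdfs-sink.hdfs.rollInterval = 10
agent.sinks.hdfs-sink.hdfs.rollSize = 0
agent.sinks.hdfs-sink.hdfs.rollCount = 0
```

After restarting the agent, the output can be checked with `hdfs dfs -ls /user/flume/data/`. Also note that the Java system property must be passed without spaces, i.e. `-Dflume.root.logger=DEBUG,console`, or the flag is not parsed.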
You misunderstood my question at first.. by now this is old history. – 2015-09-09 07:37:39