I have successfully started the flume-agent, but I cannot see the log files in HDFS.
The path I set in twitter.conf is:
TwitterAgent.sinks.HDFS.hdfs.path = hdfs://localhost:9000/user/flume/tweets/
Please help me resolve this error so I can view the data in my HDFS.
Best Answer
If you have set your Hadoop home in .bashrc as
export HADOOP_HOME=<Path to your hadoop home>
then the localhost:9000 part is not needed in the line below:
TwitterAgent.sinks.HDFS.hdfs.path = hdfs://localhost:9000/user/flume/tweets/
So the correct line should be:
TwitterAgent.sinks.HDFS.hdfs.path = hdfs:///user/flume/tweets/
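The hdfs:/// shorthand works because Hadoop resolves it against the default filesystem (fs.defaultFS) configured in core-site.xml. As a quick sanity check, you can ask Hadoop what that resolves to (a standard command on Hadoop 2.x and later):
# Print the default filesystem URI that your configuration actually uses
hdfs getconf -confKey fs.defaultFS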
Given that your twitter.conf looks like the one below, it should work:
# Naming the components on the current agent.
TwitterAgent.sources = Twitter
TwitterAgent.channels = MemChannel
TwitterAgent.sinks = HDFS
# Describing/Configuring the source
TwitterAgent.sources.Twitter.type = org.apache.flume.source.twitter.TwitterSource
TwitterAgent.sources.Twitter.consumerKey = Your OAuth consumer key
TwitterAgent.sources.Twitter.consumerSecret = Your OAuth consumer secret
TwitterAgent.sources.Twitter.accessToken = Your OAuth consumer key access token
TwitterAgent.sources.Twitter.accessTokenSecret = Your OAuth consumer key access token secret
TwitterAgent.sources.Twitter.keywords = tutorials point,java, bigdata, mapreduce, mahout, hbase, nosql
# Describing/Configuring the sink
TwitterAgent.sinks.HDFS.type = hdfs
TwitterAgent.sinks.HDFS.hdfs.path = hdfs:///user/flume/tweets/
TwitterAgent.sinks.HDFS.hdfs.fileType = DataStream
TwitterAgent.sinks.HDFS.hdfs.writeFormat = Text
TwitterAgent.sinks.HDFS.hdfs.batchSize = 1000
TwitterAgent.sinks.HDFS.hdfs.rollSize = 0
TwitterAgent.sinks.HDFS.hdfs.rollCount = 10000
# Describing/Configuring the channel
TwitterAgent.channels.MemChannel.type = memory
TwitterAgent.channels.MemChannel.capacity = 10000
TwitterAgent.channels.MemChannel.transactionCapacity = 100
# Binding the source and sink to the channel
TwitterAgent.sources.Twitter.channels = MemChannel
TwitterAgent.sinks.HDFS.channel = MemChannel
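Also make sure the sink path exists in HDFS and is writable by the user running the agent. A rough pre-flight check could be (the 'flume' user below is only an example, adjust it to whichever user starts the agent):
# Create the sink path if it does not exist yet
hdfs dfs -mkdir -p /user/flume/tweets/
# Only needed if the agent runs as a different user, e.g. 'flume' (example user)
hdfs dfs -chown -R flume /user/flume/tweets/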
Your command, run from your Flume home directory, should look like:
bin/flume-ng agent --conf ./conf/ -f conf/twitter.conf -Dflume.root.logger=DEBUG,console -n TwitterAgent
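Once the agent is up and events are flowing, files should start appearing under the sink path (by default the HDFS sink names them with the FlumeData prefix). A minimal check:
# List the rolled files written by the sink
hdfs dfs -ls /user/flume/tweets/
# Peek at the contents (assumes the default FlumeData file prefix)
hdfs dfs -cat /user/flume/tweets/FlumeData.* | head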
You can refer to Tutorials Point for a better understanding.
Note: You can always debug by looking for the exact error in flume.log in Flume's log directory.
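For example, assuming the default logging setup where Flume writes to logs/flume.log under the Flume home directory (the exact location depends on your log4j.properties):
# Follow the agent log while it runs
tail -f $FLUME_HOME/logs/flume.log
# Or search for the first error/exception
grep -iE "error|exception" $FLUME_HOME/logs/flume.log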
Regarding hadoop - Unable to view log files when browsing HDFS on localhost, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44099880/