A quick question: does installing Flume require a Hadoop environment?


seng (works in BI, big data, and data analysis) - answered 2016-04-04

Flume on its own does not need Hadoop, but if you want to write to HDFS you do need the relevant Hadoop jars.
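In practice, the simplest way to satisfy that dependency is to have a Hadoop client installation on the Flume host: the flume-ng launcher script looks for a hadoop command (or HADOOP_HOME) and adds the Hadoop jars to Flume's classpath. A minimal sketch, assuming Hadoop is installed under /opt/hadoop and the agent is named a1 (both are assumptions):

    # Assumption: a Hadoop client installation at /opt/hadoop on the Flume host.
    # With HADOOP_HOME set (or `hadoop` on the PATH), the flume-ng script
    # picks up the Hadoop jars so the HDFS sink can talk to the cluster.
    export HADOOP_HOME=/opt/hadoop
    export PATH=$HADOOP_HOME/bin:$PATH

    bin/flume-ng agent --name a1 --conf conf --conf-file conf/hdfs-sink.properties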

The Flume documentation puts it like this:
This sink writes events into the Hadoop Distributed File System (HDFS). It currently supports creating text and sequence files. It supports compression in both file types. The files can be rolled (close current file and create a new one) periodically based on the elapsed time or size of data or number of events. It also buckets/partitions data by attributes like timestamp or machine where the event originated. The HDFS directory path may contain formatting escape sequences that will be replaced by the HDFS sink to generate a directory/file name to store the events. Using this sink requires hadoop to be installed so that Flume can use the Hadoop jars to communicate with the HDFS cluster. Note that a version of Hadoop that supports the sync() call is required.
For details, see the documentation: http://flume.apache.org/FlumeUserGuide.html
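For reference, here is a minimal agent configuration with an HDFS sink, matching the rolling and bucketing behavior described in the quote above. The agent name a1, the netcat source, and the HDFS path are assumptions; adapt them to your setup:

    # Assumed agent name "a1"; a netcat source is used purely for demonstration.
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    a1.sources.r1.type = netcat
    a1.sources.r1.bind = localhost
    a1.sources.r1.port = 44444
    a1.sources.r1.channels = c1

    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 10000

    a1.sinks.k1.type = hdfs
    a1.sinks.k1.channel = c1
    # Escape sequences like %Y-%m-%d are expanded by the HDFS sink to bucket events.
    a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
    a1.sinks.k1.hdfs.useLocalTimeStamp = true
    # Roll the file when any threshold is hit: every 300 s or every 128 MB
    # (rollCount = 0 disables rolling by event count).
    a1.sinks.k1.hdfs.rollInterval = 300
    a1.sinks.k1.hdfs.rollSize = 134217728
    a1.sinks.k1.hdfs.rollCount = 0
    # Plain text output; SequenceFile is the default fileType.
    a1.sinks.k1.hdfs.fileType = DataStream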
