Integrating Hadoop with Flume

November 27, 2018
Copyright notice: This is an original post by the author; do not reproduce without permission. https://blog.csdn.net/chukun123/article/details/78390709

Install Flume
1. Copy flume-ng-1.5.0-cdh5.3.6.tar.gz to the /usr/local directory on the spark1 node.
2. Extract the archive:

tar -zxvf flume-ng-1.5.0-cdh5.3.6.tar.gz

3. Rename the extracted Flume directory:

mv apache-flume-1.5.0-cdh5.3.6-bin flume

4. Configure the Flume environment variables:

vi ~/.bashrc
export FLUME_HOME=/usr/local/flume
export FLUME_CONF_DIR=$FLUME_HOME/conf
export PATH=$PATH:$FLUME_HOME/bin
source ~/.bashrc
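
To confirm the installation and the PATH change, you can print the Flume version (the exact version string depends on your build):

flume-ng version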

Edit the Flume configuration file

vi /usr/local/flume/conf/flume-conf.properties

#agent1 is the agent name
agent1.sources=source1
agent1.sinks=sink1
agent1.channels=channel1
#configure source1
agent1.sources.source1.type=spooldir
agent1.sources.source1.spoolDir=/usr/local/logs
agent1.sources.source1.channels=channel1
agent1.sources.source1.fileHeader = false
agent1.sources.source1.interceptors = i1
agent1.sources.source1.interceptors.i1.type = timestamp

#configure channel1
agent1.channels.channel1.type=file
agent1.channels.channel1.checkpointDir=/usr/local/logs_tmp_cp
agent1.channels.channel1.dataDirs=/usr/local/logs_tmp

#configure sink1
agent1.sinks.sink1.type=hdfs
agent1.sinks.sink1.hdfs.path=hdfs://spark1:9000/logs
agent1.sinks.sink1.hdfs.fileType=DataStream
agent1.sinks.sink1.hdfs.writeFormat=TEXT
agent1.sinks.sink1.hdfs.rollInterval=1
agent1.sinks.sink1.channel=channel1
agent1.sinks.sink1.hdfs.filePrefix=%Y-%m-%d
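
The %Y-%m-%d escapes in hdfs.filePrefix are resolved from a timestamp header on each event, which is what the timestamp interceptor (i1) on source1 provides. As a possible alternative, assuming your Flume build supports it, the sink can take the local machine time instead of relying on the interceptor:

agent1.sinks.sink1.hdfs.useLocalTimeStamp=true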

Start Flume

flume-ng agent -n agent1 -c conf -f /usr/local/flume/conf/flume-conf.properties -Dflume.root.logger=DEBUG,console
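
Running with -Dflume.root.logger=DEBUG,console keeps the agent in the foreground, which is convenient for testing. For a longer-running agent you could push it to the background with standard shell tooling, for example (the log file path here is only an illustration):

nohup flume-ng agent -n agent1 -c /usr/local/flume/conf -f /usr/local/flume/conf/flume-conf.properties > /usr/local/flume/flume.log 2>&1 &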

Test Flume

Create the local log directory (the spooling directory source expects this directory to exist before the agent starts):

mkdir /usr/local/logs

Create the HDFS target directory:

hdfs dfs -mkdir /logs

Create a file and move it into /usr/local/logs; Flume will pick it up automatically and upload it to the /logs directory on HDFS.
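
As a minimal end-to-end check (the file name test.log is just an example; the spooling directory source marks processed files with a .COMPLETED suffix by default):

echo "hello flume" > /tmp/test.log
mv /tmp/test.log /usr/local/logs/
# wait a few seconds, then check HDFS
hdfs dfs -ls /logs
hdfs dfs -cat /logs/*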


At this point, the Hadoop and Flume integration is complete.
