Output log files to a Kafka consumer through Logstash

created at 08-03-2021


Download from the official website:

wget https://download.elastic.co/logstash/logstash/logstash-2.3.2.tar.gz

Unzip after downloading:

tar -xvf logstash-2.3.2.tar.gz

After unpacking, rename the directory to logstash and move it to /usr/local:

mv logstash-2.3.2 logstash
mv logstash /usr/local


Remember to install a Java 8 JDK first!

First, create a kafka/ folder under /usr/local to store kafka and zookeeper:

mkdir /usr/local/kafka

After entering the kafka folder, download:

wget https://mirrors.tuna.tsinghua.edu.cn/apache/zookeeper/zookeeper-3.6.3/apache-zookeeper-3.6.3-bin.tar.gz

Unzip after downloading:

tar -xvf apache-zookeeper-3.6.3-bin.tar.gz

Create a soft link:

ln -s apache-zookeeper-3.6.3-bin zookeeper

Go to ZooKeeper's conf/ folder and create a zoo.cfg file by copying zoo_sample.cfg:

cp zoo_sample.cfg ./zoo.cfg

Remember that ZooKeeper's default port is 2181; it will be needed later.
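For reference, the key lines of the resulting zoo.cfg (these are the defaults shipped in zoo_sample.cfg) look like:

```
# zoo.cfg -- defaults copied from zoo_sample.cfg
# the basic time unit in milliseconds
tickTime=2000
# ticks a follower may take to connect and sync with the leader
initLimit=10
# ticks a follower may lag behind the leader
syncLimit=5
# snapshot directory; change to a persistent location in production
dataDir=/tmp/zookeeper
# the port clients (and Kafka) connect to -- the 2181 mentioned above
clientPort=2181
```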

Start zookeeper

From the zookeeper/ folder, the common commands are:

  • start: ./bin/zkServer.sh start
  • stop: ./bin/zkServer.sh stop
  • restart: ./bin/zkServer.sh restart
  • check status: ./bin/zkServer.sh status

Enter the start command:

./bin/zkServer.sh start

Run the status command to confirm that ZooKeeper has started; if it has not, restart it.


Go to the /usr/local/kafka folder.

Download kafka:

wget https://mirrors.bfsu.edu.cn/apache/kafka/2.8.0/kafka_2.13-2.8.0.tgz

Unzip after downloading:

tar -xvf kafka_2.13-2.8.0.tgz

Create a soft link:

ln -s kafka_2.13-2.8.0 kafka

Configure server.properties:

vim kafka/config/server.properties

There are two points to note in server.properties: the address the broker listens on (listeners) and the ZooKeeper connection (zookeeper.connect). Both are configured according to your actual environment.
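Concretely, the relevant lines in server.properties look something like this (a single-machine sketch; the host names are assumptions):

```
# the address the broker binds to and that clients connect to
listeners=PLAINTEXT://localhost:9092
# where the broker finds ZooKeeper -- must match the instance started above
zookeeper.connect=localhost:2181
```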

Start the relevant tools and complete the output

Start kafka and create a topic

start kafka:

bin/kafka-server-start.sh -daemon config/server.properties

Create a topic:

bin/kafka-topics.sh --create --topic suricata-http-log --replication-factor 1 --partitions 1 --zookeeper localhost:2181

  • suricata-http-log: the topic name
  • localhost:2181: the host and port of ZooKeeper

These two items need to be modified to match your own environment.

Create logstash input and output configuration files

The location is arbitrary, such as in my personal folder:

vim /home/canvas/Test/logstash-kafka.conf

The conf file is created.

The content written in it is as follows:
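The original screenshot is not reproduced here; based on the description that follows, a sketch of the file might look like this (the source file path is an assumption, and older versions of the logstash-output-kafka plugin use broker_list instead of bootstrap_servers):

```
input {
  file {
    # the source log file to watch (assumed path)
    path => "/var/log/suricata/eve-http.json"
    start_position => "beginning"
  }
}

output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "suricata-http-log"
    # codec => json   # commented out -- see the note on the output format
  }
}
```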


The input here specifies the path of the source file, eve-http.json, whose contents I want to ship.

The output here indicates that we want to write to Kafka.

Note: The codec was originally specified as json, but the output format turned out to be wrong, so I commented it out. The problem may be that my source file is already in JSON format.

Start a Kafka consumer

Although the producer is not needed in this case, here is the command to start one anyway:

Start producer

bin/kafka-console-producer.sh --broker-list localhost:9092 --sync --topic suricata-http-log

Start consumer

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic suricata-http-log --from-beginning

Enter a string in the producer terminal and press Enter; the consumer will print the same string.

Start logstash

logstash/bin/logstash -f /home/canvas/Test/logstash-kafka.conf

The following prompt indicates successful startup:

(screenshot: successful startup of Logstash)

After startup succeeds, whenever a new log line is written to eve-http.json (the input file path configured in /home/canvas/Test/logstash-kafka.conf), the Kafka consumer side will output it synchronously:

(screenshot: synchronous output of the log)

The red line in the screenshot marks where the new output begins; the newly arrived content is the part after it, and it is consistent with the content of the source log file.
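Each Kafka message delivered this way is one line of the original file, i.e. one standalone JSON object. A downstream consumer can therefore parse messages with a plain JSON parser; a minimal sketch in Python, using an invented sample event shaped like Suricata's EVE format:

```python
import json

# Hypothetical sample message, shaped like a Suricata eve-http.json line
# (the field names follow the EVE format; the values are invented).
message = ('{"timestamp": "2021-03-08T12:00:00", "event_type": "http", '
           '"http": {"hostname": "example.com", "url": "/index.html"}}')

# One json.loads per message is enough, since each line is a complete object.
event = json.loads(message)
print(event["event_type"])        # prints: http
print(event["http"]["hostname"])  # prints: example.com
```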

This completes our task.

edited at: 08-03-2021