Good afternoon,

I'm sharing the second part of the previous article: a demo where I show how to use a Kafka consumer connected to a scalable instance of an Elasticsearch server.
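The core idea of the demo — turning the records a Kafka consumer receives into Elasticsearch bulk requests — can be sketched in a few lines. This is a minimal illustration, not the demo's actual code: the record shape, the index name and the field names are all assumptions.

```python
import json

def records_to_bulk(records, index="demo"):
    """Build an Elasticsearch _bulk NDJSON body from consumed Kafka records.

    Each record is assumed to be a dict with a 'key' and an already
    deserialized 'value', mirroring what a Kafka consumer would hand us.
    """
    lines = []
    for rec in records:
        # Action line: tells Elasticsearch which index/id the document goes to.
        lines.append(json.dumps({"index": {"_index": index, "_id": rec["key"]}}))
        # Source line: the document itself, taken from the record value.
        lines.append(json.dumps(rec["value"]))
    # The bulk API requires a trailing newline after the last line.
    return "\n".join(lines) + "\n"
```

The resulting NDJSON body is what a consumer would POST to Elasticsearch's `_bulk` endpoint (by default on port 9200).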

This is the link to the GitHub project. Download it and run the following command:

docker-compose up

This command will start the custom Kafka producer microservice named demo-quartz, the custom Kafka consumer microservice named demo-kafka-elastic, a Kafka container node, a ZooKeeper container node, an Elasticsearch container node and, finally, a kibana-sense container node.
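As a rough sketch, the compose file for such a stack might look like the following — the images, versions and build paths here are assumptions, so check the actual docker-compose.yml in the repository:

```yaml
version: "2"
services:
  zookeeper:
    image: wurstmeister/zookeeper   # assumed image
    ports: ["2181:2181"]
  kafka:
    image: wurstmeister/kafka       # assumed image
    depends_on: [zookeeper]
    ports: ["9092:9092"]
  elasticsearch:
    image: elasticsearch            # assumed image/version
    ports: ["9200:9200"]
  kibana-sense:
    image: s12v/kibana-sense        # assumed image
    depends_on: [elasticsearch]
    ports: ["5601:5601"]
  demo-quartz:
    build: ./demo-quartz            # producer microservice (path assumed)
    depends_on: [kafka]
  demo-kafka-elastic:
    build: ./demo-kafka-elastic     # consumer microservice (path assumed)
    depends_on: [kafka, elasticsearch]
```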

You may also want to run the Cerebro client app to see how the documents are loaded into the Elasticsearch node.

By the way, you may run into a little problem with Docker images piling up — you can end up with a lot of them! :). You can delete them all with this command:

docker rmi -f $(docker images -q)

Have fun!
