Apache Kafka – How to Set Up, Start and Test?

How do you set up and test Apache Kafka? Before jumping into that, let's first understand what Kafka is. Apache Kafka is a distributed streaming platform used to publish and subscribe to streams of records. Kafka maintains its records in commit logs and can process large volumes of data with high performance.

Apache Kafka can be used for maintaining large logs, messaging, passing data objects and notifications, auditing large volumes of data, and so on. Stream processing is another area where Apache Kafka fits well.

Kafka has four core APIs:

  • The Producer API allows an application to publish a stream of records to one or more Kafka topics (a minimal Java sketch follows this list).
  • The Consumer API allows an application to subscribe to one or more topics and process the stream of records produced to them.
  • The Streams API allows an application to act as a stream processor, consuming an input stream from one or more topics and producing an output stream to one or more output topics, effectively transforming the input streams to output streams.
  • The Connector API allows building and running reusable producers or consumers that connect Kafka topics to existing applications or data systems. For example, a connector to a relational database might capture every change to a table.
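To make the Producer API above concrete, here is a minimal Java sketch that publishes a single record. It is only a sketch: it assumes the kafka-clients library is on the classpath, a broker running on localhost:9092, and the sampleTopic topic created later in this article; the class name SampleProducer is purely illustrative.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class SampleProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Broker address; matches the default localhost:9092 used in this article
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Publish one record to the "sampleTopic" topic
                producer.send(new ProducerRecord<>("sampleTopic", "key-1", "hello from the Producer API"));
                producer.flush();
            }
        }
    }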

How to set up Apache Kafka?

Let’s jump to how we can install and run it, because that’s what we are here for 🙂 Follow these simple steps to install Apache Kafka and set up topics.

NOTE: The following steps are written for Linux-based operating systems. On Windows, replace the “.sh” scripts with the corresponding “.bat” scripts found under bin\windows.

Install and Run Apache Kafka

  • Download Apache Kafka from https://kafka.apache.org/downloads
  • Extract the archive and go to the installation’s home directory (referred to here as $kafka_home)
  • In a new terminal, start ZooKeeper using
    ./bin/zookeeper-server-start.sh config/zookeeper.properties
  • In a new terminal, start the Kafka server using the following command (a quick Java check that the broker is reachable follows this list)
    ./bin/kafka-server-start.sh config/server.properties
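With ZooKeeper and the Kafka broker running, you can optionally confirm from Java that the broker is reachable. Below is a minimal sketch using the AdminClient from the kafka-clients library; it assumes the broker is listening on the default localhost:9092, and the class name ClusterCheck is purely illustrative.

    import java.util.Properties;
    import java.util.concurrent.ExecutionException;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.DescribeClusterResult;

    public class ClusterCheck {
        public static void main(String[] args) throws ExecutionException, InterruptedException {
            Properties props = new Properties();
            // The default server.properties makes the broker listen on port 9092
            props.put("bootstrap.servers", "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                DescribeClusterResult cluster = admin.describeCluster();
                System.out.println("Cluster id: " + cluster.clusterId().get());
                System.out.println("Brokers   : " + cluster.nodes().get());
            }
        }
    }

If this prints a cluster id and at least one broker node, the server started correctly.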

Create and list topics

  • In a new terminal, run the following command to create a new Kafka topic (a programmatic equivalent using the AdminClient follows this list)
    ./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic sampleTopic
  • To list the available topics, run the following command
    ./bin/kafka-topics.sh --list --zookeeper localhost:2181
  • To describe the details of an existing Kafka topic, run the following command
    ./bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic sampleTopic

NOTE: On newer Kafka versions (3.0 and above), the --zookeeper option has been removed from kafka-topics.sh; use --bootstrap-server localhost:9092 instead.
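The same operations can also be done programmatically. Here is a minimal sketch using the AdminClient API that creates and lists topics; it mirrors the CLI commands above, assumes a broker on localhost:9092, and uses the same sampleTopic name (the class name TopicAdmin is purely illustrative).

    import java.util.Collections;
    import java.util.Properties;
    import java.util.concurrent.ExecutionException;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;

    public class TopicAdmin {
        public static void main(String[] args) throws ExecutionException, InterruptedException {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // Create "sampleTopic" with 1 partition and replication factor 1,
                // mirroring the kafka-topics.sh --create command above
                NewTopic topic = new NewTopic("sampleTopic", 1, (short) 1);
                admin.createTopics(Collections.singleton(topic)).all().get();

                // List the topics that now exist on the broker
                System.out.println(admin.listTopics().names().get());
            }
        }
    }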

Create Kafka Producer and Consumer

  • In a new terminal, start a Kafka console producer using the following command
    ./bin/kafka-console-producer.sh --broker-list localhost:9092 --topic sampleTopic
  • In a new terminal, start a Kafka console consumer using the following command (a programmatic Java consumer sketch follows this list)
    ./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic sampleTopic --from-beginning
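In applications, the console tools are usually replaced with the Consumer API. Below is a minimal Java consumer sketch that reads sampleTopic from the beginning and prints each record; it assumes the kafka-clients library on the classpath, a broker on localhost:9092, and a hypothetical consumer group name sample-group.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class SampleConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "sample-group");      // hypothetical consumer group name
            props.put("auto.offset.reset", "earliest"); // same effect as --from-beginning for a new group
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singleton("sampleTopic"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset=%d key=%s value=%s%n",
                                record.offset(), record.key(), record.value());
                    }
                }
            }
        }
    }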

Test the setup

  • Go to the producer terminal and type a few messages; they should appear in the consumer terminal

In another article, I have covered how you can integrate your application with Kafka topics using a Spring-based API. Please read it at http://www.hybriscx.com/apache-kafka/apache-kafka-how-to-integrate-with-spring-boot-rest-api/

Thank you. Please feel free to post your questions in the comments section.
