Building Scalable Microservices Architecture with Node.js and Kafka

Learn how to build a scalable microservices architecture using Node.js and Apache Kafka. This beginner-friendly tutorial covers core concepts, setup, and example code to get you started.

Microservices are an architectural style that structures an application as a collection of small services, each running in its own process and communicating with lightweight mechanisms. Node.js, with its asynchronous and event-driven nature, is a perfect fit for building microservices. When combined with Apache Kafka, a distributed event streaming platform, you can build scalable and resilient microservices that can handle high throughput and real-time communication.

In this tutorial, you will learn how to set up a simple Node.js microservices application using Kafka to communicate between services. We will build two microservices: a Producer service that sends messages and a Consumer service that receives and processes them.

### Prerequisites

- Node.js installed (LTS version recommended)
- Kafka installed and running locally (you can use services like Confluent Cloud or Docker to run Kafka)
- Basic knowledge of JavaScript and Node.js

### Step 1: Set Up Kafka

If you don't have Kafka running yet, one of the easiest ways to start it is with Docker:

```bash
docker run -d --name zookeeper -p 2181:2181 confluentinc/cp-zookeeper:latest

docker run -d --name kafka -p 9092:9092 --link zookeeper:zookeeper \
  -e KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 \
  -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://localhost:9092 \
  -e KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1 \
  confluentinc/cp-kafka:latest
```
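Depending on the broker's `auto.create.topics.enable` setting, the `test-topic` topic used later in this tutorial may not be created automatically on first use. To be safe, you can create it explicitly with the `kafka-topics` CLI that ships inside the Confluent image (a sketch, assuming the container is named `kafka` as above):

```shell
# Create the topic used by this tutorial: one partition, one replica,
# which matches the single-broker setup started above.
docker exec kafka kafka-topics --create \
  --topic test-topic \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1
```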

### Step 2: Initialize the Node.js Projects

Create two folders: `producer-service` and `consumer-service`. Inside each folder, initialize a new Node.js project:

```bash
mkdir producer-service && cd producer-service
npm init -y

mkdir ../consumer-service && cd ../consumer-service
npm init -y
```

### Step 3: Install a Kafka Client

We will use the popular `kafkajs` library to interact with Kafka. Install it in both services:

```bash
cd ../producer-service
npm install kafkajs

cd ../consumer-service
npm install kafkajs
```

### Step 4: Create the Producer Service

Create a file named `producer.js` in the `producer-service` folder with the following content:

```javascript
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'producer-service',
  brokers: ['localhost:9092'],
});

const producer = kafka.producer();

const run = async () => {
  await producer.connect();
  console.log('Producer connected');

  // Send a message every 5 seconds
  setInterval(async () => {
    const message = { value: `Hello Kafka! Time: ${new Date().toISOString()}` };
    try {
      await producer.send({
        topic: 'test-topic',
        messages: [message],
      });
      console.log('Sent message:', message.value);
    } catch (error) {
      console.error('Failed to send message', error);
    }
  }, 5000);
};

run().catch(console.error);
```
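The producer above sends unkeyed string values. In practice you will often send structured payloads with a message key, since Kafka routes messages with the same key to the same partition, preserving per-key ordering. A minimal sketch (the `buildOrderMessage` helper and its field names are illustrative, not part of kafkajs):

```javascript
// Illustrative helper: wrap a JS object as a keyed Kafka message.
// Messages sharing a key land on the same partition, so events for
// one order are consumed in the order they were produced.
function buildOrderMessage(orderId, payload) {
  return {
    key: String(orderId),           // partitioning key (a string or Buffer)
    value: JSON.stringify(payload), // Kafka values are bytes; serialize to JSON text
  };
}

// Usage with the producer above:
// await producer.send({
//   topic: 'test-topic',
//   messages: [buildOrderMessage(42, { status: 'created' })],
// });
```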

### Step 5: Create the Consumer Service

Create a file named `consumer.js` in the `consumer-service` folder with this code:

```javascript
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'consumer-service',
  brokers: ['localhost:9092'],
});

const consumer = kafka.consumer({ groupId: 'test-group' });

const run = async () => {
  await consumer.connect();
  console.log('Consumer connected');

  await consumer.subscribe({ topic: 'test-topic', fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`Received message: ${message.value.toString()}`);
    },
  });
};

run().catch(console.error);
```
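Note that `message.value` arrives as a Node.js `Buffer`, and it can be `null` for tombstone records (a key with no value, used on log-compacted topics). If your producers send JSON, a small defensive decoder is useful; a sketch (`decodeMessage` is a hypothetical helper, not part of kafkajs):

```javascript
// Illustrative helper: safely decode a Kafka message value.
// Returns null for tombstones, a parsed object for JSON payloads,
// and the raw string for anything else.
function decodeMessage(value) {
  if (value === null) return null;   // tombstone record: key with no value
  const text = value.toString('utf8');
  try {
    return JSON.parse(text);         // structured JSON payload
  } catch {
    return text;                     // plain-text payload, return as-is
  }
}

// Inside eachMessage you could then write:
// const payload = decodeMessage(message.value);
```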

### Step 6: Run the Services

Make sure your Kafka server is running, then open two terminal windows or tabs:

- In the first one, start the consumer service:

  ```bash
  node consumer.js
  ```

- In the second, start the producer service:

  ```bash
  node producer.js
  ```

You should see the producer sending messages every 5 seconds and the consumer receiving them in real time.

### Conclusion

In this tutorial, we built a basic scalable microservices architecture with Node.js and Kafka by creating two services that communicate asynchronously via Kafka topics. As your application grows, you can add more microservices interacting through Kafka, benefiting from Kafka's robustness and scalability. This architecture decouples services and lets them scale independently, a key principle for modern applications.

For production use, consider adding error handling and monitoring, and deploying Kafka as a multi-broker cluster instead of a single instance. You can also explore defining schemas for Kafka messages with tools like Avro.
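As a first step toward that kind of robustness, both services should disconnect from Kafka cleanly when the process is stopped, so the broker can rebalance consumer groups promptly instead of waiting for a session timeout. A minimal sketch (`registerShutdown` is a hypothetical helper; the `exit` parameter only exists to make the behavior observable in isolation):

```javascript
// Illustrative: register signal handlers that disconnect a kafkajs
// client (producer or consumer) before the process exits.
// client.disconnect() leaves the consumer group / flushes the producer.
function registerShutdown(client, exit = process.exit) {
  const shutdown = async (signal) => {
    console.log(`Received ${signal}, disconnecting...`);
    try {
      await client.disconnect();
    } finally {
      exit(0);
    }
  };
  process.on('SIGINT', () => shutdown('SIGINT'));
  process.on('SIGTERM', () => shutdown('SIGTERM'));
  return shutdown;
}

// Usage: registerShutdown(consumer); // or registerShutdown(producer);
```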