Kafka Stream Processor is a simple yet powerful application for producing and consuming messages with Apache Kafka. It is designed to simplify sending and receiving messages, and it supports message compression, error handling, and detailed logging to keep the data flow reliable and traceable.
- Produce and consume messages with Apache Kafka
- Detailed logging for monitoring and debugging
- Message compression using GZIP
- Graceful shutdown and error handling
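The GZIP feature compresses each message payload before it is produced (with a client such as kafkajs this is typically enabled by passing `CompressionTypes.GZIP` to `producer.send`). As an illustration of the round trip — not the repository's actual code — here is a stdlib-only sketch using Node's built-in `zlib`:

```typescript
import { gzipSync, gunzipSync } from 'zlib';

// Compress a JSON payload the way a GZIP-enabled producer would
// before handing it to the broker.
export function compressMessage(payload: object): Buffer {
  return gzipSync(Buffer.from(JSON.stringify(payload)));
}

// Decompress and parse it back, the way a consumer would on receipt.
export function decompressMessage(compressed: Buffer): object {
  return JSON.parse(gunzipSync(compressed).toString('utf8'));
}
```

Note that for very small payloads the gzipped form can be larger than the original; compression pays off on batches and larger messages.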
Before you begin, ensure you have the following installed:
- Node.js (v20 or higher)
- Docker (for running Kafka and Zookeeper)
- Clone the repository:

```bash
git clone https://github.com/williamkoller/kafka-stream-processor.git
cd kafka-stream-processor
```

- Install the dependencies:

```bash
npm install
```
To run the application and send messages to the Kafka topic:
- Ensure Kafka and Zookeeper are running. You can use Docker Compose for convenience:

```bash
docker-compose down -v && docker-compose up -d
```

- Start the application:

```bash
npm run start
```
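The Docker Compose step above assumes the repository ships a `docker-compose.yml` defining Kafka and Zookeeper services. A hypothetical compose file along these lines (the images, versions, and settings here are illustrative, not taken from the repository):

```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - '9092:9092'
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

The advertised listener must match `BOOTSTRAP_SERVER` in your `.env` so the application can reach the broker from the host.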
You can configure the Kafka connection and other settings using environment variables. Create a `.env` file in the root directory with the following content:

```env
APP_NAME=microservice-node
BOOTSTRAP_SERVER=localhost:9092
KAFKA_TOPIC=topic-test
```
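A sketch of how these variables might be read in TypeScript, using the sample values above as fallbacks (the helper name and shape are illustrative, not the repository's actual code):

```typescript
// Minimal typed view of the settings the README describes.
interface KafkaConfig {
  appName: string;
  bootstrapServer: string;
  topic: string;
}

// Read the settings from an environment map (process.env by default),
// falling back to the defaults from the sample .env file.
export function loadKafkaConfig(
  env: Record<string, string | undefined>,
): KafkaConfig {
  return {
    appName: env.APP_NAME ?? 'microservice-node',
    bootstrapServer: env.BOOTSTRAP_SERVER ?? 'localhost:9092',
    topic: env.KAFKA_TOPIC ?? 'topic-test',
  };
}
```

In the application you would call `loadKafkaConfig(process.env)` once at startup (after a loader such as `dotenv` has populated `process.env`) and pass the result to the producer and consumer.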
```
src
├── infra
│   ├── log
│   │   └── index.ts
│   └── streams
│       └── kafka
│           ├── consumer.ts
│           ├── index.ts
│           └── producer.ts
└── main.ts
```
- `infra/log`: Contains the logging setup using a custom logger.
- `infra/streams/kafka`: Contains the Kafka producer and consumer implementations.
- `main.ts`: Entry point for the application, initializing the producer and consumer.
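As an illustration of the graceful-shutdown feature, an entry point like `main.ts` typically registers signal handlers that disconnect the producer and consumer before the process exits. A stdlib-only sketch of that pattern (the hook names are hypothetical; the real hooks would call the Kafka client's disconnect methods):

```typescript
// A shutdown hook is any async cleanup step, e.g. producer.disconnect().
type ShutdownHook = () => Promise<void>;

const shutdownHooks: ShutdownHook[] = [];

// Register a cleanup step to run on shutdown.
export function onShutdown(hook: ShutdownHook): void {
  shutdownHooks.push(hook);
}

// Run every registered hook, logging failures rather than
// crashing mid-shutdown so later hooks still get a chance to run.
export async function shutdown(): Promise<void> {
  for (const hook of shutdownHooks) {
    try {
      await hook();
    } catch (err) {
      console.error('shutdown hook failed', err);
    }
  }
}

// Wire the handler to the usual termination signals so Ctrl+C and
// `docker stop` both trigger a clean disconnect before exiting.
for (const signal of ['SIGINT', 'SIGTERM'] as const) {
  process.once(signal, () => {
    shutdown().then(() => process.exit(0));
  });
}
```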