Pulls local Docker stats as a Redpanda Kafka stream into Deephaven
Redpanda is an open-source Kafka-compatible event streaming platform. This sample app shows how to ingest Docker stats data from Redpanda into Deephaven.
This app runs using Deephaven with Docker. See our Quickstart.
- `docker-compose.yml` - The Docker Compose file for the application. This is the same as the Deephaven docker-compose file, with Redpanda added as described in our Simple Kafka import.
- `kafka-produce.py` - The Python script that pulls data from Docker stats and produces it as a Kafka stream on Redpanda (see the sketch after this list).
- `data/app.d/start.app` - The Deephaven Application Mode app file.
- `data/app.d/tables.py` - The Python script that consumes the Kafka stream into a Deephaven table.
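For orientation, here is a minimal sketch of the producer step, not the repository's actual script: it assumes a broker at `localhost:9092`, the topic name `docker-stats`, and a single `docker stats` snapshot, while `kafka-produce.py` itself may format its messages differently.

```python
import json
import subprocess

from confluent_kafka import Producer

# Assumed broker address; the compose file may expose Redpanda elsewhere.
producer = Producer({"bootstrap.servers": "localhost:9092"})

# Grab one snapshot of `docker stats`, one JSON object per container per line.
raw = subprocess.run(
    ["docker", "stats", "--no-stream", "--format", "{{json .}}"],
    capture_output=True, text=True, check=True,
).stdout

for line in raw.splitlines():
    stats = json.loads(line)
    # Key each message by container name so per-container history stays ordered.
    producer.produce("docker-stats", key=stats["Name"], value=json.dumps(stats))

producer.flush()
```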
This app pulls data from the local Docker containers. The data is placed into a Redpanda Kafka stream.
Once data is collected in Kafka, Deephaven consumes the stream.
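Conceptually, `tables.py` does something like the sketch below. It is written against the current `deephaven.kafka_consumer` Python API (the v0.10.0 API this repository targets names things differently), and the broker address `redpanda:9092` and the JSON column list are assumptions for illustration.

```python
from deephaven import dtypes
from deephaven import kafka_consumer as kc
from deephaven.stream.kafka.consumer import KeyValueSpec, TableType

# Consume the docker-stats topic into a live, append-only Deephaven table.
docker_stats = kc.consume(
    {"bootstrap.servers": "redpanda:9092"},  # assumed in-network broker address
    "docker-stats",
    key_spec=KeyValueSpec.IGNORE,
    value_spec=kc.json_spec(
        [
            # Hypothetical fields mirroring `docker stats` JSON output.
            ("Name", dtypes.string),
            ("MemPerc", dtypes.string),
            ("CPUPerc", dtypes.string),
        ]
    ),
    table_type=TableType.append(),
)
```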
To launch the latest release, clone the repository and run:

```sh
git clone https://github.com/deephaven-examples/redpanda-docker-stats.git
cd redpanda-docker-stats
docker-compose up -d
```
Alternatively, download the release `docker-compose.yml` file:

```sh
mkdir redpanda-docker-stats
cd redpanda-docker-stats
curl https://raw.githubusercontent.com/deephaven-examples/redpanda-docker-stats/main/release/docker-compose.yml -o docker-compose.yml
docker-compose up -d
```
This starts the containers needed for Redpanda and Deephaven.
To start listening to the Kafka topic `docker-stats`, navigate to http://localhost:10000/ide. In the **Panels** menu, you will see a table for `docker-stats` and a figure for `memoryUsage`.
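The `memoryUsage` figure is derived from the consumed table. The sketch below shows one plausible way to build it with the current Deephaven plotting API; it reuses the hypothetical `MemPerc` column from the consumer sketch above and the `KafkaTimestamp` column that `consume` adds by default, and the actual figure in `tables.py` may be built differently.

```python
from deephaven.plot.figure import Figure

# Strip the "%" suffix so memory usage can be plotted numerically
# (MemPerc is a hypothetical column from the consumer sketch above).
mem = docker_stats.update(["MemPct = Double.parseDouble(MemPerc.replace(`%`, ``))"])

memory_usage = (
    Figure()
    .plot_xy(series_name="Memory", t=mem, x="KafkaTimestamp", y="MemPct")
    .show()
)
```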
The `kafka-produce.py` script uses the `confluent_kafka` package, which must be installed on your machine:

```sh
pip install confluent_kafka
```
To produce the Kafka stream, execute the `kafka-produce.py` script in your terminal:

```sh
python3 ./kafka-produce.py
```
The code in this repository is built for Deephaven Community Core v0.10.0. No guarantee of forward or backward compatibility is given.