suankan/streaming-os-metrics

This project implements gathering and sending OS runtime metrics to Kafka and then to PostgreSQL DB.

Description

This project's goals:

  • Gather OS metrics from any OS.
  • Send the metrics to a Kafka topic.
  • Ingest the metrics from the Kafka topic.
  • Store the obtained metrics in a Postgres DB.

Used libs

We use the Confluent Python library confluent_kafka because of its performance: it is based on the native C library librdkafka, and published benchmarks show a large performance advantage over other Kafka clients for Python.

How to execute unit tests

Please see the test code for details on what exactly is tested, how, and why.

You can execute tests using this example:

$ python -m unittest metrics_test.py -v
test_get_methods_return_dict (metrics_test.TestMetrics) ... Testing method get_cpu_freq
Testing method get_cpu_percent
Testing method get_cpu_stats
Testing method get_cpu_times
Testing method get_cpu_times_percent
Testing method get_disk_io_counters
Testing method get_disk_usage
Testing method get_host_info
Testing method get_load_average
Testing method get_net_io_counters
Testing method get_swap_memory
Testing method get_virtual_memory
ok

----------------------------------------------------------------------
Ran 1 test in 0.440s

OK
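The output above shows a single test that loops over every get_* method and checks that each returns a dict. A minimal self-contained sketch of such a test might look like the following; FakeMetrics is a stand-in defined here only so the sketch runs on its own (the real test exercises the Metrics class from metrics.py):

```python
import unittest

class FakeMetrics:
    """Stand-in for metrics.Metrics so this sketch is self-contained."""

    def get_host_info(self):
        return {"hostname": "example-host"}

    def get_load_average(self):
        return {"load1": 0.42, "load5": 0.36, "load15": 0.30}

class TestMetrics(unittest.TestCase):
    def test_get_methods_return_dict(self):
        # Discover every get_* method and check it returns a dict,
        # mirroring the "Testing method get_..." output shown above.
        m = FakeMetrics()
        for name in sorted(dir(m)):
            if name.startswith("get_"):
                print("Testing method", name)
                self.assertIsInstance(getattr(m, name)(), dict)
```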

Gathering OS metrics via class Metrics

Gathering OS runtime metrics for CPU, memory and disks in JSON format is implemented in module metrics.py.
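The shape of such a class can be sketched as below. This is an illustrative sketch only: the get_* method names in the test output mirror the psutil API, so the real metrics.py most likely wraps psutil, while this version sticks to the standard library; the collect() aggregator and the method bodies are assumptions, not the project's actual code.

```python
import json
import os
import platform

class Metrics:
    """Sketch of a metrics gatherer: each get_* method returns one
    metric group as a dict, ready to be serialised to JSON."""

    def get_host_info(self):
        return {"hostname": platform.node(), "system": platform.system()}

    def get_load_average(self):
        # os.getloadavg() is available on Unix-like systems
        one, five, fifteen = os.getloadavg()
        return {"load1": one, "load5": five, "load15": fifteen}

    def collect(self):
        # Aggregate every get_* method into a single JSON document
        return json.dumps({name: getattr(self, name)()
                           for name in dir(self) if name.startswith("get_")})

print(Metrics().collect())
```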

How to execute Kafka Producer

Put metrics.py and producer.py on the server you want to monitor.

Execute the producer.py script, specifying command-line options for the Kafka broker server:port, TLS certificates and the sending interval.

The script will:

  • Take OS metrics using class Metrics
  • Send them to the Kafka topic
python producer.py \
  --broker kafka-39b301ca-kansuan-4650.aivencloud.com:14598 \
  --cacert certs/ca.pem \
  --cert certs/service.cert \
  --certkey certs/service.key \
  --interval 2
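A sketch of how those command-line options might map onto a confluent_kafka producer follows. The ssl.* keys are the standard librdkafka TLS settings that confluent_kafka accepts; the loop structure, function names and the idea of a fixed iteration count are assumptions for illustration, not the project's actual producer.py.

```python
import json
import time

def build_kafka_config(broker, cacert, cert, certkey):
    # Map the CLI options above onto librdkafka/confluent_kafka TLS settings
    return {
        "bootstrap.servers": broker,
        "security.protocol": "ssl",
        "ssl.ca.location": cacert,
        "ssl.certificate.location": cert,
        "ssl.key.location": certkey,
    }

def produce_loop(producer, collect_metrics, topic, interval, iterations):
    # Sample metrics every `interval` seconds and hand them to the producer
    for _ in range(iterations):
        payload = json.dumps(collect_metrics())
        producer.produce(topic, value=payload.encode("utf-8"))
        producer.poll(0)   # serve delivery callbacks
        time.sleep(interval)
    producer.flush()       # block until all queued messages are delivered
```

With confluent_kafka installed this would be wired up roughly as producer = Producer(build_kafka_config(...)) followed by produce_loop(producer, ..., args.interval, ...); the topic name is whatever the project configures.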

How to execute Kafka Consumer

Execute the script, specifying the Kafka broker server:port, the Postgres connection string, TLS certificates and the polling interval.

The script will start polling Kafka for new messages and will insert the obtained messages into the Postgres DB.

python consumer.py \
  --dsn 'postgres://USER:[email protected]:14596/defaultdb?sslmode=require' \
  --broker kafka-39b301ca-kansuan-4650.aivencloud.com:14598 \
  --cacert certs/ca.pem \
  --cert certs/service.cert \
  --certkey certs/service.key \
  --interval 2
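The consume-and-insert path can be sketched as below. Note this is a guess at the shape of consumer.py: the table name os_metrics, its columns, the consumer group id and the message layout are all assumptions; confluent_kafka's Consumer additionally requires a group.id on top of the TLS settings shown for the producer.

```python
import json

# Hypothetical target table -- the real schema is defined by the project
INSERT_SQL = "INSERT INTO os_metrics (host, payload) VALUES (%s, %s)"

def build_consumer_config(broker, cacert, cert, certkey):
    return {
        "bootstrap.servers": broker,
        "group.id": "os-metrics-consumer",  # assumed group name
        "auto.offset.reset": "earliest",
        "security.protocol": "ssl",
        "ssl.ca.location": cacert,
        "ssl.certificate.location": cert,
        "ssl.key.location": certkey,
    }

def message_to_row(raw_value):
    # Turn one Kafka message payload (JSON bytes) into INSERT parameters
    data = json.loads(raw_value)
    host = data.get("get_host_info", {}).get("hostname", "unknown")
    return host, json.dumps(data)
```

In the real script, a loop would call Consumer(...).poll(timeout) and, for each message, execute INSERT_SQL with message_to_row(msg.value()) on a psycopg2 (or similar) connection opened from the --dsn string.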
