Integrate Apache Flink with Audit Server for RabbitMQ Data Insertion into PostgreSQL
Description:
The goal of this task is to integrate an Apache Flink process as a separate Verticle within the existing Vert.x server. The Flink process should:
Read messages from a RabbitMQ queue.
Insert the processed data into the same PostgreSQL database used by the Vert.x server.
Subtasks:
Set up Apache Flink as a Verticle in the Vert.x Server:
Create a new Flink processing Verticle.
Configure the Flink environment to run within the Vert.x ecosystem.
Ensure the Verticle is deployed along with the main Vert.x server on startup; a minimal sketch follows this list.
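A minimal sketch of such a Verticle, assuming Flink runs as a local (embedded, same-JVM) environment inside the Vert.x process; the class name, job name, and `buildPipeline` hook are illustrative, not an existing API:

```java
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Promise;
import org.apache.flink.core.execution.JobClient;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FlinkProcessingVerticle extends AbstractVerticle {

    private JobClient jobClient;

    @Override
    public void start(Promise<Void> startPromise) {
        // Building and submitting the job can block, so keep it off the event loop.
        vertx.<JobClient>executeBlocking(promise -> {
            try {
                // A local environment runs Flink embedded in this JVM, which is
                // what lets the job live inside the Vert.x process.
                StreamExecutionEnvironment env =
                        StreamExecutionEnvironment.createLocalEnvironment();
                buildPipeline(env); // RabbitMQ source -> transform -> PostgreSQL sink
                // executeAsync returns a handle immediately instead of blocking
                // until the (unbounded) streaming job terminates.
                promise.complete(env.executeAsync("audit-rabbitmq-to-postgres"));
            } catch (Exception e) {
                promise.fail(e);
            }
        }, res -> {
            if (res.succeeded()) {
                jobClient = res.result();
                startPromise.complete();
            } else {
                startPromise.fail(res.cause());
            }
        });
    }

    @Override
    public void stop(Promise<Void> stopPromise) {
        // Cancel the Flink job when the Verticle is undeployed.
        if (jobClient == null) {
            stopPromise.complete();
            return;
        }
        jobClient.cancel().whenComplete((ack, err) -> {
            if (err == null) stopPromise.complete();
            else stopPromise.fail(err);
        });
    }

    private void buildPipeline(StreamExecutionEnvironment env) {
        // Wired up in the RabbitMQ and PostgreSQL subtasks below.
    }
}
```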
RabbitMQ Connection and Configuration:
Add RabbitMQ connection configuration to the Flink Verticle.
Set up a RabbitMQ consumer in Flink to read data from the specified queue, as sketched below.
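Using Flink's RabbitMQ connector (`flink-connector-rabbitmq`), the consumer side of `buildPipeline` could look like the following. The host, credentials, and the queue name `audit-queue` are placeholder values that would normally come from the Verticle's `config()`:

```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.rabbitmq.RMQSource;
import org.apache.flink.streaming.connectors.rabbitmq.common.RMQConnectionConfig;

public class RabbitSourceFactory {

    static DataStream<String> auditEvents(StreamExecutionEnvironment env) {
        // Placeholder connection settings; in practice these come from config().
        RMQConnectionConfig rmqConfig = new RMQConnectionConfig.Builder()
                .setHost("localhost")
                .setPort(5672)
                .setVirtualHost("/")
                .setUserName("guest")
                .setPassword("guest")
                .build();

        return env
                .addSource(new RMQSource<>(
                        rmqConfig,
                        "audit-queue",             // queue to consume (assumed name)
                        new SimpleStringSchema())) // message bodies as UTF-8 strings
                // The RabbitMQ source is typically run non-parallel to preserve
                // ordering and its delivery guarantees.
                .setParallelism(1);
    }
}
```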
PostgreSQL Integration:
Share the existing PostgreSQL connection pool (e.g., HikariCP) between the Vert.x server and the Flink Verticle.
Insert the data consumed from RabbitMQ into PostgreSQL using this shared connection pool, as in the sink sketch below.
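A sketch of the sink side. The names `DataSourceHolder`, `PostgresAuditSink`, and the `audit_events` table are all illustrative. Because Flink serializes its operators before running them, the HikariCP pool cannot simply be captured as a field; since the job runs embedded in the same JVM as the Vert.x server, a static holder is one straightforward way to hand the existing `DataSource` across:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.sql.DataSource;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

// Hands the HikariCP DataSource created by the Vert.x server to the Flink sink.
// This only works because both run in the same JVM (local Flink environment).
final class DataSourceHolder {
    private static volatile DataSource dataSource;
    static void set(DataSource ds) { dataSource = ds; }
    static DataSource get() { return dataSource; }
    private DataSourceHolder() { }
}

public class PostgresAuditSink extends RichSinkFunction<String> {

    private transient DataSource dataSource;

    @Override
    public void open(Configuration parameters) {
        // Look up the pool that the Vert.x server already created.
        dataSource = DataSourceHolder.get();
    }

    @Override
    public void invoke(String message, Context context) throws Exception {
        // Borrow a pooled connection per record; batching would be the next
        // optimization. Table and column names are assumed.
        try (Connection conn = dataSource.getConnection();
             PreparedStatement stmt = conn.prepareStatement(
                     "INSERT INTO audit_events (payload) VALUES (?)")) {
            stmt.setString(1, message);
            stmt.executeUpdate();
        }
    }
}
```

The wiring in `buildPipeline` would then be `RabbitSourceFactory.auditEvents(env).addSink(new PostgresAuditSink());`.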
Verticle Deployment:
Modify the Vert.x server startup script to deploy both the HTTP server Verticle and the new Flink processing Verticle simultaneously.
Ensure proper handling of lifecycle events such as startup and shutdown (see the deployment sketch below).
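A sketch of the startup wiring, assuming an existing `HttpServerVerticle` and a HikariCP pool built elsewhere from application settings. Closing the `Vertx` instance undeploys all Verticles, which triggers each Verticle's `stop()` for a clean shutdown:

```java
import com.zaxxer.hikari.HikariDataSource;
import io.vertx.core.Vertx;

public class Main {

    public static void main(String[] args) {
        Vertx vertx = Vertx.vertx();

        // The pool the HTTP Verticle already uses; published for the Flink sink.
        HikariDataSource pool = createPool();
        DataSourceHolder.set(pool);

        // Deploy the HTTP server and the Flink Verticle together; a failure in
        // either one aborts startup.
        vertx.deployVerticle(new HttpServerVerticle())
             .compose(id -> vertx.deployVerticle(new FlinkProcessingVerticle()))
             .onSuccess(id -> System.out.println("All verticles deployed"))
             .onFailure(err -> {
                 err.printStackTrace();
                 vertx.close();
             });

        // vertx.close() undeploys every Verticle (invoking stop()) before the
        // JVM exits, giving the Flink job a chance to cancel cleanly.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            vertx.close();
            pool.close();
        }));
    }

    private static HikariDataSource createPool() {
        // Assumed helper: the pool is configured elsewhere from app settings.
        return new HikariDataSource();
    }
}
```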
Error Handling & Logging:
Add robust error handling for RabbitMQ connection failures, PostgreSQL insertion failures, and other runtime faults.
Log errors and significant events so the system is easy to debug; a sketch follows.
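For RabbitMQ connection failures, one common approach is to rely on Flink's restart strategy (e.g. `env.setRestartStrategy(RestartStrategies.fixedDelayRestart(3, Time.seconds(10)))` in `buildPipeline`): when the source fails, the job restarts and reconnects, and unacknowledged messages are redelivered. For PostgreSQL, the sink can retry before failing the task. A variant of the sink above with SLF4J logging; `MAX_ATTEMPTS` and the backoff values are illustrative:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.sql.DataSource;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Variant of PostgresAuditSink that retries transient insert failures.
public class RetryingPostgresAuditSink extends RichSinkFunction<String> {

    private static final Logger LOG =
            LoggerFactory.getLogger(RetryingPostgresAuditSink.class);
    private static final int MAX_ATTEMPTS = 3; // illustrative

    private transient DataSource dataSource;

    @Override
    public void open(Configuration parameters) {
        dataSource = DataSourceHolder.get();
    }

    @Override
    public void invoke(String message, Context context) throws Exception {
        SQLException last = null;
        for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
            try (Connection conn = dataSource.getConnection();
                 PreparedStatement stmt = conn.prepareStatement(
                         "INSERT INTO audit_events (payload) VALUES (?)")) {
                stmt.setString(1, message);
                stmt.executeUpdate();
                return; // success
            } catch (SQLException e) {
                last = e;
                LOG.warn("Insert attempt {}/{} failed: {}",
                        attempt, MAX_ATTEMPTS, e.getMessage());
                Thread.sleep(200L * attempt); // simple linear backoff
            }
        }
        LOG.error("Insert still failing after {} attempts; failing the task",
                MAX_ATTEMPTS, last);
        // Rethrowing fails the Flink task so the configured restart strategy
        // takes over; RabbitMQ redelivers unacknowledged messages after the
        // restart, so nothing is silently dropped.
        throw last;
    }
}
```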
Testing:
Write unit and integration tests to verify that:
Data is correctly consumed from RabbitMQ.
Data is successfully inserted into PostgreSQL.
The Verticle deployment and lifecycle management work as expected (see the test sketch below).
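A minimal lifecycle test using `vertx-junit5`; the exact assertions depend on the project's test setup, and the RabbitMQ/PostgreSQL paths are better covered by integration tests against disposable brokers and databases (e.g. Testcontainers):

```java
import io.vertx.core.Vertx;
import io.vertx.junit5.VertxExtension;
import io.vertx.junit5.VertxTestContext;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;

@ExtendWith(VertxExtension.class)
class FlinkProcessingVerticleTest {

    @Test
    void deploysAndUndeploysCleanly(Vertx vertx, VertxTestContext ctx) {
        // Verifies the Verticle lifecycle: start() submits the Flink job and
        // stop() cancels it on undeploy.
        vertx.deployVerticle(new FlinkProcessingVerticle())
             .compose(vertx::undeploy)
             .onComplete(ctx.succeedingThenComplete());
    }
}
```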