
PubNub Kafka Connector

This codebase includes a PubNub Sink Connector. Records from a Kafka topic are copied to a PubNub channel; the Kafka topic name is used as the PubNub channel name.

✅ Requirements

⬆️ Starting the local environment

With the connector properly built, you need a local environment to test it. This project includes a Docker Compose file that spins up container instances for Apache Kafka and Kafka Connect. Additionally, a sample producer feed starts on the "pubnub" topic, emitting a sample message every few seconds.

Start the containers using Docker Compose.

docker compose up

Wait until the containers kafka and connect are started and healthy.
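Readiness can also be confirmed from the command line; a minimal sketch, assuming Connect's REST port 8083 is mapped to localhost as in this compose setup:

```shell
# Poll the Kafka Connect REST root endpoint; a "not ready" reply means the
# worker is not accepting requests yet. localhost:8083 is assumed from the
# compose file's port mapping.
status=$(curl -s --max-time 5 http://localhost:8083/ || echo "not ready")
echo "$status"
```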

⏯ Deploying the Sink Connector

Once the kafka and connect containers are started and healthy, deploy the connector. It copies data from the configured Kafka topic to the PubNub channel with the same name.

  • Deploy the connector.
curl -X POST \
    -d @examples/pubnub-sink-connector.json \
    -H "Content-Type:application/json" \
    http://localhost:8083/connectors
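The deployment can be verified through Kafka Connect's standard REST API; the connector name below matches the "name" field in the example JSON:

```shell
# GET /connectors lists deployed connectors; GET /connectors/<name>/status
# reports connector and task state (RUNNING, FAILED, ...). The fallback
# message covers the case where the connect container is not up yet.
deployed=$(curl -s http://localhost:8083/connectors || echo "connect not reachable")
echo "$deployed"
curl -s http://localhost:8083/connectors/pubnub-sink-connector/status || true
```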

You can confirm that messages are being delivered successfully using the PubNub Web Console.
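Another way to smoke-test the pipeline is to publish a record to the sunk topic yourself; a sketch, assuming the compose service is named kafka and its image ships the standard Kafka console tools:

```shell
# Send a single JSON record to the "pubnub" topic; if the sink is healthy,
# the same payload should arrive on the PubNub channel named "pubnub".
# The fallback message covers the case where the container is not running.
result=$(echo '{"text":"hello from kafka"}' \
  | docker compose exec -T kafka \
      kafka-console-producer --bootstrap-server localhost:9092 --topic pubnub 2>&1 \
  || echo "kafka container not running")
echo "$result"
```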

📝 Modify Settings

Edit the configuration file ./examples/pubnub-sink-connector.json. Update it with your publish_key, subscribe_key and secret_key, and set topics to the topics you want to sink to PubNub. Alternatively, set topics.regex to match topic names by pattern (Kafka Connect accepts either topics or topics.regex, not both).

{
    "name": "pubnub-sink-connector",
    "config": {
        "topics":"pubnub,pubnub1,pubnub2",
        "topics.regex":"",
        "pubnub.publish_key": "demo",
        "pubnub.subscribe_key": "demo",
        "pubnub.secret_key": "demo",
        "connector.class": "com.pubnub.kafka.connect.PubNubKafkaSinkConnector",
        "tasks.max": "3",
        "value.deserializer":"custom.class.serialization.JsonDeserializer",
        "value.serializer":"custom.class.serialization.JsonSerializer"
    }
}
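Since Kafka Connect rejects a sink config that sets both topics and topics.regex, a quick pre-flight check can save a failed POST. A sketch, assuming python3 is on PATH; it falls back to a throwaway sample config when the example file is absent:

```shell
# Verify that exactly one of "topics" / "topics.regex" is non-empty in the
# connector config before deploying it.
config=examples/pubnub-sink-connector.json
if [ ! -f "$config" ]; then
  # Hypothetical stand-in config, used only when the example file is missing.
  config=$(mktemp)
  printf '%s' '{"name":"pubnub-sink-connector","config":{"topics":"pubnub","topics.regex":""}}' > "$config"
fi
check=$(python3 - "$config" <<'PY'
import json, sys
cfg = json.load(open(sys.argv[1]))["config"]
if bool(cfg.get("topics")) != bool(cfg.get("topics.regex")):
    print("config ok")
else:
    print("error: set exactly one of topics / topics.regex")
PY
)
echo "$check"
```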

⏹ Undeploy the connector

  • Use the following command to undeploy the connector from Kafka Connect:
curl -X DELETE \
    http://localhost:8083/connectors/pubnub-sink-connector

⬇️ Stopping the local environment

  • Stop the containers using Docker Compose.
docker compose down

⚙️ Building the connector locally

Build the Kafka Connect connector locally

mvn clean package

💡 A file named target/pubnub-kafka-connector-1.x.jar will be created.
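To run the locally built jar on your own Kafka Connect worker, it must land on the worker's plugin path. A sketch; the directory below is an assumption, so substitute whatever your worker's plugin.path setting points at, then restart the worker:

```shell
# Copy the built connector jar into a Kafka Connect plugin directory.
# PLUGIN_DIR is a hypothetical default -- override it to match plugin.path.
PLUGIN_DIR=${PLUGIN_DIR:-/usr/local/share/kafka/plugins}
mkdir -p "$PLUGIN_DIR" 2>/dev/null || PLUGIN_DIR=$(mktemp -d)
if cp target/pubnub-kafka-connector-*.jar "$PLUGIN_DIR"/ 2>/dev/null; then
  installed="installed to $PLUGIN_DIR"
else
  installed="jar not found; run mvn clean package first"
fi
echo "$installed"
```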