Agent Check: Confluent Platform

Overview

This check monitors Confluent Platform and Kafka components through the Datadog Agent.

This integration collects JMX metrics for the following components:

  • Broker
  • Connect
  • Replicator
  • Schema Registry
  • ksqlDB Server
  • Streams
  • REST Proxy

Setup

Installation

The Confluent Platform check is included in the Datadog Agent package. No additional installation is needed on your Confluent Platform component server.

Note: This check collects metrics with JMX. A JVM is required on each node so the Agent can run JMXFetch. It is recommended to use an Oracle-provided JVM.

Configuration

  1. Edit the confluent_platform.d/conf.yaml file, in the conf.d/ folder at the root of your Agent's configuration directory, to start collecting your Confluent Platform performance data. See the sample confluent_platform.d/conf.yaml for all available configuration options.

    For each component, create a separate instance to collect its JMX metrics. The default metrics collected are listed in the metrics.yaml file. For example (a sketch of collecting additional JMX attributes follows this list):

    instances:
      - host: localhost
        port: 8686
        name: broker_instance
        user: username
        password: password
      - host: localhost
        port: 8687
        name: schema_registry_instance
      - host: localhost
        port: 8688
        name: rest_proxy_instance
  2. Restart the Agent.
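
Each instance collects the default metrics listed in metrics.yaml. If you need additional JMX attributes, an instance can carry a conf block using the generic Datadog JMX include syntax. The following is a minimal sketch only, assuming that standard JMXFetch syntax applies here; the bean, attribute, and metric alias are illustrative placeholders, not part of this integration's defaults:

    instances:
      - host: localhost
        port: 8686
        name: broker_instance
        conf:
          # Illustrative include filter: collect one extra attribute from a
          # Kafka broker MBean. Adjust the domain, bean, attribute, and alias
          # (hypothetical here) to the MBeans your component actually exposes.
          - include:
              domain: kafka.server
              bean: kafka.server:type=BrokerTopicMetrics,name=BytesInPerSec
              attribute:
                Count:
                  alias: confluent.kafka.server.bytes_in.count
                  metric_type: gauge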

Log collection

{{< site-region region="us3" >}} Log collection is not supported for this site. {{< /site-region >}}

Available for Agent versions >6.0

  1. Collecting logs is disabled by default in the Datadog Agent. Enable it in your datadog.yaml file:

    logs_enabled: true
  2. Add this configuration block to your confluent_platform.d/conf.yaml file to start collecting your Confluent Platform component logs:

      logs:
        - type: file
          path: <CONFLUENT_COMPONENT_PATH>/logs/*.log
          source: confluent_platform
          service: <SERVICE_NAME>
          log_processing_rules:
            - type: multi_line
              name: new_log_start_with_date
              pattern: \[\d{4}\-\d{2}\-\d{2}

    Change the path and service parameter values and configure them for your environment. See the sample confluent_platform.d/conf.yaml for all available configuration options.

  3. Restart the Agent.

Metric collection

For containerized environments, see the Autodiscovery with JMX guide.
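
As a minimal sketch of that approach, the Kubernetes pod annotations below configure the check through Autodiscovery, assuming the is_jmx init_config flag and the %%host%% template variable described in that guide; the pod name, container name, image, and port are placeholders for your deployment:

    apiVersion: v1
    kind: Pod
    metadata:
      name: kafka-broker                    # placeholder pod name
      annotations:
        ad.datadoghq.com/broker.check_names: '["confluent_platform"]'
        ad.datadoghq.com/broker.init_configs: '[{"is_jmx": true}]'
        ad.datadoghq.com/broker.instances: '[{"host": "%%host%%", "port": "8686"}]'
    spec:
      containers:
        - name: broker                      # must match the annotation prefix
          image: confluentinc/cp-kafka      # placeholder image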

Validation

Run the Agent's status subcommand and look for confluent_platform under the JMXFetch section.

    ========
    JMXFetch
    ========

      Initialized checks
      ==================
        confluent_platform
          instance_name : confluent_platform-localhost-31006
          message :
          metric_count : 26
          service_check_count : 0
          status : OK

Data Collected

Metrics

See metadata.csv for a list of metrics provided by this check.

Events

The Confluent Platform check does not include any events.

Service Checks

See service_checks.json for a list of service checks provided by this integration.

Troubleshooting

Need help? Contact Datadog support.