
HAProxy Exporter for Prometheus

This is a simple server that scrapes HAProxy stats and exports them via HTTP for Prometheus consumption.

This exporter is retired

In all supported versions of HAProxy, the official source includes a Prometheus exporter module that can be built into the binary with a single flag at build time and offers a native Prometheus endpoint. For more information, see the Alternatives section below.

Please transition to using the built-in support as soon as possible.

Getting Started

To run it:

./haproxy_exporter [flags]

Help on flags:

./haproxy_exporter --help
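
By default the exporter serves its metrics on port 9101 (the same port mapped in the Docker example below). Assuming that default listen address, a quick way to verify it is working:

curl http://localhost:9101/metrics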

For more information, check the source code documentation. All of the core developers are reachable via the Prometheus Developers mailing list.

Usage

HTTP stats URL

Specify a custom URL for the HAProxy stats endpoint using the --haproxy.scrape-uri flag. For example, if you have set stats uri /baz in your HAProxy configuration,

haproxy_exporter --haproxy.scrape-uri="http://localhost:5000/baz?stats;csv"

Or to scrape a remote host:

haproxy_exporter --haproxy.scrape-uri="http://haproxy.example.com/haproxy?stats;csv"

Note that the ;csv suffix is mandatory (and needs to be quoted, since the semicolon would otherwise be interpreted by the shell).

If your stats port is protected by basic auth, add the credentials to the scrape URL:

haproxy_exporter --haproxy.scrape-uri="http://user:[email protected]/haproxy?stats;csv"

Alternatively, provide the password through a file, so that it does not appear in the process table or in the output of the /debug/pprof/cmdline profiling service:

echo '--haproxy.scrape-uri=http://user:[email protected]/haproxy?stats;csv' > args
haproxy_exporter @args

You can also scrape HTTPS URLs. Certificate validation is enabled by default, but you can disable it using the --no-haproxy.ssl-verify flag:

haproxy_exporter --no-haproxy.ssl-verify --haproxy.scrape-uri="https://haproxy.example.com/haproxy?stats;csv"

If scraping a remote HAProxy must be done via an HTTP proxy, you can enable reading of the standard $http_proxy / $https_proxy / $no_proxy environment variables by using the --http.proxy-from-env flag (these variables will be ignored otherwise):

export HTTP_PROXY="http://proxy:3128"
haproxy_exporter --http.proxy-from-env --haproxy.scrape-uri="http://haproxy.example.com/haproxy?stats;csv"

Unix Sockets

As an alternative to scraping localhost over HTTP, a stats socket can be used. Enable the stats socket in HAProxy, for example:

stats socket /run/haproxy/admin.sock mode 660 level admin

The scrape URL uses the 'unix:' scheme:

haproxy_exporter --haproxy.scrape-uri=unix:/run/haproxy/admin.sock
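
If the exporter has trouble reading from the socket, it can help to query the socket directly first. A quick manual check, assuming socat is installed and the socket path from the example above:

echo "show stat" | socat stdio /run/haproxy/admin.sock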

Docker


To run the HAProxy exporter as a Docker container, run:

docker run -p 9101:9101 quay.io/prometheus/haproxy-exporter:latest --haproxy.scrape-uri="http://user:[email protected]/haproxy?stats;csv"
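
However you run the exporter, Prometheus needs a scrape job pointing at it. A minimal sketch of a scrape configuration, assuming the exporter is reachable at localhost:9101:

scrape_configs:
  - job_name: haproxy
    static_configs:
      - targets: ['localhost:9101']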

Development


Building

make build

Testing


make test

TLS and basic authentication

The HAProxy Exporter supports TLS and basic authentication.

To use TLS and/or basic authentication, you need to pass a configuration file using the --web.config.file parameter. The format of the file is described in the exporter-toolkit repository.
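
As a rough sketch of such a file, assuming a certificate/key pair at the given paths and a bcrypt-hashed password for a hypothetical user admin (see the exporter-toolkit documentation for all available options):

tls_server_config:
  cert_file: server.crt
  key_file: server.key
basic_auth_users:
  admin: '$2y$10$REPLACE_WITH_BCRYPT_HASH'

The exporter is then started with --web.config.file pointing at that file, for example --web.config.file=web-config.yml.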

License

Apache License 2.0, see LICENSE.

Alternatives

Official Prometheus exporter

As of version 2.0.0, HAProxy includes a Prometheus exporter module that can be built into the binary at build time. For HAProxy 2.4 and higher, pass the USE_PROMEX flag to make:

make TARGET=linux-glibc USE_PROMEX=1

Pre-built versions, including the Docker image, typically have this enabled already.
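
Whether you build HAProxy yourself or use a pre-built package, one way to check that the module is present (assuming it registers itself as the prometheus-exporter service, as recent versions do) is to inspect the verbose version output:

haproxy -vv | grep -i prometheus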

Once built, you can enable and configure the Prometheus endpoint from your haproxy.cfg file as a typical frontend:

frontend stats
    bind *:8404
    http-request use-service prometheus-exporter if { path /metrics }
    stats enable
    stats uri /stats
    stats refresh 10s
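
With a frontend like the one above, the native endpoint can be checked with, for example:

curl -s http://localhost:8404/metrics | head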

For more information, see this official blog post.