
Improve how metrics are captured from the OTel receiver #4467

Open
yvrhdn opened this issue Dec 18, 2024 · 2 comments
Labels: help wanted (Extra attention is needed), stale (Used for stale issues / PRs)

Comments

@yvrhdn (Member) commented Dec 18, 2024

Is your feature request related to a problem? Please describe.

The distributor embeds the OTel receiver to receive traces. Currently we export only a hand-picked subset of its metrics and convert them to Prometheus metrics. As a result, not all metrics are exposed and we might miss information.

Describe the solution you'd like

Export all metrics emitted by the receiver code and push them into our Prometheus registry.

Describe alternatives you've considered

We could do nothing and keep the current situation.

Additional context

When we create the receiver we pass these TelemetrySettings:

params := receiver.CreateSettings{
	ID: component.NewIDWithName(nopType, fmt.Sprintf("%s_receiver", componentID.Type().String())),
	TelemetrySettings: component.TelemetrySettings{
		Logger:         zapLogger,
		TracerProvider: traceProvider,
		MeterProvider:  meterProvider,
		ReportStatus:   func(*component.StatusEvent) {},
	},
}
receiver, err := factoryBase.CreateTracesReceiver(ctx, params, cfg, middleware.Wrap(shim))

The custom MeterProvider is implemented here: https://github.com/grafana/tempo/blob/main/modules/distributor/receiver/metrics_provider.go

We could instead create a MeterProvider that exports to our prometheus registry. The code would look something like this:

// create an OTel metrics -> Prometheus exporter
exporterOpts := []otelexporterprometheus.Option{
	otelexporterprometheus.WithRegisterer(reg), // this is our main prometheus.Registry
	// optional, depends on whether you want otel_scope_info and otel_target_info metrics or not
	otelexporterprometheus.WithoutScopeInfo(),
	otelexporterprometheus.WithoutTargetInfo(),
}
exporter, err := otelexporterprometheus.New(exporterOpts...)
// handle err

// create a MeterProvider with the exporter as reader
meterProviderOpts := []sdkmetric.Option{
	sdkmetric.WithReader(exporter),
}
meterProvider := sdkmetric.NewMeterProvider(meterProviderOpts...)
Then you can pass this meterProvider to TelemetrySettings.
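Putting the pieces together, a sketch of the wiring (assuming the names `reg`, `zapLogger`, `traceProvider`, `nopType`, and `componentID` from the snippets above) could look roughly like:

```go
// Sketch: pass the Prometheus-backed MeterProvider into the receiver
// settings in place of the custom shim provider.
params := receiver.CreateSettings{
	ID: component.NewIDWithName(nopType, fmt.Sprintf("%s_receiver", componentID.Type().String())),
	TelemetrySettings: component.TelemetrySettings{
		Logger:         zapLogger,
		TracerProvider: traceProvider,
		MeterProvider:  meterProvider, // the sdkmetric.MeterProvider created above
		ReportStatus:   func(*component.StatusEvent) {},
	},
}
```

One thing to keep in mind: `sdkmetric.MeterProvider` has a `Shutdown(ctx)` method that should be called when the distributor stops, so the reader is flushed and unregistered cleanly.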

@joe-elliott joe-elliott added the help wanted Extra attention is needed label Dec 19, 2024
@yvrhdn (Member, Author) commented Dec 19, 2024

I took a quick stab at this: using the exporter changes the metric names. If we want to keep the same names, I think we will need to set up a view to rename the metrics. I'm not very familiar with how that works, to be honest.
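For reference, the OTel Go SDK lets you rename instruments with a view passed to the MeterProvider. A minimal sketch (the instrument names here are hypothetical placeholders, not the actual names Tempo uses; `exporter` is the Prometheus exporter from the earlier snippet):

```go
import (
	sdkmetric "go.opentelemetry.io/otel/sdk/metric"
)

// A view matches instruments by criteria and rewrites how they are
// exported. Wildcard names like "receiver_*" are also supported,
// though wildcards cannot be combined with renaming a single Stream.
view := sdkmetric.NewView(
	sdkmetric.Instrument{Name: "receiver_accepted_spans"},        // hypothetical OTel-side name
	sdkmetric.Stream{Name: "tempo_receiver_accepted_spans_total"}, // hypothetical legacy name
)

meterProvider := sdkmetric.NewMeterProvider(
	sdkmetric.WithReader(exporter),
	sdkmetric.WithView(view),
)
```

Each metric that needs its legacy name preserved would get its own view; metrics without a matching view keep the exporter-generated name.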

As a first step I set up a test to check the metrics are being emitted correctly: #4477

(I probably won't look further into this unless it prevents us from upgrading OTel)

@github-actions (bot) commented

This issue has been automatically marked as stale because it has not had any activity in the past 60 days.
The next time this stale check runs, the stale label will be removed if there is new activity. The issue will be closed after 15 days if there is no new activity.
Please apply the keepalive label to exempt this issue.

@github-actions github-actions bot added the stale Used for stale issues / PRs label Feb 18, 2025