
JMX: Tomcat sessions is always exported as 0 #1360

Closed · sky333999 opened this issue Jul 1, 2024 · 7 comments
Labels: component: jmx-metrics, needs author feedback, Stale

@sky333999 commented Jul 1, 2024

Component(s)

jmx-metrics

What happened?

Description

When using Tomcat as the target system for jmx-metrics, tomcat.sessions is always emitted as 0, despite the MBeans showing a non-zero value for the activeSessions attribute.

Steps to Reproduce

  • Spin up a Tomcat server and install a sample web app that uses sessions.
  • Create sessions and confirm via the Tomcat manager that the session count is non-zero.
[Screenshot: the Tomcat manager application list; in this example, **shoppingcart** is a sample app and its session count is 1.]
  • Use a tool such as jconsole or jmxterm to query the MBeans and confirm activeSessions is non-zero.
$>get -b Catalina:context=/shoppingcart,host=localhost,type=Manager activeSessions
#mbean = Catalina:context=/shoppingcart,host=localhost,type=Manager:
activeSessions = 1;

The above is the jmxterm output for querying the MBean; as can be seen, the value matches the expectation. (A programmatic version of the same check is sketched after this list.)

  • Observe the value of tomcat.sessions coming from jmx-metrics. For example, set up an OTel Collector pipeline with a JMX receiver and a debug exporter.
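
For reference, the jmxterm check above can also be done programmatically. Below is a minimal Groovy sketch using the standard javax.management client API; the JMX service URL and port are assumptions and must match your Tomcat JMX setup.

import javax.management.ObjectName
import javax.management.remote.JMXConnectorFactory
import javax.management.remote.JMXServiceURL

// Assumed JMX endpoint; adjust host/port to your Tomcat configuration.
def url = new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9010/jmxrmi")
def connector = JMXConnectorFactory.connect(url)
try {
    def connection = connector.getMBeanServerConnection()
    def name = new ObjectName("Catalina:context=/shoppingcart,host=localhost,type=Manager")
    // Prints a non-zero value while sessions exist, matching the jmxterm output above.
    println connection.getAttribute(name, "activeSessions")
} finally {
    connector.close()
}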

Expected Result

I expect the value of tomcat.sessions, as per the definition here, to also be non-zero.

Actual Result

The value of tomcat.sessions is always 0.

Component version

v1.35.0

Log output

No response

Additional context

No response

@jefchien (Contributor) commented Jul 1, 2024

Looked into the code a bit and I think I figured it out. It's because the MBean for the Tomcat manager

def beantomcatmanager = otel.mbean("Catalina:type=Manager,host=localhost,context=*")
otel.instrument(beantomcatmanager, "tomcat.sessions", "The number of active sessions.", "sessions", "activeSessions", otel.&doubleValueCallback)

and many others are being defined as a single bean

MBeanHelper mbean(String objNameStr) {
  def mbeanHelper = new MBeanHelper(jmxClient, objNameStr, true)
  mbeanHelper.fetch()
  return mbeanHelper
}

when in reality the object name includes a wildcard, and after performing a query against the MBean server, Catalina:type=Manager,host=localhost,context=* resolves into multiple MBeans:

  • Catalina:context=/shoppingcart,host=localhost,type=Manager
  • Catalina:context=/,host=localhost,type=Manager
  • etc.

What ends up happening is that the gatherer fetches all the MBeans that match the pattern and then, because it thinks there's a single MBean, only keeps the first one.

@PackageScope List<GroovyMBean> getMBeans() {
  if (mbeans == null || mbeans.size() == 0) {
    logger.warning("No active MBeans. Be sure to fetch() before updating any applicable instruments.")
    return []
  }
  return isSingle ? [mbeans[0]] : mbeans
}

So, the metrics don't represent an aggregated value of all the beans that match the pattern. It's just the first one, which happens to be 0.
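
To illustrate, here is a small Groovy sketch (the JMX endpoint is an assumption) showing that the wildcard pattern matches one MBean per deployed webapp:

import javax.management.ObjectName
import javax.management.remote.JMXConnectorFactory
import javax.management.remote.JMXServiceURL

// Assumed endpoint; adjust to your Tomcat JMX configuration.
def connector = JMXConnectorFactory.connect(
    new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9010/jmxrmi"))
// queryNames expands the wildcard into every matching ObjectName, e.g.
// Catalina:context=/shoppingcart,host=localhost,type=Manager and
// Catalina:context=/,host=localhost,type=Manager.
def names = connector.getMBeanServerConnection()
    .queryNames(new ObjectName("Catalina:type=Manager,host=localhost,context=*"), null)
names.each { println it }
connector.close()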

@jefchien (Contributor) commented Jul 2, 2024

It looks like this is a simple case of misconfiguration. The JVM metrics are defined correctly using otel.mbeans:

def garbageCollector = otel.mbeans("java.lang:type=GarbageCollector,*")

The Tomcat metrics just need to be updated to use otel.mbeans instead of otel.mbean.
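
A sketch of what the corrected Tomcat definition would look like, with otel.mbean swapped for otel.mbeans (same object name pattern and instrument as above):

def beantomcatmanager = otel.mbeans("Catalina:type=Manager,host=localhost,context=*")
otel.instrument(beantomcatmanager, "tomcat.sessions", "The number of active sessions.", "sessions", "activeSessions", otel.&doubleValueCallback)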

@jefchien (Contributor) commented Jul 2, 2024

Not quite so simple. The gatherer needs to be updated to support aggregating metrics with the same attributes. After changing the metrics to otel.mbeans, it calls record on each of the individual data points, which results in all but the first being dropped:

{"caller":"subprocess/subprocess.go:280","msg":"WARNING: Instrument tomcat.sessions has recorded multiple values for the same attributes: {}","kind":"receiver","name":"jmx","data_type":"metrics"}

@breedx-splk (Contributor) commented

Just to circle back on this: there's currently an effort underway to deprecate/remove the jmx-metrics module here in favor of a new jmx-scraper. That new module will use YAML configs instead of Groovy, and will share configs with the JMX instrumentation in the agent.

In the first pass, it does look like we are continuing to support the opt-in aggregation flag (as implemented in #1366), but long term we want to report values from all the MBeans with some mbean name as an attribute/dimension so that it can be aggregated or ignored as appropriate in backends.

@SylvainJuge (Contributor) commented

Hi,

With the jmx-scraper, the metric definitions will be in YAML. In this particular case, adding an extra metric attribute for the Tomcat context solves the problem without having to compute a sum.

In the current PR proposal #1485 (which I hope will be merged soon), the metric definition for the Catalina:type=Manager,host=localhost,context=* MBean includes the context as a metric attribute, as you can see here:

# minor divergence from tomcat.groovy to capture metric for all deployed webapps
context: param(context)

In the short term, once #1485 is merged and released, you should be able to test the jmx-scraper component, as the metrics should be the same as with the current jmx-metrics.

@breedx-splk (Contributor) commented

Yes, I'm happy with where this has landed. The other module has the opt-in aggregation flag, and the (new) jmx scraper has the more correct dimension. Any reason to leave this open then?

@breedx-splk added the needs author feedback label on Oct 21, 2024
@github-actions (bot) commented

This has been automatically marked as stale because it has been marked as needing author feedback and has not had any activity for 7 days. It will be closed if no further activity occurs within 7 days of this comment.

@github-actions bot added the Stale label on Oct 28, 2024
@github-actions bot closed this as not planned on Nov 4, 2024