
Supporting JSON as-is encoding in OTLP #524

Open
scheler opened this issue Jan 9, 2024 · 2 comments

Comments

@scheler
Contributor

scheler commented Jan 9, 2024

This topic came up in the discussion here, so I want to explore it further in a separate issue. It is noted at https://github.com/tigrannajaryan/exp-jsevent/blob/main/README.md that JSON as-is encoding is at least twice as fast as OTLP/JSON, so it deserves consideration.

OTLP/JSON was introduced primarily to support client-side scenarios (browsers and mobile), reducing package size by avoiding the protobuf library - ref. As such, OTLP/JSON implementations do not exist in many language SDKs, only in a few - JS and PHP (not sure why).

While this was helpful for reducing browser SDK size, there is still scope for improvement if we avoid converting objects into OTLP/JSON and instead use plain JSON, e.g., via JSON.stringify. This helps with both SDK size reduction and exporter performance. Note that I am not talking about the size of the data on the wire, as compression of either form of JSON encoding would result in a similar size.

If we are able to support JSON as-is in OTLP as an additional encoding, changes are needed in only a few languages and the collector.

One other large downside is that this puts an additional burden on vendors implementing receivers; however, the tradeoff is the performance improvement in the browser SDK.

@dyladan
Member

dyladan commented Jan 16, 2024

As the JS maintainer I'm not sure about this proposal. The internal SDK representation wasn't meant to be a wire representation when it was created, and we would not be able to make certain stability guarantees that I think would be required.

In JS, we're aware that there are challenges related to the SDK representation not matching the OTLP representation of objects. JSON receivers and the protobuf library we use expect objects to be in a very specific format for encoding. Because of this, we have a transformation library which needs to rewrite objects into the expected format before creation of an OTLP message. This is done for all OTLP messages regardless of the transport used, and is the primary source of serialization-related overhead in JS.

Right now in JS we're working on a 2.0 for the SDK which we hope to land in the spring. One of the major initiatives is to make the internal SDK representation match the OTLP expected representation, which will allow us to remove this transformation library and the overhead it causes. open-telemetry/opentelemetry-js#4385

@johnbley
Member

When building the first version of our web RUM product, I looked at otlp-json and encountered similar problems - significant penalties in download size, telemetry payload size, and object/CPU churn. I eventually just used zipkin as a stopgap solution. It would be great if any combination of { removing translation layers, a cleaner otlp-in-json encoding spec, code size reduction of the exporter itself } happened. It looks like the work you cited above would help quite a bit, particularly if the completely custom encoder happened (which, as you point out, would be the more technically demanding option).
