
[BUG] Prompts are not displayed correctly #5813

Open
kripper opened this issue Dec 21, 2024 · 1 comment
Labels: enhancement (New feature or request), question (Further information is requested)

Comments

kripper commented Dec 21, 2024

[Screenshot: trace view showing the prompt as an escaped JSON string]
Is it possible to view the prompts as Markdown instead of escaped JSON strings, which are hard on the eyes?

This is a trace of "vertex_ai/gemini-2.0-flash-exp" via the LiteLLM OTEL callback.
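
For context, a minimal sketch of this kind of setup (LiteLLM's generic OTEL callback exporting to Phoenix) looks roughly like the snippet below. The exporter environment variable names and the endpoint are illustrative assumptions; see the LiteLLM OTEL callback docs for the exact settings.

```python
import os
import litellm

# Illustrative only: exact variable names/values per the LiteLLM OTEL callback docs.
os.environ["OTEL_EXPORTER"] = "otlp_http"                        # assumed exporter type
os.environ["OTEL_ENDPOINT"] = "http://localhost:6006/v1/traces"  # assumed Phoenix OTLP endpoint

# Enable LiteLLM's generic OpenTelemetry callback (OpenLLMetry-style span attributes).
litellm.callbacks = ["otel"]

response = litellm.completion(
    model="vertex_ai/gemini-2.0-flash-exp",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```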

@kripper added the enhancement and triage labels Dec 21, 2024
The github-project-automation bot moved this to 📘 Todo in phoenix Dec 21, 2024
The dosubot bot added the question label Dec 21, 2024
@kripper changed the title from "[QUESTION] View prompts markdown" to "[QUESTION] View prompts correctly" Dec 22, 2024
@kripper changed the title from "[QUESTION] View prompts correctly" to "[BUG] Prompts are not displayed correctly" Dec 22, 2024
@mikeldking (Contributor) commented:

Hey @kripper - thanks for your note - it looks like you might be using OpenLLMetry, which follows different LLM semantic conventions. We are participating in the GenAI semantic conventions group, but we currently don't support LiteLLM instrumentation from Traceloop.

If you'd like to try our LiteLLM instrumentation, please check out the OpenInference instrumentation: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-litellm
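
For reference, a minimal setup sketch with that package might look like the following, assuming a locally running Phoenix instance; the project name and endpoint are placeholders, and the entry points are as documented in the linked repo, so treat this as illustrative rather than exact:

```python
# pip install arize-phoenix openinference-instrumentation-litellm litellm
import litellm
from phoenix.otel import register
from openinference.instrumentation.litellm import LiteLLMInstrumentor

# Register a tracer provider that exports to a (locally running) Phoenix instance.
tracer_provider = register(
    project_name="litellm-demo",  # hypothetical project name
    endpoint="http://localhost:6006/v1/traces",
)

# Instrument LiteLLM so calls are traced with OpenInference semantic conventions,
# which Phoenix renders as readable messages rather than escaped JSON strings.
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)

response = litellm.completion(
    model="vertex_ai/gemini-2.0-flash-exp",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```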

Thanks for your feedback. We're hoping we can land on a good set of conventions that all backends support!

@mikeldking removed the triage label Dec 27, 2024