-
Hi, I have created a new proxy middleware in an ASP.NET Core application and I'm trying to connect Langfuse to OpenRouter through it. My question is: why doesn't Langfuse show the response in its output window? Thanks.
-
Hey @nikunjce! 👋 I'm here to help you with bugs, questions, and becoming a contributor. Let's tackle this issue together! To display the output in Langfuse's "output" window, the response needs to adhere to a specific JSON schema. The expected structure is a JSON object with a `steps` array, where each step has an `explanation` and an `output`, plus a `final_answer` string.
Here's an example of the expected response structure:

```json
{
  "steps": [
    {
      "explanation": "The assistant provides the capital of Italy.",
      "output": "The capital of Italy is Rome."
    }
  ],
  "final_answer": "The capital of Italy is Rome."
}
```

You need to transform your API response to match this structure for Langfuse to display it correctly in the "output" window [1].
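The transformation described above can be sketched in a few lines. This is an illustrative example, not a Langfuse API: the function name `to_structured_output` is hypothetical, and the input shape assumes a standard OpenAI-style chat completion response (which is what OpenRouter returns).

```python
# Hypothetical sketch: reshape an OpenAI-style chat completion response
# into the steps/final_answer structure shown above. The function name
# and the wrapping logic are illustrative, not part of any Langfuse API.

def to_structured_output(completion: dict) -> dict:
    """Wrap the assistant's message content in the expected schema."""
    content = completion["choices"][0]["message"]["content"]
    return {
        "steps": [
            {
                "explanation": "Assistant response from the proxied model.",
                "output": content,
            }
        ],
        "final_answer": content,
    }

# Example OpenAI-style response, trimmed to the relevant fields.
completion = {
    "choices": [
        {"message": {"role": "assistant",
                     "content": "The capital of Italy is Rome."}}
    ]
}
print(to_structured_output(completion)["final_answer"])
```

A proxy middleware could apply a reshaping step like this to the upstream response body before returning it.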
-
Thanks for your reply. Currently, when I add a custom model directly using the OpenRouter URL, it works fine without any issue. But when I add a custom model using my custom middleware URL, the Langfuse UI doesn't show anything in the output window. So I would like to know: do I need to pass something in the request so that OpenRouter returns those parameters in the response, or do I need to modify the OpenRouter response myself and add the parameters you suggested? Thanks.
-
Can you please provide a link to or description of the response format Langfuse supports, or an example of it, so I can create a structured response that Langfuse will accept?
-
Hi @nikunjce, thanks for the ping via other channels that this is unresolved. Generally, Langfuse supports the OpenAI API schema as implemented by the OpenAI API. Under the hood, the LangChain JS OpenAI class is used here. When setting up a new model, Langfuse tests whether it works by requesting a test completion. Afterwards, the model can be used in the UI.
Is this a general problem when using OpenRouter, or is it specific to your setup/implementation?
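Since Langfuse expects the OpenAI API schema, a proxy middleware should return a body shaped like a standard chat completion. A minimal sketch of that shape, assuming field names from the OpenAI chat completions schema (the `id` and `model` values are placeholders, and the function itself is illustrative):

```python
import json
import time
import uuid

# Illustrative sketch of the minimal OpenAI-compatible chat completion
# body a proxy middleware should return so clients like Langfuse can
# parse the assistant output. Field names follow the OpenAI chat
# completions schema; id/model/usage values are placeholders.

def openai_compatible_response(model: str, content: str) -> str:
    body = {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",  # placeholder id
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": content},
                "finish_reason": "stop",
            }
        ],
        "usage": {
            "prompt_tokens": 0,
            "completion_tokens": 0,
            "total_tokens": 0,
        },
    }
    return json.dumps(body)

print(openai_compatible_response("openrouter/example-model", "Hello!"))
```

If the middleware strips or renames any of the `choices[].message` fields on the way through, the client will have nothing to display, so a passthrough that preserves this shape is usually the safest design.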
Thanks for your help. I think there was an issue in my settings or something else. I don't know exactly what the problem was, but everything is working fine now.