Linking chat prompt with trace #5155
-
Hello all,

```ts
const langfusePrompt = await this.langfuse.getPrompt(name, version, { label: 'production', type: 'chat' });
const prompt = ChatPromptTemplate.fromMessages(
  langfusePrompt.getLangchainPrompt().map((m) => [m.role, m.content])
).withConfig({
  metadata: {
    langfusePrompt,
  },
});
return prompt.pipe(llm.withStructuredOutput(answerSchema)).stream(
  {
    currentDate,
    question: input.question,
    context,
  },
  { callbacks: [langfuseHandler] }
);
```

is not linking the prompt with the trace correctly. Only when I add an empty pass-through to the chain:

```ts
const test = async (prompt) => {
  console.log('???', prompt);
  return prompt;
};

return prompt
  .pipe(test) // or new RunnablePassthrough() from langchain
  .pipe(llm.withStructuredOutput(answerSchema))
  .stream(
    {
      currentDate,
      question: input.question,
      context,
    },
    { callbacks: [langfuseHandler] }
  );
```

does it start working, and my trace is correctly linked to the prompt. Could somebody please help me understand why, and what is to blame?
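
For anyone trying to reproduce this, here is a minimal, self-contained sketch of the workaround under stated assumptions: the imports, model choice, prompt name, schema, and input values are all placeholders I've assumed for illustration; only the prompt/metadata/pass-through structure mirrors the snippets above.

```ts
import { ChatOpenAI } from '@langchain/openai';
import { ChatPromptTemplate } from '@langchain/core/prompts';
import { RunnablePassthrough } from '@langchain/core/runnables';
import { Langfuse } from 'langfuse';
import { CallbackHandler } from 'langfuse-langchain';
import { z } from 'zod';

const langfuse = new Langfuse(); // credentials from LANGFUSE_* env vars
const langfuseHandler = new CallbackHandler();

const llm = new ChatOpenAI({ model: 'gpt-4o-mini' }); // placeholder model
const answerSchema = z.object({ answer: z.string() }); // placeholder schema

// Fetch a chat prompt from Langfuse ('my-prompt' is a placeholder name; the
// prompt is assumed to use {currentDate}, {question}, and {context} variables)
// and attach the prompt client as metadata so the handler can link it.
const langfusePrompt = await langfuse.getPrompt('my-prompt', undefined, {
  label: 'production',
  type: 'chat',
});
const prompt = ChatPromptTemplate.fromMessages(
  langfusePrompt
    .getLangchainPrompt()
    .map((m) => [m.role, m.content] as [string, string])
).withConfig({ metadata: { langfusePrompt } });

// The reported workaround: an inert RunnablePassthrough between the prompt
// and the model is what makes the prompt-to-trace link appear.
const stream = await prompt
  .pipe(new RunnablePassthrough())
  .pipe(llm.withStructuredOutput(answerSchema))
  .stream(
    {
      currentDate: new Date().toISOString(),
      question: 'What does the pass-through change?',
      context: '',
    },
    { callbacks: [langfuseHandler] }
  );

for await (const chunk of stream) {
  console.log(chunk);
}
```

Attaching the prompt client under the `langfusePrompt` metadata key is, as far as I can tell, the mechanism the Langfuse Langchain integration documents for prompt linking, so the open question is why the link only materializes once the pass-through is inserted.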
-
I couldn't find specific information about the internal mechanisms of the Langfuse Langchain integration that would explain this behavior. A similar unsolved discussion might be relevant: Cannot connect prompt to trace when using Langchain AzureChatOpenai, last updated on 2024-07-19 [1]. It might provide additional insights or context related to your issue. To continue talking to Dosu, mention @dosu.
-
This is interesting. Can you open a bug issue? This is a case worth investigating and improving.