```ts
let llmOutput = null;

const prompt = getPrompt(autobotParameter?.prePrompt);

const outputParser = new HttpResponseOutputParser();
/**
 * Chat models stream message chunks rather than bytes, so this
 * output parser handles serialization and byte-encoding.
 */
// const chain = RunnableSequence.from([prompt, model, outputParser]);
const chain = prompt.pipe(model).pipe(outputParser);

// Create a context object with all required template variables
const contextData = {
  chat_history: formattedPreviousMessages,
  input: rewrittenIntent?.lc_kwargs.content,
  context: formatContext(getContext.data),
};

const stream = await chain.stream(contextData, {
  callbacks: [
    {
      // lc_serializable: true,
      handleLLMEnd: (output: any) => {
        console.log('🚀 ~ POST ~ output:', output);
        // Get lc_kwargs from output
        llmOutput = output.generations[0][0]?.message;
      },
    },
  ],
});

// Safely set the headers after `llmOutput` is ready
return new StreamingTextResponse(stream);
```
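For context, a sketch of the imports this snippet appears to rely on, assuming LangChain JS and the Vercel AI SDK (exact package paths depend on the installed versions):

```ts
// Assumed imports; adjust the paths to match your langchain / ai versions.
import { HttpResponseOutputParser } from "langchain/output_parsers";
import { StreamingTextResponse } from "ai";
```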
If I use callbacks, the response is usually returned before the callback is hit. Is there another approach I can use?
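One alternative (a minimal sketch, not the library's documented pattern) is to drop the callback and pipe the byte stream through a `TransformStream` that forwards each chunk to the client while also accumulating a copy; its `flush()` hook runs once the model has finished, so the full completion is available before the response stream closes. `persistOutput` below is a hypothetical placeholder for whatever you need to do with the final text (e.g. saving it to a database):

```ts
const chunks: Uint8Array[] = [];
const decoder = new TextDecoder();

const collector = new TransformStream<Uint8Array, Uint8Array>({
  transform(chunk, controller) {
    chunks.push(chunk);        // keep a copy for later
    controller.enqueue(chunk); // still stream it to the client
  },
  async flush() {
    // All chunks have been produced; reassemble the full completion.
    const fullOutput = chunks
      .map((c) => decoder.decode(c, { stream: true }))
      .join('');
    await persistOutput(fullOutput); // hypothetical helper, not a real API
  },
});

const stream = await chain.stream(contextData);
return new StreamingTextResponse(stream.pipeThrough(collector));
```

This keeps the handler returning immediately while the post-processing happens as part of the stream itself, instead of racing against the HTTP response.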