How to use llm outputs in the on_handoff function #567
Comments
@rm-openai can you help with this please?

Yeah you can do this, would it work?

no no

This issue is stale because it has been open for 7 days with no activity.
Hi everyone!

I'm trying to create an agent that extracts load details and detects warnings. After that it should hand off to another agent, so once the details are extracted we need to send a request to another service to visualize them. I want to do this using `on_handoff`. Is there any way to get the LLM output inside this function?