How to use llm outputs in the on_handoff function #567

Open
Salaudev opened this issue Apr 22, 2025 · 5 comments
Labels
question Question about using the SDK stale

Comments

@Salaudev

Hi everyone!
I'm trying to create an agent that extracts load details and detects warnings. After that, it should hand off to another agent.

Once the details are extracted, we need to send a request to another service to visualize them. I want to do this in on_handoff. Is there any way to get the LLM output inside this function?

import os

from agents import Agent, OpenAIChatCompletionsModel, handoff
from agents.extensions.handoff_prompt import prompt_with_handoff_instructions

def update_details() -> str:
    # send request to update details with A structure
    # send request to upsert warnings with B structure
    return details

# Define the agents
extractor_agent = Agent[OrchestratorContext](
    name="Triage Agent",
    instructions=prompt_with_handoff_instructions(
        "You are responsible for extracting the load details ...."
    ),
    model=OpenAIChatCompletionsModel(
        model=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"), openai_client=azure_client
    ),
    handoffs=[handoff(agent=intent_detector_agent, on_handoff=update_details)],
)
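
For context, I run it roughly like this (a sketch; the input string is just a placeholder from my setup):

from agents import Runner

result = await Runner.run(
    extractor_agent,
    input="Load from Dallas to Chicago, pickup Friday",
    context=OrchestratorContext(),
)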
@Salaudev Salaudev added the question Question about using the SDK label Apr 22, 2025
@Salaudev Salaudev changed the title How to use llm outputs in the handoff function How to use llm outputs in the on_handoff function Apr 22, 2025
@Salaudev
Author

@rm-openai, can you help with this, please?

@rm-openai
Collaborator

Yeah, you can do this. Would this work?

class LoadData(BaseModel):
    some_field: str
    ...

async def update_details(ctx: RunContextWrapper, input: LoadData):
    ...

handoff_obj = handoff(agent=intent_detector_agent, on_handoff=update_details, input_type=LoadData)

extractor_agent = Agent(..., handoffs=[handoff_obj])
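
Spelled out a bit more (a minimal sketch, not a definitive implementation; the field names, push_to_visualizer, and the agent instructions are placeholders). When the model calls the generated transfer_to_<agent> tool, the SDK validates the tool-call arguments against LoadData and passes the parsed object to on_handoff:

from pydantic import BaseModel

from agents import Agent, RunContextWrapper, handoff

class LoadData(BaseModel):
    # fields you want the extractor to fill in when it hands off
    origin: str
    destination: str
    warnings: list[str] = []

async def push_to_visualizer(data: LoadData) -> None:
    # placeholder: send the extracted details to your visualization service
    ...

async def update_details(ctx: RunContextWrapper, input: LoadData) -> None:
    # `input` is parsed from the arguments the LLM supplied to the handoff tool call
    await push_to_visualizer(input)

intent_detector_agent = Agent(
    name="Intent Detector",
    instructions="Detect the intent ...",
)

handoff_obj = handoff(
    agent=intent_detector_agent,
    on_handoff=update_details,
    input_type=LoadData,
)

extractor_agent = Agent(
    name="Triage Agent",
    instructions="Extract the load details, then hand off ...",
    handoffs=[handoff_obj],
)

Note the model only fills LoadData with whatever it chooses to pass to the handoff tool call, so prompt the extractor to include the extracted details when it hands off.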

@Salaudev
Author

> Yeah, you can do this. Would this work? [...]

No, no. In the on_handoff function I need to use the LLM's output.

extractor_agent extracts some details and then hands off to the next agent. When the handoff happens, I need to update the context with what was extracted.
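
Roughly what I'm after (a sketch; OrchestratorContext here is my own dataclass, and ctx.context is the object I pass to Runner.run):

from dataclasses import dataclass, field

from agents import RunContextWrapper

@dataclass
class OrchestratorContext:
    load_details: dict = field(default_factory=dict)
    warnings: list[str] = field(default_factory=list)

async def update_details(ctx: RunContextWrapper[OrchestratorContext]) -> None:
    # at this point I want the extractor's LLM output (the extracted details)
    # so I can store it on the shared context before the next agent runs
    ctx.context.load_details = ...  # <- no way to reach the LLM output here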

@rm-openai
Collaborator

@Salaudev got it. It's not possible right now, but I'm working on adding the message history to the context in #572. Hopefully ready soon.

@github-actions

This issue is stale because it has been open for 7 days with no activity.

@github-actions github-actions bot added the stale label Apr 30, 2025