Releases · run-llama/llama_deploy
v0.2.4
v0.2.3
v0.2.1
v0.2.0
v0.2.0 is out now, with the main improvement being the addition of streaming support!
Now, if you have a workflow that writes to the event stream like:
```python
from llama_index.core.workflow import (
    Context,
    Event,
    StartEvent,
    StopEvent,
    Workflow,
    step,
)


class ProgressEvent(Event):
    progress: str


# create a dummy workflow
class MyWorkflow(Workflow):
    @step()
    async def run_step(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # Your workflow logic here
        arg1 = str(ev.get("arg1", ""))
        result = arg1 + "_result"

        # stream events as steps run
        ctx.write_event_to_stream(
            ProgressEvent(progress="I am doing something!")
        )

        return StopEvent(result=result)
```
You can stream the events using the client:
```python
# create a session
session = client.create_session()

# kick off run
task_id = session.run_nowait("streaming_workflow", arg1="hello_world")

# stream events -- this will yield a dict representing each event
for event in session.get_task_result_stream(task_id):
    print(event)

# get final result
result = session.get_task_result(task_id)
print(result)
# prints 'hello_world_result'
```
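Since the stream yields plain dicts, you can filter it for the events you care about. A minimal sketch, assuming the serialized dict carries the event's fields (e.g. a `progress` key matching the `ProgressEvent` defined above — the exact shape is an assumption here):

```python
def collect_progress(events):
    """Collect the 'progress' strings from a stream of event dicts.

    `events` is any iterable of dicts, such as the one returned by
    session.get_task_result_stream(task_id).
    """
    return [ev["progress"] for ev in events if "progress" in ev]


# Stand-in for the real event stream, for illustration only:
stream = [
    {"progress": "I am doing something!"},
    {"unrelated": "this event is skipped"},
]
print(collect_progress(stream))  # ['I am doing something!']
```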
v0.1.3
v0.1.2
v0.1.1
v0.1.0
llama_agents -> llama_deploy
v0.1.0 is here! This is a huge refactor, which also renames llama_agents to llama_deploy!
Now, llama_deploy is the place to go to deploy and scale agentic workflows that you built with llama_index.
We have extensive documentation in the updated readme, and thorough examples, so check it out!
We've also added official docs pages and an API reference for llama_deploy to the llama_index documentation!
v0.0.14
v0.0.13