[Feature Request] Make LocalLauncher support launching perpetual serving #156
Labels
LocalLauncher
Right now, `LocalLauncher` only supports `launch_single()`, which is a one-shot setup, run, teardown process.

I think there's room for a pseudo `launch_server()` that runs everything in a single async process.

I've seen at least two people now trying to run the server launcher inside an already existing FastAPI server, which won't work. Currently you'd need to launch llama-agents separately from your existing FastAPI server and use a client to communicate.