
[Feature Request] Make LocalLauncher support launching perpetual serving #156

Open
logan-markewich opened this issue Jul 24, 2024 · 0 comments
Labels: enhancement (New feature or request), P1

Comments

@logan-markewich (Collaborator)

Right now, LocalLauncher only supports launch_single(), which is a one-shot setup, run, teardown process.

In addition, I think there's room for a pseudo launch_server() that runs everything in a single async process.

I've seen at least two people now trying to run the server launcher inside an already existing FastAPI server, which won't work. Currently you'd need to launch llama-agents separately from your existing FastAPI server and use a client to communicate with it.
