tduval/feature/llama stack client support #530
base: main
Conversation
✅ There are no secrets present in this pull request anymore. If these secrets were true positives and are still valid, we highly recommend you revoke them. 🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
Hey @thaddavis, congrats on your first PR! Good job!! 👏
Force-pushed from bb35885 to ec8445d (Signed-off-by: Teo <[email protected]>)
Codecov Report — Attention: Patch coverage is
Flags with carried forward coverage won't be shown. Click here to find out more.
…s for future reference
…ter live demo to Maintainers team
📥 Pull Request
📘 Description
This is v1 of support for monitoring Llama Stack applications via the LlamaStackClient class.
🧪 Testing
Included a smoke test covering standard completions and streaming completions in tests/core_manual_tests/providers/llama_stack_client_canary.py.
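The canary file itself isn't reproduced in this conversation. As a rough sketch only (the `base_url`, model name, and the exact `chat_completion` signature are assumptions based on early `llama-stack-client` releases, and running it requires a live Llama Stack server), such a smoke test might look like:

```python
# Hypothetical sketch of a LlamaStackClient smoke test instrumented with AgentOps.
# Assumptions: a Llama Stack server listening on localhost:5000 and the early
# llama-stack-client API shape (client.inference.chat_completion).

def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the message format passed to chat_completion."""
    return [{"role": "user", "content": prompt}]


def main() -> None:
    # Imported lazily so the sketch can be read without the packages installed.
    import agentops
    from llama_stack_client import LlamaStackClient

    agentops.init()  # start an AgentOps session so the calls below are recorded

    client = LlamaStackClient(base_url="http://localhost:5000")  # assumed local server

    # 1. Standard (non-streaming) completion
    response = client.inference.chat_completion(
        model="Llama3.1-8B-Instruct",  # assumed model identifier
        messages=build_messages("Say hello in one word."),
    )
    print(response)

    # 2. Streaming completion: iterate over chunks as they arrive
    for chunk in client.inference.chat_completion(
        model="Llama3.1-8B-Instruct",
        messages=build_messages("Count to three."),
        stream=True,
    ):
        print(chunk)

    agentops.end_session("Success")  # close the session, marking the canary as passed


if __name__ == "__main__":
    main()
```

Both call shapes mirror the two cases the description says the canary exercises: one standard completion and one with streaming.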