tduval/feature/llama stack client support #530

Open · wants to merge 14 commits into main from tduval/feature/llamaStackClientSupport
Conversation

@thaddavis commented Nov 23, 2024

📥 Pull Request

📘 Description
This is v1 of support for monitoring Llama Stack applications via the LlamaStackClient class.

🧪 Testing
Included a smoke test for standard completions and streaming completions in the tests/core_manual_tests/providers/llama_stack_client_canary.py file.
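For context, here is a hedged sketch of the kind of flow such a canary exercises. This is not the contents of the test file: the server URL, the model identifier, and parameter names such as `base_url`, `model_id`, and `stream` are assumptions and may differ across llama-stack-client and agentops versions.

```python
# Hypothetical sketch of the canary flow, not the actual test file.
# Assumes a local Llama Stack server and that AGENTOPS_API_KEY is set in the environment.
import agentops
from llama_stack_client import LlamaStackClient

agentops.init()  # start an AgentOps session; instruments supported providers

client = LlamaStackClient(base_url="http://localhost:5000")  # assumed server address

# Standard (non-streaming) chat completion
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",  # assumed model; older clients may use `model=`
    messages=[{"role": "user", "content": "Say hello in one word."}],
    stream=False,
)
print(response)

# Streaming chat completion: iterate over the returned chunks
for chunk in client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    messages=[{"role": "user", "content": "Count to three."}],
    stream=True,
):
    print(chunk)

agentops.end_session("Success")  # close the session so recorded events are flushed
```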

gitguardian bot commented Nov 23, 2024

️✅ There are no secrets present in this pull request anymore.

If these secrets were true positives and are still valid, we highly recommend revoking them.
Once a secret has been leaked into a git repository, you should consider it compromised, even if it was deleted immediately.
More information about the risks can be found here.


🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.

@teocns (Contributor) commented Nov 23, 2024

Hey @thaddavis, congrats on your first PR!
Looks very good to me. I'll just do some cleanup for you, since (1) the commit history leaks an API key, and (2) your branch was based on an older version of upstream and does not pass styling (an auto-format commit hook was recently added).

good job!! 👏

@teocns force-pushed the tduval/feature/llamaStackClientSupport branch from bb35885 to ec8445d on November 23, 2024 at 05:17
codecov bot commented Nov 23, 2024

Codecov Report

Attention: Patch coverage is 20.20202% with 79 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| agentops/llms/llama_stack_client.py | 20.87% | 72 Missing ⚠️ |
| agentops/llms/__init__.py | 12.50% | 7 Missing ⚠️ |

| Flag | Coverage Δ |
| --- | --- |
| unittests | 54.67% <20.20%> (-0.91%) ⬇️ |

Flags with carried forward coverage won't be shown. Click here to find out more.

| Files with missing lines | Coverage Δ |
| --- | --- |
| agentops/llms/__init__.py | 25.00% <12.50%> (-1.09%) ⬇️ |
| agentops/llms/llama_stack_client.py | 20.87% <20.87%> (ø) |


@thaddavis changed the title from "Tduval/feature/llama stack client support" to "tduval/feature/llama stack client support" on Nov 24, 2024