Feature: Add Bitdeer AI model api provider #9362
base: main
Conversation
litellm/main.py
@@ -1666,6 +1666,7 @@ def completion(  # type: ignore  # noqa: PLR0915
         or custom_llm_provider == "mistral"
         or custom_llm_provider == "openai"
         or custom_llm_provider == "together_ai"
+        or custom_llm_provider == "bitdeerai"
Please just update `__init__.py` and register this as an openai_compatible_provider
Removed this; bitdeerai is already added to openai_compatible_provider.
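The registry approach the reviewer asks for can be sketched as follows. Instead of appending another `or custom_llm_provider == "..."` branch for each new provider, providers that speak the OpenAI wire format are listed once and dispatched by membership. The names below (`openai_compatible_providers`, `uses_openai_route`) are illustrative stand-ins, not litellm's actual internals:

```python
# Providers that reuse the OpenAI-format request/response code path.
# Registering a new provider means appending one entry here, rather than
# editing the dispatch logic in completion().
openai_compatible_providers = [
    "mistral",
    "together_ai",
    "bitdeerai",  # the provider added by this PR
]


def uses_openai_route(custom_llm_provider: str) -> bool:
    """Return True if the provider should take the shared OpenAI-format path."""
    return (
        custom_llm_provider == "openai"
        or custom_llm_provider in openai_compatible_providers
    )
```

This keeps the per-provider change to a single line and avoids the long `or` chain shown in the diff above.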
litellm/main.py
            or get_secret("BITDEERAI_API_BASE")
            or "https://api-inference.bitdeer.ai/v1"
        )
        model_response = openai_like_chat_completion.completion(
please use base_llm_http_handler.completion instead
Removed this; added to openai_compatible_provider instead.
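The API-base resolution in the diff above follows a common fallback chain: an explicit argument wins, then the `BITDEERAI_API_BASE` environment variable, then the documented default endpoint. A minimal sketch of that pattern, with `resolve_api_base` as a hypothetical helper name:

```python
import os
from typing import Optional

# Default endpoint taken from the diff in this PR.
DEFAULT_BITDEERAI_API_BASE = "https://api-inference.bitdeer.ai/v1"


def resolve_api_base(api_base: Optional[str] = None) -> str:
    """Resolve the API base: explicit arg -> env var -> default."""
    return (
        api_base
        or os.environ.get("BITDEERAI_API_BASE")
        or DEFAULT_BITDEERAI_API_BASE
    )
```

Because `or` short-circuits on falsy values, passing `api_base=""` also falls through to the environment variable and then the default.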
        ]

        if sync_mode:
            response = completion(
please mock these tests
Fixed; mock test added.
@pytest.mark.asyncio
@pytest.mark.parametrize("sync_mode", [True, False])
async def test_chat_completion_bitdeerai_stream(sync_mode):
same here - please mock this
Fixed; mock test added.
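The "please mock these tests" request above means the unit tests should not hit the real Bitdeer AI endpoint. A hedged sketch of that shape using `unittest.mock`: the canned response dict and the model name are illustrative, not litellm's actual response objects:

```python
from unittest import mock


def fake_completion(model, messages, **kwargs):
    # Deterministic canned response; no network call is made.
    return {
        "choices": [
            {"message": {"role": "assistant", "content": "mocked reply"}}
        ]
    }


def test_chat_completion_bitdeerai_mocked():
    # Stand-in for patching litellm.completion in a real test, e.g. with
    # mock.patch("litellm.completion", side_effect=fake_completion).
    completion = mock.Mock(side_effect=fake_completion)
    resp = completion(
        model="bitdeerai/some-model",  # hypothetical model name
        messages=[{"role": "user", "content": "hi"}],
    )
    assert resp["choices"][0]["message"]["content"] == "mocked reply"
    completion.assert_called_once()
```

This keeps the tests offline and deterministic, which is what CI needs; a separate opt-in integration test can still exercise the live endpoint.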
tests/test_bitdeerai.py
from litellm import completion, acompletion, embedding, aembedding, EmbeddingResponse


@pytest.mark.parametrize("sync_mode", [True, False])
why does this file exist at the root?
Moved this file to the local_testing folder.
Title
Feature: Add Bitdeer AI model api provider
Relevant issues
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement - see details)
- My PR passes all unit tests on `make test-unit` ([contributing guide](https://docs.litellm.ai/docs/extras/contributing_code))

Type
🆕 New Feature
Changes
Screenshots for test:
