Commit

Revert "Merge branch 'andrewyng:main' into vargacypher/add-google-fun…
Browse files Browse the repository at this point in the history
…ction-calling"

This reverts commit 9da9ba5, reversing
changes made to 1e2533e.
vargacypher committed Dec 17, 2024
1 parent da0a998 commit 6e2d78d
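For reference, a revert of a merge commit like 9da9ba5 is normally produced with `git revert -m <parent-number> <merge-sha>`; the sketch below assumes the merge's first parent is the side being kept.

```shell
# Hypothetical reproduction of this revert; the -m 1 parent choice is an assumption.
git revert -m 1 9da9ba5
# Git then generates the message shown above:
#   Revert "Merge branch 'andrewyng:main' into vargacypher/add-google-function-calling"
#   This reverts commit 9da9ba5, reversing changes made to 1e2533e.
```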
Showing 30 changed files with 261 additions and 1,147 deletions.
13 changes: 1 addition & 12 deletions .env.sample
@@ -18,21 +18,10 @@ GOOGLE_REGION=
GOOGLE_PROJECT_ID=

# Hugging Face token
HF_TOKEN=
HUGGINGFACE_TOKEN=

# Fireworks
FIREWORKS_API_KEY=

# Together AI
TOGETHER_API_KEY=

# WatsonX
WATSONX_SERVICE_URL=
WATSONX_API_KEY=
WATSONX_PROJECT_ID=

# xAI
XAI_API_KEY=

# Sambanova
SAMBANOVA_API_KEY=
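Judging by the `.env.sample` and `huggingface_provider.py` hunks in this revert, the Hugging Face credential variable goes back to `HUGGINGFACE_TOKEN` (replacing `HF_TOKEN`). A minimal sketch of setting it, with a placeholder value:

```shell
# Assumes the post-revert provider code, which reads HUGGINGFACE_TOKEN;
# "hf_xxxxxxxx" is a hypothetical placeholder, not a real token.
export HUGGINGFACE_TOKEN="hf_xxxxxxxx"
```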
2 changes: 1 addition & 1 deletion .github/workflows/run_pytest.yml
@@ -18,7 +18,7 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install poetry
poetry install --all-extras --with test
poetry install
- name: Test with pytest
run: poetry run pytest
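The workflow hunk above swaps `poetry install --all-extras --with test` for a plain `poetry install`. A rough local equivalent of the resulting CI job, assuming Poetry and pytest are configured in the repository's `pyproject.toml`:

```shell
# Sketch of running the CI steps from run_pytest.yml locally
# (assumes Python and a checkout of the repository are already available).
python -m pip install --upgrade pip
pip install poetry
poetry install        # the --all-extras / --with test flags are dropped by this revert
poetry run pytest
```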

9 changes: 0 additions & 9 deletions .gitignore
@@ -4,12 +4,3 @@ __pycache__/
env/
.env
.google-adc

# Testing
.coverage

# pyenv
.python-version

.DS_Store
**/.DS_Store
17 changes: 3 additions & 14 deletions README.md
@@ -1,14 +1,13 @@
# aisuite

[![PyPI](https://img.shields.io/pypi/v/aisuite)](https://pypi.org/project/aisuite/)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

Simple, unified interface to multiple Generative AI providers.

`aisuite` makes it easy for developers to use multiple LLM through a standardized interface. Using an interface similar to OpenAI's, `aisuite` makes it easy to interact with the most popular LLMs and compare the results. It is a thin wrapper around python client libraries, and allows creators to seamlessly swap out and test responses from different LLM providers without changing their code. Today, the library is primarily focussed on chat completions. We will expand it cover more use cases in near future.

Currently supported providers are -
OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, HuggingFace Ollama, Sambanova and Watsonx.
OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, HuggingFace and Ollama.
To maximize stability, `aisuite` uses either the HTTP endpoint or the SDK for making calls to the provider.

## Installation
@@ -22,13 +21,11 @@ pip install aisuite
```

This installs aisuite along with anthropic's library.

```shell
pip install 'aisuite[anthropic]'
```

This installs all the provider-specific libraries

```shell
pip install 'aisuite[all]'
```
@@ -44,14 +41,12 @@ You can use tools like [`python-dotenv`](https://pypi.org/project/python-dotenv/
Here is a short example of using `aisuite` to generate chat completion responses from gpt-4o and claude-3-5-sonnet.

Set the API keys.

```shell
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```

Use the python client.

```python
import aisuite as ai
client = ai.Client()
@@ -72,7 +67,6 @@ for model in models:
print(response.choices[0].message.content)

```

Note that the model name in the create() call uses the format - `<provider>:<model-name>`.
`aisuite` will call the appropriate provider with the right parameters based on the provider value.
For a list of provider values, you can look at the directory - `aisuite/providers/`. The list of supported providers are of the format - `<provider>_provider.py` in that directory. We welcome providers adding support to this library by adding an implementation file in this directory. Please see section below for how to contribute.
@@ -85,10 +79,9 @@ aisuite is released under the MIT License. You are free to use, modify, and dist

## Contributing

If you would like to contribute, please read our [Contributing Guide](https://github.com/andrewyng/aisuite/blob/main/CONTRIBUTING.md) and join our [Discord](https://discord.gg/T6Nvn8ExSb) server!
If you would like to contribute, please read our [Contributing Guide](CONTRIBUTING.md) and join our [Discord](https://discord.gg/T6Nvn8ExSb) server!

## Adding support for a provider

We have made easy for a provider or volunteer to add support for a new platform.

### Naming Convention for Provider Modules
@@ -98,24 +91,20 @@ We follow a convention-based approach for loading providers, which relies on str
- The provider's module file must be named in the format `<provider>_provider.py`.
- The class inside this module must follow the format: the provider name with the first letter capitalized, followed by the suffix `Provider`.

#### Examples
#### Examples:

- **Hugging Face**:
The provider class should be defined as:

```python
class HuggingfaceProvider(BaseProvider)
```

in providers/huggingface_provider.py.

- **OpenAI**:
The provider class should be defined as:

```python
class OpenaiProvider(BaseProvider)
```

in providers/openai_provider.py

This convention simplifies the addition of new providers and ensures consistency across provider implementations.
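Since the README section above maps each `<provider>_provider.py` module to a `<provider>:<model-name>` value, a quick way to list the providers present after this revert is to glob that directory (a sketch, run from the repository root):

```shell
# Each matching file corresponds to one provider value usable in the model string.
ls aisuite/providers/*_provider.py
```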
2 changes: 1 addition & 1 deletion aisuite/framework/message.py
@@ -1,4 +1,4 @@
"""Interface to hold contents of api responses when they do not confirm to the OpenAI style response"""
"""Interface to hold contents of api responses when they do not conform to the OpenAI style response"""

from pydantic import BaseModel
from typing import Literal, Optional
2 changes: 1 addition & 1 deletion aisuite/providers/aws_provider.py
@@ -14,7 +14,7 @@ class BedrockConfig:

def __init__(self, **config):
self.region_name = config.get(
"region_name", os.getenv("AWS_REGION", "us-west-2")
"region_name", os.getenv("AWS_REGION_NAME", "us-west-2")
)

def create_client(self):
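If the added line in the hunk above reflects the post-revert state, the Bedrock config falls back to the `AWS_REGION_NAME` environment variable (default `us-west-2`). A minimal sketch of overriding it (the region value is just an example):

```shell
# Hypothetical override; BedrockConfig defaults to us-west-2 when this is unset.
export AWS_REGION_NAME="us-east-1"
```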
37 changes: 0 additions & 37 deletions aisuite/providers/cohere_provider.py

This file was deleted.

4 changes: 2 additions & 2 deletions aisuite/providers/huggingface_provider.py
@@ -19,10 +19,10 @@ def __init__(self, **config):
The token is fetched from the config or environment variables.
"""
# Ensure API key is provided either in config or via environment variable
self.token = config.get("token") or os.getenv("HF_TOKEN")
self.token = config.get("token") or os.getenv("HUGGINGFACE_TOKEN")
if not self.token:
raise ValueError(
"Hugging Face token is missing. Please provide it in the config or set the HF_TOKEN environment variable."
"Hugging Face token is missing. Please provide it in the config or set the HUGGINGFACE_TOKEN environment variable."
)

# Optionally set a custom timeout (default to 30s)
30 changes: 0 additions & 30 deletions aisuite/providers/sambanova_provider.py

This file was deleted.

39 changes: 0 additions & 39 deletions aisuite/providers/watsonx_provider.py

This file was deleted.

65 changes: 0 additions & 65 deletions aisuite/providers/xai_provider.py

This file was deleted.

6 changes: 3 additions & 3 deletions examples/client.ipynb
@@ -122,7 +122,7 @@
"source": [
"# IMP NOTE: Azure expects model endpoint to be passed in the format of \"azure:<model_name>\".\n",
"# The model name is the deployment name in Project/Deployments.\n",
"# In the example below, the model is \"mistral-large-2407\", but the name given to the\n",
"# In the exmaple below, the model is \"mistral-large-2407\", but the name given to the\n",
"# deployment is \"aisuite-mistral-large-2407\" under the deployments section in Azure.\n",
"client.configure({\"azure\" : {\n",
" \"api_key\": os.environ[\"AZURE_API_KEY\"],\n",
@@ -142,7 +142,7 @@
"source": [
"# HuggingFace expects the model to be passed in the format of \"huggingface:<model_name>\".\n",
"# The model name is the full name of the model in HuggingFace.\n",
"# In the example below, the model is \"mistralai/Mistral-7B-Instruct-v0.3\".\n",
"# In the exmaple below, the model is \"mistralai/Mistral-7B-Instruct-v0.3\".\n",
"# The model is deployed as serverless inference endpoint in HuggingFace.\n",
"hf_model = \"huggingface:mistralai/Mistral-7B-Instruct-v0.3\"\n",
"response = client.chat.completions.create(model=hf_model, messages=messages)\n",
@@ -159,7 +159,7 @@
"\n",
"# Groq expects the model to be passed in the format of \"groq:<model_name>\".\n",
"# The model name is the full name of the model in Groq.\n",
"# In the example below, the model is \"llama3-8b-8192\".\n",
"# In the exmaple below, the model is \"llama3-8b-8192\".\n",
"groq_llama3_8b = \"groq:llama3-8b-8192\"\n",
"# groq_llama3_70b = \"groq:llama3-70b-8192\"\n",
"response = client.chat.completions.create(model=groq_llama3_8b, messages=messages)\n",
5 changes: 1 addition & 4 deletions guides/README.md
@@ -2,16 +2,13 @@

These guides give directions for obtaining API keys from different providers.

Here are the instructions for:
Here're the instructions for:
- [Anthropic](anthropic.md)
- [AWS](aws.md)
- [Azure](azure.md)
- [Cohere](cohere.md)
- [Google](google.md)
- [Hugging Face](huggingface.md)
- [OpenAI](openai.md)
- [SambaNova](sambanova.md)
- [xAI](xai.md)

Unless otherwise stated, these guides have not been endorsed by the providers.

2 changes: 1 addition & 1 deletion guides/anthropic.md
@@ -44,4 +44,4 @@ response = client.chat.completions.create(
print(response.choices[0].message.content)
```

Happy coding! If you would like to contribute, please read our [Contributing Guide](../CONTRIBUTING.md).
Happy coding! If you would like to contribute, please read our [Contributing Guide](CONTRIBUTING.md).