Modify openai model to receive openai client as argument during initialization #593

Merged 2 commits on Mar 7, 2024
Changes from all commits
27 changes: 22 additions & 5 deletions docs/reference/models/openai.md
@@ -1,4 +1,4 @@
# Generate text with the OpenAI and compatible APIs

!!! Installation

@@ -16,16 +16,33 @@ print(type(model))
# OpenAI
```

Outlines also supports Azure OpenAI models:

```python
from outlines import models

model = models.azure_openai(
    api_version="2023-07-01-preview",
    azure_endpoint="https://example-endpoint.openai.azure.com",
)
```

More generally, you can use any API client compatible with the OpenAI interface by passing an instance of the client, a configuration, and optionally the corresponding tokenizer (if you want to be able to use `outlines.generate.choice`):

```python
from openai import AsyncOpenAI
import tiktoken

from outlines.models.openai import OpenAI, OpenAIConfig

config = OpenAIConfig(model="gpt-4")
client = AsyncOpenAI()
tokenizer = tiktoken.encoding_for_model("gpt-4")

model = OpenAI(client, config, tokenizer)
```
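The benefit of receiving the client at initialization, rather than constructing it internally, is that callers can inject a pre-configured or fake client. The dependency-injection pattern behind this change, reduced to plain Python (the `FakeClient`, `Config`, and `Model` names below are illustrative, not part of the Outlines API):

```python
from dataclasses import dataclass


@dataclass
class Config:
    model: str


class FakeClient:
    """Stands in for AsyncOpenAI in tests; returns a canned completion."""

    def complete(self, prompt: str) -> str:
        return f"[{self.__class__.__name__}] echo: {prompt}"


class Model:
    """Receives the client as an argument instead of creating it itself."""

    def __init__(self, client, config: Config):
        self.client = client
        self.config = config

    def __call__(self, prompt: str) -> str:
        return self.client.complete(prompt)


model = Model(FakeClient(), Config(model="gpt-4"))
print(model("hello"))  # answered by the fake client, no network call
```

Because the client is an ordinary constructor argument, the same `Model` works with a real `AsyncOpenAI` instance in production and a stub in tests.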

## Monitoring API use

12 changes: 7 additions & 5 deletions examples/pick_odd_one_out.py
@@ -29,13 +29,15 @@ def build_ooo_prompt(options):
"""


options = ["sea", "mountains", "plains", "sock"]

model = models.openai("gpt-3.5-turbo")
gen_text = outlines.generate.text(model)
gen_choice = outlines.generate.choice(model, options)

prompt = build_ooo_prompt(options)
reasoning = gen_text(prompt, stop_at=["Pick the odd word", "So the odd one"])
prompt += reasoning
result = gen_choice(prompt)
prompt += result
print(prompt)
print(result)
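The `stop_at` argument in the example above cuts generation off at the first stop sequence encountered. The truncation it performs can be sketched in plain Python (`truncate_at` is a hypothetical helper, not the Outlines implementation):

```python
def truncate_at(text: str, stop_at: list[str]) -> str:
    """Cut text at the earliest occurrence of any stop sequence."""
    cut = len(text)
    for stop in stop_at:
        idx = text.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]


raw = "The sock is the odd one. Pick the odd word again."
print(truncate_at(raw, ["Pick the odd word", "So the odd one"]))
# -> "The sock is the odd one. "
```

Splitting text generation (`gen_text`) from constrained choice (`gen_choice`) lets the example first elicit free-form reasoning, then force the final answer to be one of the listed options.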
2 changes: 0 additions & 2 deletions outlines/models/__init__.py
@@ -7,12 +7,10 @@
"""
from typing import Union

from .exllamav2 import ExLlamaV2Model, exl2
from .llamacpp import LlamaCpp, llamacpp
from .mamba import Mamba, mamba
from .openai import OpenAI, openai
from .transformers import Transformers, transformers

LogitsGenerator = Union[Transformers, LlamaCpp, ExLlamaV2Model, Mamba]
119 changes: 0 additions & 119 deletions outlines/models/azure.py

This file was deleted.
