added support for asynchronous chat_completion with `OllamaInstructorAsyncClient` class; added test for functions in `cleaner.py` and the functions `chat_completion/_with_stream`, updated requirements.txt; updated version to 0.2.0
lennartpollvogt committed Jun 15, 2024
1 parent cb43a1b commit 4c8ef2a
Showing 11 changed files with 647 additions and 339 deletions.
3 changes: 3 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -158,3 +158,6 @@ cython_debug/
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
async_stream_test.py
async_test.py
test.py
51 changes: 49 additions & 2 deletions README.md
@@ -67,6 +67,53 @@ Output:
{"name": "Jason", "age": 30}
```

**asynchronous chat completion**:
```python
from pydantic import BaseModel, ConfigDict
from enum import Enum
from typing import List
import rich
import asyncio

from ollama_instructor.ollama_instructor_client import OllamaInstructorAsyncClient

class Gender(Enum):
    MALE = 'male'
    FEMALE = 'female'

class Person(BaseModel):
    '''
    This model defines a person.
    '''
    name: str
    age: int
    gender: Gender
    friends: List[str] = []

    model_config = ConfigDict(
        extra='forbid'
    )

async def main():
    client = OllamaInstructorAsyncClient(...)
    await client.async_init()  # Important: call the asynchronous initialization

    response = await client.chat_completion(
        model='phi3:instruct',
        pydantic_model=Person,
        messages=[
            {
                'role': 'user',
                'content': 'Jason is 25 years old. Jason loves to play soccer with his friends Nick and Gabriel. His favorite food is pizza.'
            }
        ],
    )
    rich.print(response['message']['content'])

if __name__ == "__main__":
    asyncio.run(main())
```
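For illustration, here is a minimal, self-contained sketch (it does not touch `ollama-instructor` or a running Ollama server) of the validation step the client performs against the `Person` schema. The raw JSON strings are hypothetical model outputs; note how `extra='forbid'` rejects keys that are not part of the schema:

```python
from enum import Enum
from typing import List
from pydantic import BaseModel, ConfigDict, ValidationError

class Gender(Enum):
    MALE = 'male'
    FEMALE = 'female'

class Person(BaseModel):
    name: str
    age: int
    gender: Gender
    friends: List[str] = []

    model_config = ConfigDict(extra='forbid')

# A well-formed response is parsed into a typed Person instance.
raw = '{"name": "Jason", "age": 25, "gender": "male", "friends": ["Nick", "Gabriel"]}'
person = Person.model_validate_json(raw)
print(person.name, person.age)  # Jason 25

# An extra key violates extra='forbid' and raises a ValidationError.
try:
    Person.model_validate_json('{"name": "Jason", "age": 25, "gender": "male", "hobby": "soccer"}')
except ValidationError:
    print('rejected extra key')
```

This is the Pydantic v2 API (`model_validate_json`); the `ValidationError` it raises is what drives the library's retry/cleaning behavior.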

**chat completion with streaming**:
```python
from ollama_instructor.ollama_instructor_client import OllamaInstructorClient
@@ -92,9 +139,9 @@ for chunk in response:
    print(chunk['message']['content'])
```

## OllamaInstructorClient
## OllamaInstructorClient and OllamaInstructorAsyncClient

The class `OllamaInstructorClient` is the main class of the `ollama-instructor` library. It is the the wrapper around the `Ollama` client and contains the following arguments:
The classes `OllamaInstructorClient` and `OllamaInstructorAsyncClient` are the main classes of the `ollama-instructor` library. They wrap the `Ollama` client and take the following arguments:
- `host`: the URL of the Ollama server (default: `http://localhost:11434`). See documentation of [Ollama](https://github.com/ollama/ollama)
- `debug`: a `bool` indicating whether to print debug messages (default: `False`).

7 changes: 5 additions & 2 deletions Roadmap.md
@@ -6,12 +6,15 @@ The current features of the `ollama-instructor` library are mentioned in the [RE

**Planned features**:

- [ ] Async client for chat_completion and chat_completion_with_stream
- [x] Async client
- [x] chat_completion
- [ ] chat_completion_with_stream
- [ ] Advanced logging

**General enhancements**:
- [ ] Add more docs and guides + structure docs
- [ ] Advanced testing
- [x] Advanced testing
- [ ] Add more tests

---
> *Note*: I'm still discovering new fields of use cases, so if you have any suggestions or ideas, please feel free to open an issue or pull request. And I would be super happy to have pull requests for example code (see [examples](/examples/)).
2 changes: 0 additions & 2 deletions ollama_instructor/cleaner.py
@@ -6,7 +6,6 @@

from icecream import ic

# TODO: ref
'''
NOTE:
The function "create_partial_model" in the following was created with the research and help of Phind-70b from www.phind.com.
@@ -112,4 +111,3 @@ def set_nested_value(data: Any, loc: tuple, error_type: str):
    except Exception as e:
        ic()
        raise e

