Commit: fix docs
Signed-off-by: Gilad Barnea <[email protected]>
Gilad Barnea committed Aug 31, 2023
1 parent 6dff7fd commit 6e9f9ea
Showing 27 changed files with 209 additions and 183 deletions.
README.md (27 changes: 18 additions & 9 deletions)
@@ -45,23 +45,23 @@ pip install declarai

### Setup
```bash
- export DECLARAI_OPENAI_API_KEY=<your openai token>
+ export OPENAI_API_KEY=<your openai token>
```
or pass the token when initializing the declarai object
```python
- from declarai import Declarai
+ import declarai

- declarai = Declarai(provider="openai", model="gpt-3.5-turbo", openai_token="<your-openai-key>")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo", openai_token="<your-openai-key>")
```

## 💡 Basic Usage
Craft AI-powered functionalities with ease using the `@task` decorator. Just add some type hints and a bit of documentation, and watch Declarai do its magic!
```python
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @openai.task
+ @gpt_35.task
def generate_poem(title: str) -> str:
"""
Write a 4 line poem on the provided title
@@ -96,7 +96,11 @@ The resulting code is readable and easily maintainable.

Python primitives
```python
- @openai.task
+ import declarai
+
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")
+
+ @gpt_35.task
def rank_by_severity(message: str) -> int:
"""
Rank the severity of the provided message by its urgency.
@@ -116,7 +120,11 @@ rank_by_severity(message="How was your weekend?"))

Python complex objects
```python
- @openai.task
+ import declarai
+
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")
+
+ @gpt_35.task
def datetime_parser(raw_date: str) -> datetime:
"""
Parse the input into a valid datetime string of the format YYYY-mm-ddThh:mm:ss
@@ -132,13 +140,14 @@ datetime_parser(raw_date="January 1st 2020"))

pydantic models
```python
+ from pydantic import BaseModel
class Animal(BaseModel):
name: str
family: str
leg_count: int


- @openai.task
+ @gpt_35.task
def suggest_animals(location: str) -> Dict[int, List[Animal]]:
"""
Create a list of numbers from 0 to 5
@@ -168,7 +177,7 @@ suggest_animals(location="jungle")

### Simple Chat interface
```python
- @openai.experimental.chat
+ @gpt_35.experimental.chat
class CalculatorBot:
"""
You are a calculator bot,
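Note: the chat example above is cut off by the hunk boundary. As a rough, illustrative sketch of how such a chat class is typically used (it assumes the `send` method shown elsewhere in declarai's chat documentation; the variable names are made up):

```python
# Illustrative only: instantiate the chat class defined above and exchange
# messages with it. Assumes the chat instance exposes a `send` method.
calc = CalculatorBot()

answer = calc.send("What is 2 + 2?")
print(answer)
```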
docs/beginners-guide/controlling-task-behavior.md (63 changes: 34 additions & 29 deletions)
@@ -11,20 +11,23 @@ Controlling these parameters is key to achieving the desired results from the model.
### Passing parameters to the task :label:

In the following example, we'll create a task that suggests movies to watch based on a given input.

```python
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")


- @openai.task
+ @gpt_35.task
def movie_recommender(user_input: str): # (1)!
"""
Recommend a movie to watch based on the user input
:param user_input: The user's input
""" # (2)!
```

1. Notice how providing a type hint for the `user_input` parameter allows declarai to understand the expected input type.
2. Adding the param to the docstring allows declarai to communicate the **meaning** of this parameter to the model.

```python
@@ -33,71 +36,77 @@ print(movie_recommender(user_input="I want to watch a movie about space"))
```

### Using return types to control the output :gear:

This is a good start,
but let's say we want to have a selection of movies instead of a single suggestion.

```python
from typing import List
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @openai.task
+ @gpt_35.task
def movie_recommender(user_input: str) -> List[str]: # (1)!
"""
Recommend a selection of movies to watch based on the user input
:param user_input: The user's input
:return: A list of movie recommendations
""" # (2)!
```

1. Adding a return type hint allows declarai to parse the output of the LLM into the provided type,
in our case a list of strings.
2. Explaining the return value aids the model in returning the expected output and avoiding hallucinations.

```python
print(movie_recommender(user_input="I want to watch a movie about space"))
> ['Interstellar', 'Gravity', 'The Martian', 'Apollo 13', '2001: A Space Odyssey', 'Moon', 'Sunshine', 'Contact', 'The Right Stuff', 'Hidden Figures']
```


!!! info

Notice how the text in our documentation has changed from singular to plural form.
Maintaining consistency between the task's description and the return type is important for the model to understand the expected output.<br>
For more best practices, see [here](../../best-practices).


Awesome!

Now we have a list of movies to choose from!

But what if we want to go even further :thinking:? <br>
Let's say we want the model to also provide a short description of each movie.

```python
from typing import Dict
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @openai.task
+ @gpt_35.task
def movie_recommender(user_input: str) -> Dict[str, str]: # (1)!
"""
Recommend a selection of movies to watch based on the user input
For each movie provide a short description as well
:param user_input: The user's input
:return: A dictionary of movie names and descriptions
""" # (2)!
```

1. We've updated the return type so the task produces a dictionary of movie names and descriptions.
2. We reinforce the description of the return value in the docstring to ensure the model understands the expected output.

```python
print(movie_recommender(user_input="I want to watch a movie about space"))
> {
'Interstellar': "A team of explorers travel through a wormhole in space in an attempt to ensure humanity's survival.",
'Gravity': 'Two astronauts work together to survive after an accident leaves them stranded in space.',
'The Martian': 'An astronaut is left behind on Mars after his team assumes he is dead and must find a way to survive and signal for rescue.',
'Apollo 13': 'The true story of the Apollo 13 mission, where an explosion in space jeopardizes the lives of the crew and their safe return to Earth.',
'2001: A Space Odyssey': "A journey through human evolution and the discovery of a mysterious black monolith that may hold the key to humanity's future."
}
```

@@ -111,10 +120,6 @@ print(movie_recommender(user_input="I want to watch a movie about space"))
Try experimenting with various descriptions and see how far you can push the model's understanding!
Who knows what you'll find :open_mouth:!





<div style="display: flex; justify-content: space-between;">
<a href="../simple-task" class="md-button">
Previous <i class="fas fa-arrow-left"></i>
docs/beginners-guide/debugging-tasks.md (4 changes: 2 additions & 2 deletions)
@@ -17,9 +17,9 @@ Let's take the last task from the previous section and add a call to the `compile`
from typing import Dict
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @openai.task
+ @gpt_35.task
def movie_recommender(user_input: str) -> Dict[str, str]:
"""
Recommend a selection of movies to watch based on the user input
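Note: the hunk above ends before the `compile` call itself. A minimal sketch of the idea, assuming the task's `compile` method accepts the same keyword arguments as a regular call (the exact shape of the compiled output may differ):

```python
# Inspect the prompt that would be sent to the model, without invoking the LLM.
compiled_prompt = movie_recommender.compile(user_input="I want to watch a movie about space")
print(compiled_prompt)
```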
docs/beginners-guide/index.md (2 changes: 1 addition & 1 deletion)
@@ -31,7 +31,7 @@ Once completed, the rest of the examples in this module should be as simple as c
```python title="declarai_tutorial.py"
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo", openai_token="<your-openai-token>")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo", openai_token="<your-openai-token>")
```


docs/beginners-guide/simple-task.md (4 changes: 2 additions & 2 deletions)
@@ -9,9 +9,9 @@ The simplest Declarai usage is a function decorated with `@task`:
```py
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @openai.task
+ @gpt_35.task
def say_something() -> str:
"""
Say something short to the world
docs/best-practices/index.md (16 changes: 8 additions & 8 deletions)
@@ -22,8 +22,8 @@ Reviewing the movie recommender example from the beginner's guide, we can see a
from typing import Dict
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
- @openai.task
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")
+ @gpt_35.task
def movie_recommender(user_input: str) -> Dict[str, str]:
"""
Recommend a selection of movies to watch based on the user input
@@ -58,9 +58,9 @@ For example, in the following, the prompt is written in single form, while the r
from typing import List
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @openai.task
+ @gpt_35.task
def movie_recommender(user_input: str) -> List[str]:
"""
Recommend a movie to watch based on the user input
@@ -74,8 +74,8 @@ Instead, we could write the prompt as follows:
from typing import List
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
- @openai.task
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")
+ @gpt_35.task
def movie_recommender(user_input: str) -> List[str]:
"""
Recommend a selection of movies to watch based on the user input
@@ -94,9 +94,9 @@ For example in this implementation of a calculator bot, the bot usually returns
from typing import Union
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @openai.experimental.chat
+ @gpt_35.experimental.chat
class CalculatorBot:
"""
You are a calculator bot,
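Note: to make the `Union` guidance concrete, here is a hedged, illustrative sketch of the same idea applied to a task rather than a chat; this example is not part of the diff and the task name is made up:

```python
from typing import Union

import declarai

gpt_35 = declarai.openai(model="gpt-3.5-turbo")


@gpt_35.task
def evaluate_expression(expression: str) -> Union[float, str]:
    """
    Evaluate the provided arithmetic expression
    Return the numeric result, or a short error message if the expression is invalid
    :param expression: An arithmetic expression such as "10 / 4"
    :return: The result as a number, or an error string describing the problem
    """
```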
docs/examples/deployments/index.md (4 changes: 2 additions & 2 deletions)
@@ -13,10 +13,10 @@ from fastapi import FastAPI, APIRouter
import declarai
app = FastAPI()
router = APIRouter()
- openai = declarai.openai(model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")


- @openai.task
+ @gpt_35.task
def movie_recommender(user_input: str) -> Dict[str, str]:
"""
Recommend a selection of real movies to watch based on the user input
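Note: the deployment example is truncated before the route definition. A hedged sketch of how the task above might be wired into the FastAPI app (the route path and wiring are illustrative, not taken from this diff):

```python
# Illustrative route: the declarai task is a plain callable, so it can be
# invoked directly from an endpoint.
@router.get("/movie-recommendations")
def recommend(user_input: str):
    return movie_recommender(user_input=user_input)


app.include_router(router)
```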
docs/features/chat/advanced-initialization.md (4 changes: 2 additions & 2 deletions)
@@ -11,8 +11,8 @@ Let's see how we can initialize a chatbot by passing the `system` and `greeting`
```py
import declarai

- openai = declarai.openai(model="gpt-3.5-turbo")
- @openai.experimental.chat
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")
+ @gpt_35.experimental.chat
class SQLBot:
...

docs/features/chat/chat-memory/file-memory.md (11 changes: 6 additions & 5 deletions)
@@ -5,11 +5,11 @@ For chat that requires a persistent message history, you can use a file to store
## Set file memory

```py
- from declarai import Declarai
+ import declarai
from declarai.memory import FileMessageHistory
- declarai = Declarai(provider="openai", model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @declarai.experimental.chat(chat_history=FileMessageHistory("sql_bot_history.txt")) # (1)!
+ @gpt_35.experimental.chat(chat_history=FileMessageHistory("sql_bot_history.txt")) # (1)!
class SQLBot:
"""
You are a SQL assistant. You help with SQL-related questions with one-line answers.
@@ -28,10 +28,11 @@ We can also initialize the `FileMessageHistory` class with a custom file path.
In case you want to set the file memory at runtime, you can use the `set_memory` method.

```py
- from declarai import Declarai
+ import declarai
from declarai.memory import FileMessageHistory
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @declarai.experimental.chat
+ @gpt_35.experimental.chat
class SQLBot:
"""
You are a SQL assistant. You help with SQL-related questions with one-line answers.
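Note: the snippet above is cut off before the runtime call itself. A minimal sketch, assuming `set_memory` is called on the instantiated chat (the file name is only an example):

```python
# Attach file-backed message history to an existing chat instance at runtime.
sql_bot = SQLBot()
sql_bot.set_memory(FileMessageHistory("sql_bot_history.txt"))
```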
docs/features/chat/chat-memory/index.md (12 changes: 6 additions & 6 deletions)
@@ -61,12 +61,12 @@ If you prefer to have a persistent history, you can use the `FileMessageHistory`
Setting up a memory is done by passing `chat_history` as a keyword argument to the `declarai.experimental.chat` decorator.

```py
- from declarai import Declarai
+ import declarai
from declarai.memory import FileMessageHistory

- declarai = Declarai(provider="openai", model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @declarai.experimental.chat(chat_history=FileMessageHistory("sql_bot_history.txt")) # (1)!
+ @gpt_35.experimental.chat(chat_history=FileMessageHistory("sql_bot_history.txt")) # (1)!
class SQLBot:
"""
You are a SQL assistant. You help with SQL-related questions with one-line answers.
@@ -78,12 +78,12 @@ class SQLBot:
We can also initialize the `chat_history` at runtime:

```py
- from declarai import Declarai
+ import declarai
from declarai.memory import FileMessageHistory

- declarai = Declarai(provider="openai", model="gpt-3.5-turbo")
+ gpt_35 = declarai.openai(model="gpt-3.5-turbo")

- @declarai.experimental.chat
+ @gpt_35.experimental.chat
class SQLBot:
"""
You are a SQL assistant. You help with SQL-related questions with one-line answers.
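Note: the second snippet is truncated before the instantiation. A hedged sketch of runtime initialization, assuming the generated chat class accepts `chat_history` as a constructor argument:

```python
# Provide the message history when the chat is instantiated rather than in
# the decorator; the file path is only an example.
sql_bot = SQLBot(chat_history=FileMessageHistory("sql_bot_history.txt"))
```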