Commit e5f312b
GitHub Actions committed Oct 25, 2024 · 1 parent bfaa003
Showing 5 changed files with 189 additions and 21 deletions.
@@ -0,0 +1,181 @@
---
id: outscale
title: Outscale
sidebar_position: 3.26
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

## Introduction

Mistral AI models are available on the Outscale platform as managed deployments.
Through the Outscale marketplace, you can subscribe to a Mistral service that will,
on your behalf, provision a virtual machine and a GPU, then deploy the model on it.

As of today, the following models are available:

- Mistral Small (2409)
- Codestral

For more details, visit the [models](../../../getting-started/models/models_overview) page.

## Getting started

The following sections outline the steps to query a Mistral model on the Outscale platform.

### Deploying the model

Follow the steps described in the
[Outscale documentation](https://docs.outscale.com/en/userguide/Subscribing-To-a-Mistral-Service-and-Deploying-it.html)
to deploy a service with the model of your choice.

### Querying the model (chat completion)

Deployed models expose a REST API that you can query using Mistral's SDK or plain HTTP calls.
To run the examples below, you will need to set the following environment variables:

- `OUTSCALE_SERVER_URL`: the URL of the VM hosting your Mistral model
- `OUTSCALE_MODEL_NAME`: the name of the model to query (e.g. `small`, `codestral`)
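
Since the deployment exposes a plain REST endpoint, you can also query it without the SDK. The snippet below is a minimal sketch using only Python's standard library; it mirrors the request body of the cURL example that follows and assumes the two environment variables above are set.

```python
import json
import os
import urllib.request

# Build the chat completions URL from the server address of your deployment.
url = f"{os.environ['OUTSCALE_SERVER_URL']}/v1/chat/completions"

payload = {
    "model": os.environ["OUTSCALE_MODEL_NAME"],
    "temperature": 0,
    "messages": [
        {"role": "user", "content": "Who is the best French painter? Answer in one short sentence."}
    ],
    "stream": False,
}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    method="POST",
)

# The response follows the usual chat completion schema: choices[0].message.content.
with urllib.request.urlopen(request) as response:
    body = json.load(response)

print(body["choices"][0]["message"]["content"])
```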

<Tabs>
<TabItem value="curl" label="cURL">
```bash
echo $OUTSCALE_SERVER_URL/v1/chat/completions
echo $OUTSCALE_MODEL_NAME
curl --location $OUTSCALE_SERVER_URL/v1/chat/completions \
    --header "Content-Type: application/json" \
    --header "Accept: application/json" \
    --data '{
        "model": "'"$OUTSCALE_MODEL_NAME"'",
        "temperature": 0,
        "messages": [
            {"role": "user", "content": "Who is the best French painter? Answer in one short sentence."}
        ],
        "stream": false
    }'
```
</TabItem>
<TabItem value="python" label="Python"> | ||
```python | ||
import os | ||
from mistralai import Mistral | ||
|
||
client = Mistral(server_url=os.environ["OUTSCALE_SERVER_URL"]) | ||
|
||
resp = client.chat.complete( | ||
model=os.environ["OUTSCALE_MODEL_NAME"], | ||
messages=[ | ||
{ | ||
"role": "user", | ||
"content": "Who is the best French painter? Answer in one short sentence.", | ||
} | ||
], | ||
temperature=0 | ||
) | ||
|
||
print(resp.choices[0].message.content) | ||
``` | ||
</TabItem> | ||
<TabItem value="ts" label="TypeScript"> | ||
```typescript | ||
import { Mistral } from "@mistralai/mistralai"; | ||
|
||
const client = new Mistral({ | ||
serverURL: process.env.OUTSCALE_SERVER_URL || "" | ||
}); | ||
|
||
const modelName = process.env.OUTSCALE_MODEL_NAME|| ""; | ||
|
||
async function chatCompletion(user_msg: string) { | ||
const resp = await client.chat.complete({ | ||
model: modelName, | ||
messages: [ | ||
{ | ||
content: user_msg, | ||
role: "user", | ||
}, | ||
], | ||
}); | ||
if (resp.choices && resp.choices.length > 0) { | ||
console.log(resp.choices[0]); | ||
} | ||
} | ||
|
||
chatCompletion("Who is the best French painter? Answer in one short sentence."); | ||
``` | ||
</TabItem> | ||
</Tabs> | ||
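
The cURL payload above sets `"stream": false`; if your deployment supports streaming, the SDK can also consume the response token by token. Below is a minimal Python sketch, assuming the same environment variables as above.

```python
import os
from mistralai import Mistral

client = Mistral(server_url=os.environ["OUTSCALE_SERVER_URL"])

# Request a streamed response and print tokens as they arrive.
stream = client.chat.stream(
    model=os.environ["OUTSCALE_MODEL_NAME"],
    messages=[
        {"role": "user", "content": "Who is the best French painter? Answer in one short sentence."}
    ],
    temperature=0,
)

for event in stream:
    delta = event.data.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```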

### Querying the model (FIM completion)

Codestral can be queried using an additional completion mode called fill-in-the-middle (FIM).
For more information, see the
[code generation section](../../../capabilities/code_generation/#fill-in-the-middle-endpoint).

<Tabs>
<TabItem value="curl" label="cURL">
```bash
curl --location $OUTSCALE_SERVER_URL/v1/fim/completions \
    --header "Content-Type: application/json" \
    --header "Accept: application/json" \
    --data '{
        "model": "'"$OUTSCALE_MODEL_NAME"'",
        "prompt": "def count_words_in_file(file_path: str) -> int:",
        "suffix": "return n_words",
        "stream": false
    }'
```
</TabItem>
<TabItem value="python" label="Python"> | ||
```python | ||
import os | ||
from mistralai import Mistral | ||
|
||
client = Mistral(server_url=os.environ["OUTSCALE_SERVER_URL"]) | ||
|
||
resp = client.fim.complete( | ||
model = os.environ["OUTSCALE_MODEL_NAME"], | ||
prompt="def count_words_in_file(file_path: str) -> int:", | ||
suffix="return n_words" | ||
) | ||
|
||
print(resp.choices[0].message.content) | ||
``` | ||
</TabItem> | ||
<TabItem value="ts" label="TypeScript"> | ||
```typescript | ||
import { Mistral} from "@mistralai/mistralai"; | ||
|
||
const client = new Mistral({ | ||
serverURL: process.env.OUTSCALE_SERVER_URL || "" | ||
}); | ||
|
||
const modelName = "codestral"; | ||
|
||
async function fimCompletion(prompt: string, suffix: string) { | ||
const resp = await client.fim.complete({ | ||
model: modelName, | ||
prompt: prompt, | ||
suffix: suffix | ||
}); | ||
if (resp.choices && resp.choices.length > 0) { | ||
console.log(resp.choices[0]); | ||
} | ||
} | ||
|
||
fimCompletion("def count_words_in_file(file_path: str) -> int:", | ||
"return n_words"); | ||
``` | ||
</TabItem> | ||
</Tabs> | ||
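
The FIM examples above let the model decide when to stop generating. If you want to bound the completion, the sketch below assumes the FIM endpoint of your deployment also accepts the usual sampling parameters such as `max_tokens` and `temperature`.

```python
import os
from mistralai import Mistral

client = Mistral(server_url=os.environ["OUTSCALE_SERVER_URL"])

# Cap the completion length and keep sampling deterministic
# (assumes max_tokens and temperature are accepted by the deployment).
resp = client.fim.complete(
    model=os.environ["OUTSCALE_MODEL_NAME"],
    prompt="def count_words_in_file(file_path: str) -> int:",
    suffix="return n_words",
    max_tokens=64,
    temperature=0,
)

print(resp.choices[0].message.content)
```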

## Going further

For more information and examples, you can check:

- The [Outscale documentation](https://docs.outscale.com/en/userguide/Subscribing-To-a-Mistral-Service-and-Deploying-it.html)
  explaining how to subscribe to a Mistral service and deploy it.
@@ -1 +1 @@
-v0.0.93
+v0.0.15