Commit
DeepSeek documentation (#1190)
jverre authored Feb 2, 2025
1 parent 4eadcf3 commit 547eeb5
Showing 4 changed files with 144 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -133,6 +133,7 @@ The easiest way to get started is to use one of our integrations. Opik supports:
| Anthropic | Log traces for all Anthropic LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/anthropic?utm_source=opik&utm_medium=github&utm_content=anthropic_link&utm_campaign=opik) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/anthropic.ipynb) |
| Bedrock | Log traces for all Bedrock LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/bedrock?utm_source=opik&utm_medium=github&utm_content=bedrock_link&utm_campaign=opik) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/bedrock.ipynb) |
| CrewAI | Log traces for all CrewAI calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/crewai?utm_source=opik&utm_medium=github&utm_content=crewai_link&utm_campaign=opik) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/crewai.ipynb) |
| DeepSeek | Log traces for all DeepSeek LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/deepseek?utm_source=opik&utm_medium=github&utm_content=deepseek_link&utm_campaign=opik) | |
| DSPy | Log traces for all DSPy runs | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/dspy?utm_source=opik&utm_medium=github&utm_content=dspy_link&utm_campaign=opik) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/dspy.ipynb) |
| Gemini | Log traces for all Gemini LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/gemini?utm_source=opik&utm_medium=github&utm_content=gemini_link&utm_campaign=opik) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/gemini.ipynb) |
| Groq | Log traces for all Groq LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/groq?utm_source=opik&utm_medium=github&utm_content=groq_link&utm_campaign=opik) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/groq.ipynb) |
141 changes: 141 additions & 0 deletions apps/opik-documentation/documentation/docs/tracing/integrations/deepseek.mdx
@@ -0,0 +1,141 @@
---
sidebar_label: DeepSeek
description: Describes how to track DeepSeek LLM calls using Opik
pytest_codeblocks_skip: false
---

import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";

# DeepSeek

DeepSeek is an open-source LLM that rivals OpenAI's o1. You can learn more about DeepSeek on [GitHub](https://github.com/deepseek-ai/DeepSeek-R1) or
on [deepseek.com](https://www.deepseek.com/).

In this guide, we will show how to track DeepSeek calls using Opik. Since DeepSeek is open-source, there are many ways to run and call the model. We will focus on integrating Opik with the following hosting options:

1. DeepSeek API
2. Fireworks AI API
3. Together AI API

## Getting started

### Configuring your hosting provider

Before you can start tracking DeepSeek calls, you will need an API key from your hosting provider.

<Tabs>
<TabItem value="DeepSeek API" title="DeepSeek API">

To use the DeepSeek API, you will need an API key. You can register for an account on [DeepSeek.com](https://chat.deepseek.com/sign_up).
Once you have signed up, you can create an API key.

</TabItem>
<TabItem value="Fireworks AI API" title="Fireworks AI API">

You can log into Fireworks AI at [fireworks.ai](https://fireworks.ai/) and then access your API key on the [API keys](https://fireworks.ai/account/api-keys) page.

</TabItem>
<TabItem value="Together AI API" title="Together AI API">

You can log into Together AI at [together.ai](https://together.ai/) and then access your API key on the [API keys](https://api.together.ai/settings/api-keys) page.

</TabItem>
</Tabs>
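
Whichever provider you choose, it is good practice to keep the API key in an environment variable rather than hard-coding it in your scripts. A minimal sketch, assuming an environment variable name of our own choosing (`DEEPSEEK_API_KEY` here is just an example):

```python
import os

from openai import OpenAI

# Read the key from an environment variable exported beforehand, e.g.
# `export DEEPSEEK_API_KEY="..."` in your shell. The variable name is only
# an example; use whatever convention your project follows.
api_key = os.environ["DEEPSEEK_API_KEY"]

# The same pattern works for Fireworks AI or Together AI keys; only the
# variable name and the `base_url` used in the examples below change.
client = OpenAI(api_key=api_key, base_url="https://api.deepseek.com")
```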

### Configuring Opik

```bash
pip install --upgrade --quiet opik

opik configure
```

:::tip
Opik is fully open-source and can be run locally or through the Opik Cloud platform. You can learn more about hosting Opik on your own infrastructure in the [self-hosting guide](/docs/self-host/overview.md).
:::
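
If you prefer to configure the SDK from Python instead of the CLI, you can call `opik.configure()` directly. A minimal sketch; the `use_local` flag shown here follows the SDK's configuration options, but check `help(opik.configure)` for the exact arguments available in your installed version:

```python
import opik

# Prompts for the Opik Cloud API key and workspace when they are not already
# configured; set use_local=True to point the SDK at a self-hosted deployment.
opik.configure(use_local=False)
```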

## Tracking DeepSeek calls

The easiest way to call DeepSeek with Opik is to use the OpenAI Python SDK together with Opik's `track_openai` wrapper. This approach is compatible with the DeepSeek API, the Fireworks AI API, and the Together AI API:

<Tabs>
<TabItem value="DeepSeek API" title="DeepSeek API">

```python
from opik.integrations.openai import track_openai
from openai import OpenAI

# Create the OpenAI client that points to the DeepSeek API
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

# Wrap your OpenAI client to track all calls to Opik
client = track_openai(client)

# Call the API
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
```

</TabItem>
<TabItem value="Fireworks AI API" title="Fireworks AI API">

```python
from opik.integrations.openai import track_openai
from openai import OpenAI

# Create the OpenAI client that points to the Fireworks AI API
client = OpenAI(api_key="<Fireworks AI API Key>", base_url="https://api.fireworks.ai/inference/v1")

# Wrap your OpenAI client to track all calls to Opik
client = track_openai(client)

# Call the API
response = client.chat.completions.create(
    model="accounts/fireworks/models/deepseek-v3",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
```

</TabItem>
<TabItem value="Together AI API" title="Together AI API">

```python
from opik.integrations.openai import track_openai
from openai import OpenAI

# Create the OpenAI client that points to the Together AI API
client = OpenAI(api_key="<Together AI API Key>", base_url="https://api.together.xyz/v1")

# Wrap your OpenAI client to track all calls to Opik
client = track_openai(client)

# Call the API
response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
```

</TabItem>
</Tabs>
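
If your application makes several DeepSeek calls inside a larger workflow, you can also wrap the surrounding function with Opik's `@track` decorator so that the individual LLM calls are grouped under a single trace. A minimal sketch reusing the DeepSeek API client from the first tab; the helper function and prompts are purely illustrative:

```python
from openai import OpenAI
from opik import track
from opik.integrations.openai import track_openai

# Wrapped client, as in the examples above (Fireworks AI and Together AI
# work the same way with their respective base URLs and model names).
client = track_openai(OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com"))

@track
def summarize_and_translate(text: str) -> str:
    # Both calls below are logged as spans of the same trace because this
    # function is decorated with @track.
    summary = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": f"Summarize this text: {text}"}],
    ).choices[0].message.content

    translation = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": f"Translate this to French: {summary}"}],
    ).choices[0].message.content

    return translation

print(summarize_and_translate("Opik helps you trace, evaluate and monitor LLM applications."))
```
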
@@ -17,6 +17,7 @@ Opik aims to make it as easy as possible to log, view and evaluate your LLM traces
| Anthropic | Log traces for all Anthropic LLM calls | [Documentation](/tracing/integrations/anthropic.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/anthropic.ipynb) |
| Bedrock | Log traces for all AWS Bedrock LLM calls | [Documentation](/tracing/integrations/bedrock.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/bedrock.ipynb) |
| CrewAI | Log traces for all CrewAI LLM calls | [Documentation](/tracing/integrations/crewai.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/crewai.ipynb) |
| DeepSeek | Log traces for all LLM calls made with DeepSeek | [Documentation](/tracing/integrations/deepseek.mdx) | |
| Dify | Log traces and LLM calls for your Dify Apps | [Documentation](/tracing/integrations/dify.mdx) | |
| DSPy | Log traces for all DSPy runs | [Documentation](/tracing/integrations/dspy.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/dspy.ipynb) |
| Guardrails | Log traces for all Guardrails validations | [Documentation](/tracing/integrations/guardrails-ai.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/guardrails-ai.ipynb) |
1 change: 1 addition & 0 deletions apps/opik-documentation/documentation/sidebars.ts
@@ -57,6 +57,7 @@ const sidebars: SidebarsConfig = {
"tracing/integrations/anthropic",
"tracing/integrations/bedrock",
"tracing/integrations/crewai",
"tracing/integrations/deepseek",
"tracing/integrations/dify",
"tracing/integrations/dspy",
"tracing/integrations/gemini",
