Feature/integration nebius #16783

Open

wants to merge 20 commits into main from feature/integration-nebius
Changes from all commits (20 commits)
a1b4e0c  Nebius AI Studio integration added (Aktsvigun, Nov 1, 2024)
b4edbbc  nebius llm/embeddings examples added; (Aktsvigun, Nov 1, 2024)
85df70f  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 1, 2024)
9ca646b  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 2, 2024)
84fcb4b  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 4, 2024)
15d590a  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 6, 2024)
d9dea19  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 11, 2024)
56c1e32  Readme added to Nebius LLM & Embeddings; trash removed (Aktsvigun, Nov 11, 2024)
871b5c6  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 12, 2024)
1676587  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 13, 2024)
db822eb  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 14, 2024)
5ebb5bc  linting (logan-markewich, Nov 14, 2024)
af08aba  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 14, 2024)
67b3602  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 15, 2024)
e103dfb  Merge branch 'main' into feature/integration-nebius (logan-markewich, Nov 19, 2024)
5c00471  multimodal integration with nebius added; code improved (Aktsvigun, Nov 19, 2024)
44ba588  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 19, 2024)
1ca55fd  tests bug fixed (Aktsvigun, Nov 20, 2024)
6b222c5  Merge branch 'feature/integration-nebius' of https://github.com/Aktsv… (Aktsvigun, Nov 20, 2024)
250e423  Merge branch 'main' into feature/integration-nebius (Aktsvigun, Nov 20, 2024)
4 changes: 4 additions & 0 deletions docs/docs/api_reference/embeddings/nebius.md
@@ -0,0 +1,4 @@
::: llama_index.embeddings.nebius
    options:
      members:
        - NebiusEmbedding
4 changes: 4 additions & 0 deletions docs/docs/api_reference/llms/nebius.md
@@ -0,0 +1,4 @@
::: llama_index.llms.nebius
    options:
      members:
        - NebiusLLM
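
For reviewers who want to exercise the new LLM class directly, here is a minimal usage sketch (not part of the diff). It assumes `NebiusLLM` follows the standard LlamaIndex LLM interface (`complete`) and accepts `model` and `api_key` arguments; the model id and the `NEBIUS_API_KEY` environment variable are illustrative placeholders.

```python
# Hedged sketch: trying out the NebiusLLM class added in this PR.
# Assumes the standard LlamaIndex LLM interface and a constructor that accepts
# `model` and `api_key`; the model id below is only an illustrative placeholder.
import os

from llama_index.llms.nebius import NebiusLLM

llm = NebiusLLM(
    api_key=os.environ["NEBIUS_API_KEY"],  # key issued in Nebius AI Studio
    model="meta-llama/Meta-Llama-3.1-70B-Instruct",  # placeholder model id
)

print(llm.complete("In one sentence, what does an embedding model do?").text)
```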
4 changes: 4 additions & 0 deletions docs/docs/api_reference/multi_modal_llms/nebius.md
@@ -0,0 +1,4 @@
::: llama_index.multi_modal_llms.nebius
    options:
      members:
        - NebiusMultiModal
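
Similarly, a hedged sketch for the multi-modal class (not part of the diff), assuming it follows LlamaIndex's common multi-modal LLM interface of `complete(prompt, image_documents=...)` with `ImageDocument` inputs; the model id, image URL, and `NEBIUS_API_KEY` environment variable are placeholders.

```python
# Hedged sketch: calling the NebiusMultiModal class added in this PR.
# Assumes the common LlamaIndex multi-modal interface, i.e. complete() taking a
# prompt plus ImageDocument inputs; model id and image URL are placeholders.
import os

from llama_index.core.schema import ImageDocument
from llama_index.multi_modal_llms.nebius import NebiusMultiModal

mm_llm = NebiusMultiModal(
    api_key=os.environ["NEBIUS_API_KEY"],
    model="Qwen/Qwen2-VL-72B-Instruct",  # placeholder model id
)

image = ImageDocument(image_url="https://example.com/cat.png")  # placeholder image
print(mm_llm.complete(prompt="Describe this image.", image_documents=[image]).text)
```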
229 changes: 229 additions & 0 deletions docs/docs/examples/embeddings/nebius.ipynb
@@ -0,0 +1,229 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a href=\"https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/docs/examples/embeddings/nebius.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Nebius Embeddings\n",
"\n",
"This notebook demonstrates how to use [Nebius AI Studio](https://studio.nebius.ai/) Embeddings with LlamaIndex. Nebius AI Studio implements all state-of-the-art embeddings models, available for commercial use."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"First, let's install LlamaIndex and dependencies of Nebius AI Studio."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install llama-index-embeddings-nebius"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install llama-index"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Insert your Nebius AI Studio key below. You can get it by registering for free at [Nebius AI Studio](https://auth.eu.nebius.com/ui/login) and issuing the key at [API Keys section](https://studio.nebius.ai/settings/api-keys).\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"NEBIUS_API_KEY = \"\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's get embeddings using Nebius AI Studio"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from llama_index.embeddings.nebius import NebiusEmbedding\n",
"\n",
"embed_model = NebiusEmbedding(api_key=NEBIUS_API_KEY)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Basic usage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"4096\n",
"[-0.002410888671875, 0.0083770751953125, -0.00542449951171875, 0.007366180419921875, -0.022216796875]\n"
]
}
],
"source": [
"text = \"Everyone loves justice at another person's expense\"\n",
"embeddings = embed_model.get_text_embedding(text)\n",
"assert len(embeddings) == 4096\n",
"print(len(embeddings), embeddings[:5], sep=\"\\n\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Asynchronous usage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"4096\n",
"[-0.002410888671875, 0.0083770751953125, -0.00542449951171875, 0.007366180419921875, -0.022216796875]\n"
]
}
],
"source": [
"text = \"Everyone loves justice at another person's expense\"\n",
"embeddings = await embed_model.aget_text_embedding(text)\n",
"assert len(embeddings) == 4096\n",
"print(len(embeddings), embeddings[:5], sep=\"\\n\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Batched usage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[-0.0003886222839355469, 0.0004887580871582031, 0.011199951171875]\n",
"[-0.003734588623046875, 0.01143646240234375, 0.008758544921875]\n",
"[0.005901336669921875, 0.005161285400390625, 0.00142669677734375]\n",
"[-0.00946807861328125, -0.0048675537109375, 0.004817962646484375]\n"
]
}
],
"source": [
"texts = [\n",
" \"As the hours pass\",\n",
" \"I will let you know\",\n",
" \"That I need to ask\",\n",
" \"Before I'm alone\",\n",
"]\n",
"\n",
"embeddings = embed_model.get_text_embedding_batch(texts)\n",
"assert len(embeddings) == 4\n",
"assert len(embeddings[0]) == 4096\n",
"print(*[x[:3] for x in embeddings], sep=\"\\n\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Async batched usage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[-0.0003886222839355469, 0.0004887580871582031, 0.011199951171875]\n",
"[-0.003734588623046875, 0.01143646240234375, 0.008758544921875]\n",
"[0.005901336669921875, 0.005161285400390625, 0.00142669677734375]\n",
"[-0.00946807861328125, -0.0048675537109375, 0.004817962646484375]\n"
]
}
],
"source": [
"texts = [\n",
" \"As the hours pass\",\n",
" \"I will let you know\",\n",
" \"That I need to ask\",\n",
" \"Before I'm alone\",\n",
"]\n",
"\n",
"embeddings = await embed_model.aget_text_embedding_batch(texts)\n",
"assert len(embeddings) == 4\n",
"assert len(embeddings[0]) == 4096\n",
"print(*[x[:3] for x in embeddings], sep=\"\\n\")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
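
The notebook above covers standalone embedding calls only. As a follow-up for reviewers, here is a short sketch of wiring the new embedding model into a vector index, assuming `NebiusEmbedding` implements the standard LlamaIndex embedding interface; the `NEBIUS_API_KEY` environment variable and the sample documents are illustrative and not part of the diff.

```python
# Hedged sketch: using NebiusEmbedding inside a LlamaIndex vector index.
# Assumes NebiusEmbedding implements the standard LlamaIndex embedding interface.
import os

from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.embeddings.nebius import NebiusEmbedding

# Make the Nebius model the default embedding model for index construction.
Settings.embed_model = NebiusEmbedding(api_key=os.environ["NEBIUS_API_KEY"])

documents = [
    Document(text="Nebius AI Studio hosts commercial-grade embedding models."),
    Document(text="LlamaIndex builds retrieval pipelines on top of embeddings."),
]

index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=1)
print(retriever.retrieve("Who hosts the embedding models?")[0].node.get_content())
```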