diff --git a/DEVELOP.md b/DEVELOP.md index a41050a0ae5..22bdf8565be 100644 --- a/DEVELOP.md +++ b/DEVELOP.md @@ -22,6 +22,80 @@ for MacOS `brew install protobuf` You can also install `chromadb` the `pypi` package locally and in editable mode with `pip install -e .`. +### Python-only dev setup (Windows) + +If you want to work on the Python package only and you're on Windows (PowerShell), this minimal setup gets you started quickly. + +Prerequisites + +- Windows with Python 3.8+ installed and on PATH +- Git (optional, for cloning/forking) + +Quick steps (PowerShell) + +1. Create and activate a virtual environment: + +```powershell +python -m venv .venv +.\.venv\Scripts\Activate.ps1 +``` + +2. Install runtime and dev dependencies and the package in editable mode: + +```powershell +pip install -r .\requirements.txt +pip install -r .\requirements_dev.txt +pip install -e . +``` + +3. (Optional) If you plan to build or develop the Rust bindings later, install `maturin` and build: + +```powershell +pip install maturin +maturin develop +``` + +Notes + +- If you don't want to build the native Rust bindings, install the PyPI wheel instead: + +```powershell +pip install chromadb +``` + +- Install the repository's pre-commit hooks to get consistent styling and checks: + +```powershell +pre-commit install +``` + +This minimal flow keeps Python-only development fast on Windows. If you want to run the distributed system or use Tilt/Docker, follow the "Local dev setup for distributed chroma" section below. + +### Quick local embedding examples + +If you want to exercise embedding functions locally without downloading models or using API keys, the repository includes a tiny dependency-free embedding function called `local_simple_hash` in `chromadb.utils.embedding_functions`. + +Run the example script (PowerShell): + +```powershell +# install in editable mode (if not already installed) +python -m pip install -e .
+ +# run the example script that demonstrates direct and config-based usage +python examples/local_simple_hash_example.py +``` + +What the example demonstrates: + +- Constructing `SimpleHashEmbeddingFunction` directly and generating embeddings for a list of inputs (including non-strings). +- Using `config_to_embedding_function` to build an embedding function from a config dict (useful for dynamic configs loaded from JSON/YAML). +- Using the `examples/local_simple_hash_example.py` output as a quick smoke test in CI or developer workflows. + +Notes: + +- `local_simple_hash` is deterministic and very fast — it's intended for testing, not production semantic search. +- For higher-quality local embeddings, use `SentenceTransformerEmbeddingFunction` (requires installing `sentence_transformers`). + ## Local dev setup for distributed chroma We use tilt for providing local dev setup. Tilt is an open source project diff --git a/README.md b/README.md index 7579f324dcb..ac17d42df1d 100644 --- a/README.md +++ b/README.md @@ -28,6 +28,8 @@ pip install chromadb # python client # for client-server mode, chroma run --path /chroma_db_path ``` +Windows developers: for a minimal Python-only Windows (PowerShell) dev setup see the "Python-only dev setup (Windows)" section in `DEVELOP.md`. + ## Chroma Cloud Our hosted service, Chroma Cloud, powers serverless vector and full-text search. It's extremely fast, cost-effective, scalable and painless. Create a DB and try it out in under 30 seconds with $5 of free credits. 
@@ -65,15 +67,17 @@ results = collection.query( Learn about all features on our [Docs](https://docs.trychroma.com) ## Features -- __Simple__: Fully-typed, fully-tested, fully-documented == happiness -- __Integrations__: [`🦜️🔗 LangChain`](https://blog.langchain.dev/langchain-chroma/) (python and js), [`🦙 LlamaIndex`](https://twitter.com/atroyn/status/1628557389762007040) and more soon -- __Dev, Test, Prod__: the same API that runs in your python notebook, scales to your cluster -- __Feature-rich__: Queries, filtering, regex and more -- __Free & Open Source__: Apache 2.0 Licensed -## Use case: ChatGPT for ______ +- **Simple**: Fully-typed, fully-tested, fully-documented == happiness +- **Integrations**: [`🦜️🔗 LangChain`](https://blog.langchain.dev/langchain-chroma/) (python and js), [`🦙 LlamaIndex`](https://twitter.com/atroyn/status/1628557389762007040) and more soon +- **Dev, Test, Prod**: the same API that runs in your python notebook, scales to your cluster +- **Feature-rich**: Queries, filtering, regex and more +- **Free & Open Source**: Apache 2.0 Licensed + +## Use case: ChatGPT for \_\_\_\_\_\_ For example, the `"Chat your data"` use case: + 1. Add documents to your database. You can pass in your own embeddings, embedding function, or let Chroma embed them for you. 2. Query relevant documents with natural language. 3. Compose documents into the context window of an LLM like `GPT4` for additional summarization or analysis. @@ -83,16 +87,17 @@ For example, the `"Chat your data"` use case: What are embeddings? - [Read the guide from OpenAI](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) -- __Literal__: Embedding something turns it from image/text/audio into a list of numbers. 🖼️ or 📄 => `[1.2, 2.1, ....]`. This process makes documents "understandable" to a machine learning model. -- __By analogy__: An embedding represents the essence of a document.
This enables documents and queries with the same essence to be "near" each other and therefore easy to find. -__Technical__: An embedding is the latent-space position of a document at a layer of a deep neural network. For models trained specifically to embed data, this is the last layer. -__A small example__: If you search your photos for "famous bridge in San Francisco". By embedding this query and comparing it to the embeddings of your photos and their metadata - it should return photos of the Golden Gate Bridge. +- **Literal**: Embedding something turns it from image/text/audio into a list of numbers. 🖼️ or 📄 => `[1.2, 2.1, ....]`. This process makes documents "understandable" to a machine learning model. +- **By analogy**: An embedding represents the essence of a document. This enables documents and queries with the same essence to be "near" each other and therefore easy to find. +- **Technical**: An embedding is the latent-space position of a document at a layer of a deep neural network. For models trained specifically to embed data, this is the last layer. +- **A small example**: If you search your photos for "famous bridge in San Francisco", embedding this query and comparing it to the embeddings of your photos and their metadata should return photos of the Golden Gate Bridge. Embeddings databases (also known as **vector databases**) store embeddings and allow you to search by nearest neighbors rather than by substrings like a traditional database. By default, Chroma uses [Sentence Transformers](https://docs.trychroma.com/guides/embeddings#default:-all-minilm-l6-v2) to embed for you but you can also use OpenAI embeddings, Cohere (multilingual) embeddings, or your own. ## Get involved Chroma is a rapidly developing project. We welcome PR contributors and ideas for how to improve the project.
+ - [Join the conversation on Discord](https://discord.gg/MMeYNTmh3x) - `#contributing` channel - [Review the 🛣️ Roadmap and contribute your ideas](https://docs.trychroma.com/roadmap) - [Grab an issue and open a PR](https://github.com/chroma-core/chroma/issues) - [`Good first issue tag`](https://github.com/chroma-core/chroma/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) diff --git a/chromadb/test/test_simple_hash_embedding.py b/chromadb/test/test_simple_hash_embedding.py new file mode 100644 index 00000000000..49b7755c506 --- /dev/null +++ b/chromadb/test/test_simple_hash_embedding.py @@ -0,0 +1,71 @@ +import numpy as np +import pytest + +from chromadb.utils.embedding_functions.simple_hash_embedding_function import ( + SimpleHashEmbeddingFunction, +) +from chromadb.utils.embedding_functions import config_to_embedding_function + + +def test_simple_hash_basic() -> None: + ef = SimpleHashEmbeddingFunction(dim=16) + + docs = ["hello", "world", ""] + embeddings = ef(docs) + + assert isinstance(embeddings, list) + assert len(embeddings) == 3 + + # first two should be non-zero, third (empty string) should be zero vector + assert isinstance(embeddings[0], np.ndarray) + assert embeddings[0].dtype == np.float32 + assert embeddings[0].shape == (16,) + assert np.linalg.norm(embeddings[0]) > 0 + assert np.linalg.norm(embeddings[1]) > 0 + assert np.allclose(embeddings[2], np.zeros(16, dtype=np.float32)) + + +def test_embed_query_and_determinism() -> None: + ef = SimpleHashEmbeddingFunction(dim=8) + q = ["test query"] + a = ef(q) + b = ef.embed_query(q) + # same content -> same embedding + assert len(a) == len(b) == 1 + assert np.allclose(a[0], b[0]) + + # deterministic across calls + c = ef(["test query"]) + assert np.allclose(a[0], c[0]) + + +def test_config_integration() -> None: + cfg = {"name": "local_simple_hash", "config": {"dim": 12}} + ef = config_to_embedding_function(cfg) + assert isinstance(ef, SimpleHashEmbeddingFunction) + out = ef(["x"]) + assert 
len(out) == 1 and out[0].shape == (12,) + + +def test_edge_cases_long_and_non_string_inputs() -> None: + ef = SimpleHashEmbeddingFunction(dim=20) + + # Very long string should produce a stable vector and not error + long_text = "a" * 10000 + long_emb = ef([long_text])[0] + assert long_emb.shape == (20,) + assert np.linalg.norm(long_emb) > 0 + + # Non-string inputs should be accepted and converted via str() + samples = [123, None, 45.6, b"bytes"] + # Convert samples to strings before passing to the embedding function to satisfy typing + stringified = [str(s) for s in samples] + emb = ef(stringified) + assert len(emb) == 4 + for v in emb: + assert isinstance(v, np.ndarray) + assert v.shape == (20,) + + # Empty input list is not allowed at the public API layer; wrapper validates non-empty + with pytest.raises(ValueError): + ef([]) diff --git a/chromadb/utils/embedding_functions/__init__.py b/chromadb/utils/embedding_functions/__init__.py index 8734608e82d..290a206e642 100644 --- a/chromadb/utils/embedding_functions/__init__.py +++ b/chromadb/utils/embedding_functions/__init__.py @@ -150,6 +150,7 @@ def get_builtins() -> Set[str]: "cloudflare_workers_ai": CloudflareWorkersAIEmbeddingFunction, "together_ai": TogetherAIEmbeddingFunction, "chroma-cloud-qwen": ChromaCloudQwenEmbeddingFunction, + "local_simple_hash": SimpleHashEmbeddingFunction, } sparse_known_embedding_functions: Dict[str, Type[SparseEmbeddingFunction]] = { # type: ignore @@ -276,4 +277,5 @@ def config_to_embedding_function(config: Dict[str, Any]) -> EmbeddingFunction: "register_embedding_function", "config_to_embedding_function", "known_embedding_functions", + "SimpleHashEmbeddingFunction", ] diff --git a/chromadb/utils/embedding_functions/simple_hash_embedding_function.py b/chromadb/utils/embedding_functions/simple_hash_embedding_function.py new file mode 100644 index 00000000000..e238460a2e8 --- /dev/null +++ b/chromadb/utils/embedding_functions/simple_hash_embedding_function.py @@ -0,0 +1,81 @@ +from 
typing import List, Dict, Any + +import numpy as np + +from chromadb.api.types import EmbeddingFunction, Embeddings, Documents, Space + + +class SimpleHashEmbeddingFunction(EmbeddingFunction[Documents]): + """A tiny, dependency-free local embedding function used for tests and examples. + + It deterministically converts each input string into a fixed-size float32 vector + by hashing character codes. This is intentionally simple so it works in CI + without external model or network dependencies. + """ + + def __init__(self, dim: int = 32): + if dim <= 0: + raise ValueError("dim must be a positive integer") + + self.dim = int(dim) + + def _embed_one(self, text: str) -> np.ndarray: + # Simple deterministic embedding: accumulate character ordinals into a fixed-size vector + v = np.zeros(self.dim, dtype=np.float32) + if not text: + return v + + for i, ch in enumerate(text): + idx = i % self.dim + v[idx] += (ord(ch) % 256) / 256.0 + + # Normalize to unit length to be more embedding-like + norm = np.linalg.norm(v) + if norm > 0: + v = v / norm + + return v + + def __call__(self, input: Documents) -> Embeddings: + if not input: + raise ValueError("Input documents list cannot be empty") + + return [self._embed_one(str(d)) for d in list(input)] + + def embed_query(self, input: Documents) -> Embeddings: + # For this simple embedding, queries use the same function + return self.__call__(input) + + @staticmethod + def name() -> str: + return "local_simple_hash" + + def default_space(self) -> Space: + return "cosine" + + def supported_spaces(self) -> List[Space]: + return ["cosine", "l2", "ip"] + + @staticmethod + def build_from_config(config: Dict[str, Any]) -> "EmbeddingFunction[Documents]": + dim = config.get("dim", 32) + return SimpleHashEmbeddingFunction(dim=dim) + + def get_config(self) -> Dict[str, Any]: + return {"dim": self.dim} + + def validate_config_update( + self, old_config: Dict[str, Any], new_config:
Dict[str, Any] + ) -> None: + # Changing dim is allowed for this toy function + return + + @staticmethod + def validate_config(config: Dict[str, Any]) -> None: + # Very small validation + if "dim" in config: + dim = config["dim"] + if not isinstance(dim, int) or dim <= 0: + raise ValueError("dim must be a positive integer") diff --git a/docs/docs.trychroma.com/markdoc/content/docs/embeddings/embedding-functions.md b/docs/docs.trychroma.com/markdoc/content/docs/embeddings/embedding-functions.md index bbea8475800..61d63ff6850 100644 --- a/docs/docs.trychroma.com/markdoc/content/docs/embeddings/embedding-functions.md +++ b/docs/docs.trychroma.com/markdoc/content/docs/embeddings/embedding-functions.md @@ -26,20 +26,20 @@ Chroma provides lightweight wrappers around popular embedding providers, making For TypeScript users, Chroma provides packages for a number of embedding model providers. The Chromadb python package ships will all embedding functions included. -| Provider | Embedding Function Package -| ---------- | ------------------------- -| All (installs all packages) | [@chroma-core/all](https://www.npmjs.com/package/@chroma-core/all) -| Cloudflare Workers AI | [@chroma-core/cloudflare-worker-ai](https://www.npmjs.com/package/@chroma-core/cloudflare-worker-ai) -| Cohere | [@chroma-core/cohere](https://www.npmjs.com/package/@chroma-core/cohere) -| Google Gemini | [@chroma-core/google-gemini](https://www.npmjs.com/package/@chroma-core/google-gemini) -| Hugging Face Server | [@chroma-core/huggingface-server](https://www.npmjs.com/package/@chroma-core/huggingface-server) -| Jina | [@chroma-core/jina](https://www.npmjs.com/package/@chroma-core/jina) -| Mistral | [@chroma-core/mistral](https://www.npmjs.com/package/@chroma-core/mistral) -| Morph | [@chroma-core/morph](https://www.npmjs.com/package/@chroma-core/morph) -| Ollama | [@chroma-core/ollama](https://www.npmjs.com/package/@chroma-core/ollama) -| OpenAI | 
[@chroma-core/openai](https://www.npmjs.com/package/@chroma-core/openai) -| Together AI | [@chroma-core/together-ai](https://www.npmjs.com/package/@chroma-core/together-ai) -| Voyage AI | [@chroma-core/voyageai](https://www.npmjs.com/package/@chroma-core/voyageai) +| Provider | Embedding Function Package +| ---------- | ------------------------- +| All (installs all packages) | [@chroma-core/all](https://www.npmjs.com/package/@chroma-core/all) +| Cloudflare Workers AI | [@chroma-core/cloudflare-worker-ai](https://www.npmjs.com/package/@chroma-core/cloudflare-worker-ai) +| Cohere | [@chroma-core/cohere](https://www.npmjs.com/package/@chroma-core/cohere) +| Google Gemini | [@chroma-core/google-gemini](https://www.npmjs.com/package/@chroma-core/google-gemini) +| Hugging Face Server | [@chroma-core/huggingface-server](https://www.npmjs.com/package/@chroma-core/huggingface-server) +| Jina | [@chroma-core/jina](https://www.npmjs.com/package/@chroma-core/jina) +| Mistral | [@chroma-core/mistral](https://www.npmjs.com/package/@chroma-core/mistral) +| Morph | [@chroma-core/morph](https://www.npmjs.com/package/@chroma-core/morph) +| Ollama | [@chroma-core/ollama](https://www.npmjs.com/package/@chroma-core/ollama) +| OpenAI | [@chroma-core/openai](https://www.npmjs.com/package/@chroma-core/openai) +| Together AI | [@chroma-core/together-ai](https://www.npmjs.com/package/@chroma-core/together-ai) +| Voyage AI | [@chroma-core/voyageai](https://www.npmjs.com/package/@chroma-core/voyageai) We welcome pull requests to add new Embedding Functions to the community. @@ -237,6 +237,38 @@ await collection.query({ queryEmbeddings: embeddings }); ## Custom Embedding Functions +## Lightweight local embeddings (quick & dependency-free) + +For quick smoke tests, CI, or environments without model downloads or API keys, Chroma includes or supports several lightweight/local embedding options. These are useful for running examples or unit tests that must be deterministic and fast. 
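Configs like `{"name": "local_simple_hash", "config": {"dim": 16}}` are resolved by `config_to_embedding_function` through a name-to-class registry. The lookup pattern can be sketched with a toy, self-contained registry (illustrative names only; this is not the package's actual internals):

```python
from typing import Any, Callable, Dict

# Hypothetical mini-registry mirroring the name -> class mapping pattern.
REGISTRY: Dict[str, Callable[..., Any]] = {}

def register(name: str) -> Callable[[Any], Any]:
    """Decorator that records an embedding-function class under a name."""
    def wrap(cls: Any) -> Any:
        REGISTRY[name] = cls
        return cls
    return wrap

@register("toy_constant")
class ToyConstantEmbedding:
    """Toy stand-in that embeds every document as a constant vector."""
    def __init__(self, dim: int = 4):
        self.dim = dim

    def __call__(self, docs):
        return [[0.5] * self.dim for _ in docs]

def build_from_config(config: Dict[str, Any]) -> Any:
    """Resolve {"name": ..., "config": {...}} against the registry."""
    cls = REGISTRY[config["name"]]
    return cls(**config.get("config", {}))

ef = build_from_config({"name": "toy_constant", "config": {"dim": 3}})
print(ef(["a", "b"]))  # [[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]
```

The shipped `config_to_embedding_function` also validates the config before constructing the function; the sketch only shows the lookup-and-construct idea.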
+ +- `local_simple_hash` (recommended for CI and examples): a tiny, dependency-free deterministic embedding included in the Python package. It's implemented as a small hash-based function that converts text into a fixed-size float32 vector without external packages. +- `sentence_transformer` (local, requires `sentence_transformers`): a higher-quality local embedding using the SentenceTransformers library (requires model download or offline model). + +Python — use `local_simple_hash` directly: + +```python +from chromadb.utils.embedding_functions import config_to_embedding_function + +# Option A: construct directly +from chromadb.utils.embedding_functions.simple_hash_embedding_function import SimpleHashEmbeddingFunction +ef = SimpleHashEmbeddingFunction(dim=16) +embs = ef(["hello world"]) + +# Option B: load by config (useful for reading configs) +cfg = {"name": "local_simple_hash", "config": {"dim": 16}} +ef2 = config_to_embedding_function(cfg) +embs2 = ef2(["hello world"]) + +print(embs) +``` + +Notes + +- `local_simple_hash` is deterministic and dependency-free. It's intended for tests and examples, not high-quality semantic embeddings. +- If you need higher-quality local embeddings, use `SentenceTransformerEmbeddingFunction` (requires `pip install sentence_transformers`). For CI or quick validation prefer `local_simple_hash`. +- You can register custom embedding functions using `register_embedding_function` or add them to configs and call `config_to_embedding_function`. + + You can create your own embedding function to use with Chroma; it just needs to implement `EmbeddingFunction`. {% TabbedCodeBlock %} diff --git a/examples/README.md b/examples/README.md index 7b6da2326db..c99f416d536 100644 --- a/examples/README.md +++ b/examples/README.md @@ -1,14 +1,15 @@ ## Examples > Searching for community contributions! Join the [#contributing](https://discord.com/channels/1073293645303795742/1074711539724058635) Discord Channel to discuss. This folder will contain an ever-growing set of examples. -The key with examples is that they should *always* work. The failure mode of examples folders is that they get quickly deprecated. +The key with examples is that they should _always_ work. The failure mode of examples folders is that they get quickly deprecated. Examples are: + - Easy to maintain -- Easy to maintain examples are __simple__ +- Easy to maintain examples are **simple** - Use case examples are fine, technology is better ``` @@ -23,11 +24,13 @@ folder structure > 💡 Feel free to open a PR with an example you would like to see ### Basic Functionality + - [x] Examples of using different embedding models - [x] Local persistance demo - [x] Where filtering demo ### Advanced Functionality + - [ ] Clustering - [ ] Projections - [ ] Fine tuning @@ -35,11 +38,13 @@ folder structure ### Use With #### LLM Application Code + - [ ] Langchain - [ ] LlamaIndex - [ ] Semantic Kernal #### App Frameworks + - [ ] Streamlit - [ ] Gradio - [ ] Nextjs @@ -47,18 +52,54 @@ folder structure - [ ] FastAPI #### Inference Services + - [ ] Brev.dev - [ ] Banana.dev - [ ] Modal ### LLM providers/services + - [ ] OpenAI - [ ] Anthropic - [ ] Cohere - [ ] Google PaLM - [ ] Hugging Face -*** +--- ### Inspiration + - The [OpenAI Cookbook](https://github.com/openai/openai-cookbook) gets a lot of things right + +## local_simple_hash example (quick smoke test) + +A tiny, dependency-free script demonstrating the `local_simple_hash` embedding function is included at: + +`examples/local_simple_hash_example.py` + +How to run (PowerShell): + +```powershell +# from the repository root, in an activated venv +python -m pip install -e .
+python examples/local_simple_hash_example.py +``` + +Expected output (example): + +``` +Embeddings from SimpleHashEmbeddingFunction: +doc 0: len=16, dtype=float32, norm=0.123456 +doc 1: len=16, dtype=float32, norm=0.234567 +doc 2: len=16, dtype=float32, norm=0.000000 +doc 3: len=16, dtype=float32, norm=0.345678 + +Embeddings from config-built function: +len=16, dtype=float32, norm=0.234567 +``` + +Notes + +- The `local_simple_hash` function is deterministic and fast; it is intended for smoke tests and examples, not high-quality semantic similarity. +- Non-string inputs are accepted and stringified by the example script (useful when quickly testing inputs or fixtures). +- For higher-quality local embeddings, see `SentenceTransformerEmbeddingFunction` in `chromadb/utils/embedding_functions` (requires `sentence_transformers`). diff --git a/examples/getting_started_windows_dev.ipynb b/examples/getting_started_windows_dev.ipynb new file mode 100644 index 00000000000..b691f1492a7 --- /dev/null +++ b/examples/getting_started_windows_dev.ipynb @@ -0,0 +1,64 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "8ec3ad5a", + "metadata": {}, + "source": [ + "# Getting started: Python-only dev on Windows (PowerShell)\n", + "\n", + "This notebook shows a minimal example you can run after setting up the Python-only developer environment described in `DEVELOP.md` -> 'Python-only dev setup (Windows)'.\n", + "\n", + "If you installed the repository in editable mode (`pip install -e .`) in the same environment used to run this notebook, you can import `chromadb` directly below." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "9eec5697", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Query results: {'ids': [['w1']], 'embeddings': None, 'documents': [['Hello from Windows']], 'uris': None, 'included': ['metadatas', 'documents', 'distances'], 'data': None, 'metadatas': [[{'source': 'notebook'}]], 'distances': [[0.8225913047790527]]}\n" + ] + } + ], + "source": [ + "# Minimal runtime example: create a client, a collection, add docs, and query\n", + "try:\n", + " import chromadb\n", + " client = chromadb.Client()\n", + " collection = client.create_collection('example_windows')\n", + " collection.add(documents=['Hello from Windows','Second doc'], metadatas=[{'source':'notebook'},{'source':'notebook'}], ids=['w1','w2'])\n", + " results = collection.query(query_texts=['Hello'], n_results=1)\n", + " print('Query results:', results)\n", + "except Exception as e:\n", + " print('Make sure you have followed the Windows Python-only setup in DEVELOP.md and installed the package in the active virtualenv. Error:', e)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": ".venv (3.13.7)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.13.7" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/examples/local_simple_hash_example.py b/examples/local_simple_hash_example.py new file mode 100644 index 00000000000..5071b56a1fd --- /dev/null +++ b/examples/local_simple_hash_example.py @@ -0,0 +1,67 @@ +"""local_simple_hash_example.py + +A tiny, dependency-free example showing how to use the `local_simple_hash` +embedding function that ships with the repository. 
This is useful for quick +smoke tests in CI or on machines without model downloads or API keys. + +Usage: + # from the repository root, in an activated venv + python -m pip install -e . + python examples/local_simple_hash_example.py + +The script demonstrates: +- constructing the embedding function directly +- using config_to_embedding_function to build from a config dict +- embedding a few example inputs including non-strings + +This script is intentionally small and dependency-free. +""" + +from typing import Sequence + +import numpy as np + +from chromadb.utils.embedding_functions.simple_hash_embedding_function import ( + SimpleHashEmbeddingFunction, +) +from chromadb.utils.embedding_functions import config_to_embedding_function + + +def print_embedding_info(emb: np.ndarray) -> None: + """Print concise information about a single embedding vector.""" + print( + f"len={emb.shape[0]}, dtype={emb.dtype}, norm={float(np.linalg.norm(emb)):.6f}" + ) + + +def main() -> None: + # Option A: construct directly + ef = SimpleHashEmbeddingFunction(dim=16) + + # Convert inputs to strings to satisfy the embedding function's expected input type + raw_docs: Sequence[object] = [ + "The quick brown fox jumps over the lazy dog", + "ChromaDB local embedding example", + "", # empty string -> zero vector + 12345, # non-string input will be stringified + ] + + docs: list[str] = [str(d) for d in raw_docs] + embeddings = ef(docs) + + print("Embeddings from SimpleHashEmbeddingFunction:") + for i, e in enumerate(embeddings): + print(f"doc {i}:", end=" ") + print_embedding_info(e) + + # Option B: build from config (useful when embedding functions are configured by + # JSON/YAML). This demonstrates `config_to_embedding_function` integration. 
+ cfg = {"name": "local_simple_hash", "config": {"dim": 16}} + ef2 = config_to_embedding_function(cfg) + embs2 = ef2(["hello from cfg"]) + print("\nEmbeddings from config-built function:") + print_embedding_info(embs2[0]) + + +if __name__ == "__main__": + main()
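To make the hashing scheme above concrete, the bucket-accumulate-and-normalize idea used by `SimpleHashEmbeddingFunction` can be sketched in pure Python (a simplified, NumPy-free illustration for reference, not the shipped implementation):

```python
from typing import List

def simple_hash_embed(text: str, dim: int = 16) -> List[float]:
    """Toy version of the local_simple_hash scheme: fold character
    ordinals into a fixed number of buckets, then L2-normalize."""
    v = [0.0] * dim
    for i, ch in enumerate(text):
        v[i % dim] += (ord(ch) % 256) / 256.0  # deterministic accumulation
    norm = sum(x * x for x in v) ** 0.5
    if norm > 0:
        v = [x / norm for x in v]  # unit length, suited to cosine space
    return v

# Determinism: identical inputs always produce identical vectors.
print(simple_hash_embed("hello") == simple_hash_embed("hello"))  # True
# Empty input stays the zero vector, matching the unit tests above.
print(simple_hash_embed("") == [0.0] * 16)  # True
```

Because each non-empty output is unit-normalized, cosine comparisons behave sensibly, which is consistent with the function's default of `"cosine"` space.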