doc:ollama document (#1512)
Aries-ckt authored May 11, 2024
1 parent d313155 commit f389a0c
Showing 4 changed files with 47 additions and 2 deletions.
2 changes: 1 addition & 1 deletion dbgpt/storage/vector_store/chroma_store.py
@@ -19,6 +19,7 @@

CHROMA_COLLECTION_NAME = "langchain"


@register_resource(
_("Chroma Vector Store"),
"chroma_vector_store",
@@ -152,7 +153,6 @@ def vector_name_exists(self) -> bool:
files = list(filter(lambda f: f != "chroma.sqlite3", files))
return len(files) > 0


def load_document(self, chunks: List[Chunk]) -> List[str]:
"""Load document to vector store."""
logger.info("ChromaStore load document")
2 changes: 1 addition & 1 deletion dbgpt/storage/vector_store/pgvector_store.py
@@ -66,7 +66,7 @@ def __init__(self, vector_store_config: PGVectorConfig) -> None:
embedding_function=self.embeddings,
collection_name=self.collection_name,
connection_string=self.connection_string,
) # mypy: ignore
) # mypy: ignore

def similar_search(
self, text: str, topk: int, filters: Optional[MetadataFilters] = None
41 changes: 41 additions & 0 deletions docs/docs/installation/advanced_usage/ollama.md
@@ -0,0 +1,41 @@
# ollama
ollama is a model serving platform that lets you deploy and run models locally in a few seconds. DB-GPT can use it as a proxy for both LLM and embedding models.

### Install ollama
On Linux:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
For other platforms, see the [official ollama website](https://ollama.com/).
### Pull models.
1. Pull LLM
```bash
ollama pull qwen:0.5b
```
2. Pull the embedding model.
```bash
ollama pull nomic-embed-text
```

3. Install the ollama Python package.
```bash
pip install ollama
```
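Once `ollama serve` is running, the pulled models can be exercised over ollama's REST API (`/api/chat` for the LLM, `/api/embeddings` for the embedding model, both on the default port `11434`). A minimal sketch of the request bodies — the payload values are illustrative, not DB-GPT internals:

```python
import json

# Request bodies for ollama's REST API, matching the models pulled above.
# POST chat_request to http://127.0.0.1:11434/api/chat and
# embed_request to http://127.0.0.1:11434/api/embeddings.
chat_request = {
    "model": "qwen:0.5b",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,  # request a single JSON response instead of a stream
}
embed_request = {
    "model": "nomic-embed-text",
    "prompt": "Hello",
}

print(json.dumps(chat_request))
```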

### Use the ollama proxy model in the DB-GPT `.env` file

```bash
LLM_MODEL=ollama_proxyllm
PROXY_SERVER_URL=http://127.0.0.1:11434
PROXYLLM_BACKEND="qwen:0.5b"
PROXY_API_KEY=not_used
EMBEDDING_MODEL=proxy_ollama
proxy_ollama_proxy_server_url=http://127.0.0.1:11434
proxy_ollama_proxy_backend="nomic-embed-text:latest"
```
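For illustration only (this is not DB-GPT's actual config loader), the fragment above can be read as plain `KEY=VALUE` pairs: the first four keys route LLM requests through the `ollama_proxyllm` proxy, and the last three point the `proxy_ollama` embedding proxy at the same local ollama endpoint. A minimal parsing sketch:

```python
# Illustrative only: parse the .env fragment above to show which settings
# direct the LLM and embedding proxies at the local ollama server.
env_text = """\
LLM_MODEL=ollama_proxyllm
PROXY_SERVER_URL=http://127.0.0.1:11434
PROXYLLM_BACKEND="qwen:0.5b"
PROXY_API_KEY=not_used
EMBEDDING_MODEL=proxy_ollama
proxy_ollama_proxy_server_url=http://127.0.0.1:11434
proxy_ollama_proxy_backend="nomic-embed-text:latest"
"""

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines, splitting on the first '=' only."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        settings[key] = value.strip('"')  # drop optional surrounding quotes
    return settings

settings = parse_env(env_text)
print(settings["PROXYLLM_BACKEND"])  # → qwen:0.5b
```

Note that both proxies share the same server URL, so a single `ollama serve` process handles chat and embedding traffic.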

### Run the DB-GPT server
```bash
python dbgpt/app/dbgpt_server.py
```
4 changes: 4 additions & 0 deletions docs/sidebars.js
Original file line number Diff line number Diff line change
@@ -237,6 +237,10 @@ const sidebars = {
type: 'doc',
id: 'installation/advanced_usage/More_proxyllms',
},
{
type: 'doc',
id: 'installation/advanced_usage/ollama',
},
{
type: 'doc',
id: 'installation/advanced_usage/vLLM_inference',
