Issue on running ollama locally with mem0 and neo4j. #2223

Open
quzhixue-Kimi opened this issue Feb 19, 2025 · 0 comments
🐛 Describe the bug

Hi there,

Below are the steps I followed to run the Python code:

  1. Starting the Neo4j Docker container:

docker run -itd -p 7474:7474 -p 7687:7687 \
  -v /root/neo4j/data:/data -v /root/neo4j/plugins:/plugins \
  --name neo4j-apoc \
  -e NEO4J_apoc_export_file_enabled=true \
  -e NEO4J_apoc_import_file_enabled=true \
  -e NEO4J_apoc_import_file_use__neo4j__config=true \
  -e NEO4J_dbms_security_procedures_unrestricted='apoc.*' \
  -e NEO4JLABS_PLUGINS='["apoc"]' \
  neo4j:latest

[screenshot: Neo4j container running]
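To rule out a plain connection problem, here is a minimal sanity check that the container accepts Bolt connections (a sketch using the official neo4j Python driver, with the same credentials as the mem0 config below):

from neo4j import GraphDatabase

# Quick check that the container accepts Bolt connections on port 7687.
# Credentials match the mem0 config further down.
driver = GraphDatabase.driver("neo4j://localhost:7687",
                              auth=("neo4j", "Whatisup2024"))
driver.verify_connectivity()  # raises an exception if unreachable or auth fails
driver.close()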

  2. Using Miniforge3 to create a conda environment and install the Python dependencies:

[screenshot: conda environment and installed dependencies]
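A quick way to confirm what the environment actually has installed (a sketch; mem0 is distributed on PyPI as mem0ai):

from importlib.metadata import version

# The mem0 library is distributed on PyPI as "mem0ai".
print(version("mem0ai"))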

  3. Using Ollama to pull the embedding and chat models; both pulls completed successfully.

[screenshot: Ollama models pulled successfully]
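The pulled models can be verified through Ollama's local model listing (a minimal sketch against the standard /api/tags endpoint):

import requests

# List the models Ollama has pulled locally; llama3.2:1b and
# nomic-embed-text:latest should both appear.
resp = requests.get("http://localhost:11434/api/tags")
print([model["name"] for model in resp.json()["models"]])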

  4. Below is the Python code:

import os
from mem0 import Memory

# Dummy value: all providers below are local (Ollama), but the variable
# is set anyway.
os.environ["OPENAI_API_KEY"] = "hello world"

config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j://localhost:7687",
            "username": "neo4j",
            "password": "Whatisup2024",
        },
        # LLM used for graph entity/relationship extraction
        "llm": {
            "provider": "ollama",
            "config": {
                "model": "llama3.2:1b",
                "temperature": 0.2,
                "max_tokens": 4096,
                "ollama_base_url": "http://localhost:11434",
            },
        },
    },
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.2:1b",
            "temperature": 0.2,
            "max_tokens": 1024,
            "ollama_base_url": "http://localhost:11434",
        },
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "ollama_base_url": "http://localhost:11434",
            "model": "nomic-embed-text:latest",
        },
    },
    "version": "v1.1",
}

m = Memory.from_config(config)
m.add("I'm visiting Paris", user_id="john")

This produced the error message below:

[screenshot: error traceback]

Below is my test code with Qdrant as the vector_store:

import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "hello world"

config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "collection_name": "mem0",
            "host": "localhost",
            "embedding_model_dims": 768,
            "port": 6333,
        },
    },
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.2:1b",
            "temperature": 0.2,
            "max_tokens": 1024,
            "ollama_base_url": "http://localhost:11434",
        },
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "ollama_base_url": "http://localhost:11434",
            "embedding_dims": 768,
            "model": "nomic-embed-text:latest",
        },
    },
    # "version": "v1.1"
}

print(config)
m = Memory.from_config(config)
m.add("I'm visiting Paris", user_id="john")

This also fails, with the error below:

[screenshot: error traceback]
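In case it helps with triage, here is a minimal sketch to confirm the Qdrant container is reachable over its REST API (assuming the default REST port 6333 from the config above):

import requests

# Listing collections is a cheap reachability check for Qdrant's REST API
# before pointing mem0 at it.
resp = requests.get("http://localhost:6333/collections")
print(resp.status_code, resp.json())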

Any feedback is appreciated.

BR
Kimi
