I am trying to run a test script to ensure Ollama is being used, and I keep getting a Connection Refused error.
My test script is pretty basic:
from crewai import Agent, Task, Crew, LLM

# Create a test agent
test_agent = Agent(
    name="TestAgent",
    role="Tester",
    goal="Test Ollama integration",
    backstory="A test agent to verify Ollama is working correctly.",
    llm=LLM(
        model="ollama/qwen2.5-coder:7b",
        timeout=5000
    ),
)

# Create a test task
test_task = Task(
    description="Verify Ollama integration by generating a simple response.",
    agent=test_agent,
    expected_output="A confirmation that Ollama is working correctly."
)

# Create a test crew
test_crew = Crew(
    agents=[test_agent],
    tasks=[test_task],
    verbose=True
)

# Run the test
result = test_crew.kickoff()
print("Test result:", result)

I have set my environment variables to:
This is a portion of the error I am getting. Note that I can go to http://localhost:11434 and Ollama is running:
File "/mnt/d/development/CrewAI_Projects/Site-Builder/phataiproject/venv/lib/python3.12/site-
packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 1794, in exception_type
raise ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='localhost',
port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection
object at 0x7efeb2240260>: Failed to establish a new connection: [Errno 111] Connection refused'))
Note: I originally had base_url declared in LLM(), but one post I read while researching this said it isn't needed, so I removed it.
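For reference, this is roughly what the LLM configuration looked like before I removed base_url (the address shown is just Ollama's default, which is where my server responds):

llm=LLM(
    model="ollama/qwen2.5-coder:7b",
    base_url="http://localhost:11434",  # default Ollama address; this is roughly what I had before removing it
    timeout=5000
),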
I have also tried different models just to ensure that wasn't the issue.
I am not sure what else to check. The Ollama server is running, and I am currently using it for other projects.
I updated crewAI and crewAI-tools to make sure I had the latest versions.
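As one more data point, here is a minimal sanity check I can run from the same virtual environment to confirm the endpoint is reachable from Python (plain requests, nothing CrewAI-specific; it assumes the default port 11434):

import requests

# Query the Ollama API directly; /api/tags lists the locally installed
# models, so a 200 response here means the server is reachable from this
# interpreter.
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
print(resp.status_code)
print([m["name"] for m in resp.json()["models"]])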