-
Hello, I was trying your https://github.com/langchain-ai/langgraph/blob/main/examples/rag/langgraph_rag_agent_llama3_local.ipynb via Google Colab. After plugging in my own LangChain API key, everything was fine until I executed `print(retrieval_grader.invoke({"question": question, "document": doc_txt}))` and got the following error:

```
ConnectionRefusedError                    Traceback (most recent call last)
28 frames
The above exception was the direct cause of the following exception:
NewConnectionError                        Traceback (most recent call last)
The above exception was the direct cause of the following exception:
MaxRetryError                             Traceback (most recent call last)
During handling of the above exception, another exception occurred:
ConnectionError                           Traceback (most recent call last)
ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fccddf274c0>: Failed to establish a new connection: [Errno 111] Connection refused'))
```

Could you help clarify how to resolve this? Thanks!
Replies: 6 comments 1 reply
-
Hello, could you share your thoughts on this issue? I would really value your insights! Yang
-
It looks like you are trying to connect to a localhost server that is misconfigured or that you haven't set up credentials for.
-
I was using https://github.com/langchain-ai/langgraph/blob/main/examples/rag/langgraph_rag_agent_llama3_local.ipynb almost as is, except for plugging in my own LangChain API key. Which credentials need to be set up here? Thanks!
-
Is your Ollama server running?
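A quick way to verify this, assuming the default Ollama port (11434): probe the server from Python using only the standard library. The helper name `ollama_reachable` is illustrative; `/api/tags` is Ollama's model-listing endpoint, so a successful response means the server is up.

```python
import urllib.request
import urllib.error


def ollama_reachable(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server responds at base_url."""
    try:
        # /api/tags lists installed models; any HTTP response means
        # the server is listening on this host/port.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3):
            return True
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: server not reachable.
        return False


if __name__ == "__main__":
    print(ollama_reachable())
```

If this prints `False`, start the server (e.g. `ollama serve`) before running the notebook; note that Colab runs in the cloud, so a server on your laptop's `localhost` is not reachable from it.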
-
I'm having the same issue, but my local Ollama server IS running. This looks like the only error I'm getting; sorry, the formatting is rough!
-
For anyone encountering this error in the future: I faced the same issue even though Ollama was running correctly. The problem was that LangGraph was running inside a container, which meant it couldn't reach Ollama without additional configuration. Here's how I resolved it:

1. Update the `ChatOllama` configuration.
2. Modify `docker-compose.yml` (`version: '3'`).
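The snippet above was cut off, but the usual fix for this situation is Docker host networking. A minimal sketch of the kind of change involved, assuming the LangGraph app runs in a compose service named `app` (the service name is a placeholder): map `host.docker.internal` to the host gateway so the container can reach Ollama running on the host, then point `ChatOllama` at that hostname instead of `localhost`.

```yaml
# docker-compose.yml (sketch; service name "app" is an assumption)
version: '3'
services:
  app:
    build: .
    extra_hosts:
      # Lets the container resolve the Docker host on Linux;
      # on Docker Desktop (macOS/Windows) this name already resolves.
      - "host.docker.internal:host-gateway"
```

And in the notebook code, pass the hostname via `base_url`, e.g. `ChatOllama(model=local_llm, base_url="http://host.docker.internal:11434")`, since inside the container `localhost` refers to the container itself, not your machine.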