
[FIX] Deployment fails whether I use Ollama or OpenAI; all attempts show uvicorn.error: connection closed #782

Open
leoleelxh opened this issue May 30, 2024 · 10 comments
Labels
fix Fix something that isn't working as expected

Comments

@leoleelxh

The deployment was not successful. Regardless of whether I used Ollama or OpenAI, the logs show uvicorn.error: connection closed.

server-1 | [15:31:04.245190] DEBUG uvicorn.error: = connection is CLOSING (protocol.py:1227)
server-1 | [15:31:04.245975] DEBUG uvicorn.error: > CLOSE 1000 (OK) [2 bytes] (protocol.py:1178)
server-1 | [15:31:04.246598] DEBUG uvicorn.error: = connection is CLOSED (protocol.py:1497)
server-1 | [15:31:04.247182] DEBUG uvicorn.error: ! failing connection with code 1006 (protocol.py:1412)
server-1 | [15:31:04.247735] ERROR uvicorn.error: closing handshake failed (server.py:248)
server-1 | Traceback (most recent call last):
server-1 |   /usr/local/lib/python3.10/dist-packages/websockets/legacy/server.py:244 in handler
server-1 |   /usr/local/lib/python3.10/dist-packages/websockets/legacy/protocol.py:770 in close
server-1 |   /usr/local/lib/python3.10/dist-packages/websockets/legacy/protocol.py:1236 in write_close_frame
server-1 |   /usr/local/lib/python3.10/dist-packages/websockets/legacy/protocol.py:1209 in write_frame
server-1 |     await self.drain()
server-1 |   /usr/local/lib/python3.10/dist-packages/websockets/legacy/protocol.py:1198 in drain
server-1 |     await self.ensure_open()
server-1 |   /usr/local/lib/python3.10/dist-packages/websockets/legacy/protocol.py:939 in ensure_open
server-1 |     raise self.connection_closed_exc()
server-1 | ConnectionClosedError: sent 1000 (OK); no close frame received
server-1 | [15:31:04.365601] INFO uvicorn.error: connection closed (server.py:264)
server-1 | [15:31:04.366237] DEBUG uvicorn.error: x half-closing TCP connection (protocol.py:1319)
database-1 | 2024-05-30 15:32:28.902 UTC [27] LOG: checkpoint starting: time
database-1 | 2024-05-30 15:32:30.232 UTC [27] LOG: checkpoint complete: wrote 16 buffers (0.1%); 0 WAL file(s) added, 0 removed, 0 recycled; write=1.316 s, sync=0.006 s, total=1.331 s; sync files=15, longest=0.003 s, average=0.001 s; distance=18 kB, estimate=18 kB
server-1 | [15:32:43.029176] INFO khoj.configure: 📡 Uploading telemetry to https://khoj.beta.haletic.com/v1/telemetry... (configure.py:345)
server-1 | [15:32:43.030050] DEBUG khoj.configure: Telemetry state: (configure.py:346)
server-1 | [{'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:28:15', 'client_host': '192.168.65.1', 'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36', 'referer': 'http://localhost:42110/', 'host': 'localhost:42110', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'chat_options'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:28:15', 'client_host': '192.168.65.1', 'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36', 'referer': 'http://localhost:42110/', 'host': 'localhost:42110', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'chat_sessions'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:28:15', 'client_host': '192.168.65.1', 'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36', 'referer': 'http://localhost:42110/', 'host': 'localhost:42110', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'chat_history', 'client': 'web'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:30:51', 'client_host': '192.168.65.1', 'user_agent': 'unknown', 'referer': 'unknown', 'host': 'unknown', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'get_all_filenames'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:30:51', 'client_host': '192.168.65.1', 'user_agent': 'unknown', 'referer': 'unknown', 'host': 'unknown', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'get_all_filenames'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:30:51', 'client_host': '192.168.65.1', 'user_agent': 'unknown', 'referer': 'unknown', 'host': 'unknown', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'get_all_filenames'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:30:51', 'client_host': '192.168.65.1', 'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36', 'referer': 'http://localhost:42110/config', 'host': 'localhost:42110', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'update', 'client': 'web'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:30:53', 'client_host': '192.168.65.1', 'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36', 'referer': 'http://localhost:42110/chat', 'host': 'localhost:42110', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'chat_options'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:30:53', 'client_host': '192.168.65.1', 'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36', 'referer': 'http://localhost:42110/chat', 'host': 'localhost:42110', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'chat_sessions'},
server-1 |  {'telemetry_type': 'api', 'server_version': '1.12.1', 'os': 'Linux', 'timestamp': '2024-05-30 15:30:53', 'client_host': '192.168.65.1', 'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36', 'referer': 'http://localhost:42110/chat', 'host': 'localhost:42110', 'server_id': '9fe7a503-fa45-40bd-9243-a43b4ab20064', 'subscription_type': 'standard', 'is_recurring': False, 'client_id': 'default', 'api': 'chat_history', 'client': 'web'}]
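
The ConnectionClosedError above (close code 1006) only says the websocket client vanished before the close handshake completed; the root cause is usually an earlier exception in the chat handler. One way to separate a client-side drop from a server-side failure is to drive the chat websocket directly, as in the minimal sketch below. The endpoint path here is an assumption, not Khoj's documented route, so check the browser devtools Network tab for the path your version actually uses:

```python
# Minimal websocket probe; the /api/chat/ws path is a guess -- replace it
# with the route shown in your browser's devtools Network tab.
import asyncio

import websockets


async def probe(url: str = "ws://localhost:42110/api/chat/ws") -> None:
    try:
        async with websockets.connect(url) as ws:
            await ws.send("hello")  # any payload; we only care how the server reacts
            reply = await asyncio.wait_for(ws.recv(), timeout=15)
            print("server replied:", reply)
    except websockets.exceptions.ConnectionClosed as exc:
        # Mirrors the server log: one side sent a close frame, the other vanished.
        print("connection closed:", exc)


asyncio.run(probe())
```

If this probe also sees the connection drop immediately, the problem is on the server side (typically the chat backend erroring out), not in the browser.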

@leoleelxh added the fix label (Fix something that isn't working as expected) on May 30, 2024
@leoleelxh
Author

[screenshot attached]

@jppaolim

Same here... same with Ollama... nothing works :(

@userdehghani

Same here.

@debanjum
Member

debanjum commented Jun 1, 2024

That's unfortunate! The stacktrace seems incomplete/insufficient to root-cause the issue. I'd need more info to debug.

It'd be great if you could share information about how you installed Khoj, which OS and which chat model you're using.

@jppaolim

jppaolim commented Jun 1, 2024

Right. On my end it's a Docker installation: I downloaded the compose file, built it, and then downloaded the latest client (ARM version), so I'm running a Mac M1 Max on Sonoma 14.5.
It doesn't connect to any local OpenAI-compatible endpoint, nor Ollama; I didn't try the remote version.

@sabaimran
Collaborator

Was there more to the stack trace? Usually, when this error comes up, there's another error that causes it.

@leoleelxh
Author

> That's unfortunate! The stacktrace seems incomplete/insufficient to root-cause the issue. I'd need more info to debug.
>
> It'd be great if you could share information about how you installed Khoj, which OS and which chat model you're using.

My environment is Windows 11 + WSL2 + Docker.
I installed and started Khoj following the installation documentation; regardless of whether an Ollama or OpenAI model is configured, this error appears.
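
With Docker on Windows/WSL2, a common cause of exactly this symptom is that the server container cannot reach Ollama at all: localhost inside the container refers to the container itself, not the Windows host, so a host-side Ollama usually has to be addressed as host.docker.internal. A quick reachability check, assuming Ollama runs on the host at its default port 11434:

```python
# Run inside the server container (e.g. via `docker compose exec server python`).
# Checks Ollama's OpenAI-compatible /v1/models endpoint from both addresses.
import urllib.request

for base in ("http://localhost:11434", "http://host.docker.internal:11434"):
    try:
        with urllib.request.urlopen(f"{base}/v1/models", timeout=5) as resp:
            print(base, "->", resp.status)
    except OSError as exc:  # URLError/HTTPError both subclass OSError
        print(base, "->", exc)
```

If only host.docker.internal responds, the chat model's base URL in Khoj must point there rather than at localhost.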

@jppaolim

jppaolim commented Jun 3, 2024

Opened #796

@sabaimran
Collaborator

Can you make sure you're not using offline chat mode for your default (first) chat model when running in Docker? That generally won't work because of RAM constraints. Ideally, I'd need more info about your OpenAI processor conversation settings if you're using Ollama, plus any stack traces.
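
One way to validate those OpenAI processor conversation settings independently of Khoj is to call the configured endpoint directly with the openai client. The base URL and model name below are assumptions; substitute whatever you entered in the admin panel:

```python
# Sanity-check the endpoint Khoj is configured against, outside of Khoj.
from openai import OpenAI

client = OpenAI(
    base_url="http://host.docker.internal:11434/v1",  # Ollama's OpenAI-compatible API (assumed)
    api_key="not-needed",  # Ollama ignores the key, but the client requires one
)
reply = client.chat.completions.create(
    model="llama3",  # must match a model you've pulled with `ollama pull`
    messages=[{"role": "user", "content": "ping"}],
)
print(reply.choices[0].message.content)
```

If this call fails, the websocket error in the Khoj logs is downstream of a misconfigured or unreachable chat backend.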

@stark1tty

Also having this issue
