Installed the latest versions as of June 30.

Xinference:
qwen1.5-chat + bge-large-zh-v1.5

Langchain-Chatchat:
chatchat-config model --default_llm_model qwen1.5-chat

In Langchain-Chatchat:
Chatting with the LLM via the web UI works.
Chatting with the local knowledge base via the web UI works.

Using the API described in https://github.com/chatchat-space/Langchain-Chatchat/blob/dev/docs/contributing/api.md:
Pure LLM chat works.
Knowledge base chat fails.
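For reference, "pure LLM chat" here means the same endpoint called without the knowledge-base tool fields. A sketch of that request body (mirroring the failing request shown further down, minus "tool_choice" and "extra_body"):

```python
# Pure LLM request body: same endpoint as the failing knowledge-base call,
# just without the "tool_choice" / "extra_body" tool fields.
data = {
    "messages": [{"role": "user", "content": "如何提问以获得高质量答案"}],
    "model": "qwen1.5-chat",
    "stream": True,
}

# Posting it the same way streams back normally:
# import requests
# base_url = "http://127.0.0.1:7861/chat"
# response = requests.post(f"{base_url}/chat/completions", json=data, stream=True)
```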
Running:

import requests

base_url = "http://127.0.0.1:7861/chat"
data = {
    "messages": [
        {"role": "user", "content": "如何提问以获得高质量答案"},
    ],
    "model": "qwen1.5-chat",
    "tool_choice": "search_local_knowledgebase",
    "extra_body": {"tool_input": {"database": "zb", "query": "如何提问以获得高质量答案"}},
    "stream": True,
}

response = requests.post(f"{base_url}/chat/completions", json=data, stream=True)
for line in response.iter_content(None, decode_unicode=True):
    print(line)
Response:
data: {"id": "chataef2cc4b-bb97-48e6-b325-17062ea15880", "object": "chat.completion.chunk", "model": "glm4-chat", "created": 1719732663, "status": 1, "message_type": 1, "message_id": null, "is_ref": false, "choices": [{"delta": {"content": "", "tool_calls": []}, "role": "assistant"}]}
data: {"id": "chata865e33e-18e0-487e-8224-d8293028717a", "object": "chat.completion.chunk", "model": "glm4-chat", "created": 1719732663, "status": 8, "message_type": 1, "message_id": null, "is_ref": false, "choices": [{"delta": {"content": "peer closed connection without sending complete message body (incomplete chunked read)", "tool_calls": []}, "role": "assistant"}]}
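Parsing the streamed lines makes the mismatch explicit: every chunk reports model glm4-chat even though the request asked for qwen1.5-chat. A minimal sketch over trimmed copies of the two chunks above (only the fields relevant to the mismatch are kept):

```python
import json

# Trimmed copies of the two "data:" chunks shown above.
raw_lines = [
    'data: {"object": "chat.completion.chunk", "model": "glm4-chat", "status": 1}',
    'data: {"object": "chat.completion.chunk", "model": "glm4-chat", "status": 8}',
]

# Strip the SSE "data: " prefix and decode each chunk as JSON.
chunks = [json.loads(line[len("data: "):]) for line in raw_lines]
models = {c["model"] for c in chunks}
print(models)  # {'glm4-chat'} -- not the requested qwen1.5-chat
```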
Langchain-Chatchat log:
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
2024-06-30 07:42:51,744 root 2476 ERROR RemoteProtocolError: Caught exception: peer closed connection without sending complete message body (incomplete chunked read)
Xinference log:
2024-06-30 07:42:51,734 xinference.api.restful_api 1498 ERROR [address=0.0.0.0:46897, pid=1536] Model not found in the model list, uid: glm4-chat
Traceback (most recent call last):
File "/root/anaconda3/envs/xf/lib/python3.11/site-packages/xinference/api/restful_api.py", line 1459, in create_chat_completion
model = await (await self._get_supervisor_ref()).get_model(model_uid)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
......................
File "/root/anaconda3/envs/xf/lib/python3.11/site-packages/xinference/core/supervisor.py", line 968, in get_model
raise ValueError(f"Model not found in the model list, uid: {model_uid}")
^^^^^^^^^^^^^^^^^
ValueError: [address=0.0.0.0:46897, pid=1536] Model not found in the model list, uid: glm4-chat
If multiple models are running in Xinference, swapping the "model" value in the pure LLM chat API, e.g. from "model": "qwen1.5-chat" to "model": "glm4-chat", works without any problem.
Is the knowledge base chat API passing the wrong "model" parameter somewhere along the way?
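One way to narrow this down is to ask Xinference which model uids are actually registered before blaming the chat API. A sketch, assuming Xinference's RESTful API exposes GET /v1/models on its default port 9997 (the has_model_uid helper is hypothetical, and the exact response shape may vary across Xinference versions):

```python
def has_model_uid(payload: dict, uid: str) -> bool:
    """Check whether `uid` appears in a /v1/models response payload.

    Handles both an OpenAI-style {"data": [...]} shape and an older
    shape keyed directly by model uid (assumption: the shape varies
    by Xinference version).
    """
    if isinstance(payload.get("data"), list):
        return any(m.get("id") == uid for m in payload["data"])
    return uid in payload

# Hypothetical live check against a local Xinference (default port assumed):
# import requests
# payload = requests.get("http://127.0.0.1:9997/v1/models").json()
# print(has_model_uid(payload, "qwen1.5-chat"), has_model_uid(payload, "glm4-chat"))
```

If the requested uid is missing from the list, the "Model not found in the model list" error above is expected regardless of what the caller sends.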