ChatGLM2 is too old; Xinference no longer supports it.
System Info / 系統信息
Cuda12.4
centos stream 9
python 3.12.10
Running Xinference with Docker? / 是否使用 Docker 运行 Xinference?
Version info / 版本信息
latest
The command used to start Xinference / 用以启动 xinference 的命令
python -m xinference-cmdline ...
Reproduction / 复现过程
1. Load the model.
2. Use the chat feature in the web UI.
3. An error reports that generation_config.json cannot be found in the local model folder. ChatGLM2-6b does not ship this file, so is the model unsupported, or is something else wrong?
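The missing-file error in step 3 can be caught before launching Xinference with a small pre-flight check on the local model folder. A minimal sketch; the folder path and the required-file list below are assumptions for illustration, not part of the Xinference API:

```python
from pathlib import Path

# Files that newer transformers-based loaders commonly expect in a local
# model folder; generation_config.json is the one reported missing here.
REQUIRED = ("config.json", "generation_config.json")

def missing_files(model_dir: str, required=REQUIRED) -> list[str]:
    """Return the names in `required` that are absent from model_dir."""
    d = Path(model_dir)
    return [name for name in required if not (d / name).is_file()]

# Hypothetical local path; ChatGLM2-6b ships config.json but no
# generation_config.json, so that name would be reported missing.
print(missing_files("/data/models/chatglm2-6b"))
```

Running this against the ChatGLM2-6b folder would flag generation_config.json up front, instead of failing inside the web UI chat call.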
Expected behavior / 期待表现
Not sure whether chatglm2-6b is supported.