Error description
An AI assistant explained the error as the value not matching the MODEL_METADATA dictionary, but there is no character-level anomaly in the model name. (A minimal sketch of the failing lookup follows the environment details below.)
Steps to reproduce
Standard deployment: create a virtual environment, fetch the repository with git, and configure it to use the chatglm2-6b model.
Error log
```
Opening ChuanhuChatGPT...
venv "F:\APPLICATION\amaconda\envs\ChuanhuChat\Python.exe"
2024-05-06 23:31:39,573 [INFO] [_client.py:1026] HTTP Request: GET https://api.gradio.app/gradio-messaging/en "HTTP/1.1 200 OK"
2024-05-06 23:31:40,834 [INFO] [config.py:327] 默认模型设置为了:chatglm2-6b
2024-05-06 23:31:42,142 [INFO] [utils.py:148] Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2024-05-06 23:31:42,143 [INFO] [utils.py:161] NumExpr defaulting to 8 threads.
Traceback (most recent call last):
  File "F:\CodeAPP\MapGBT\ChuanhuChatGPT\ChuanhuChatbot.py", line 204, in <module>
    value=i18n(MODEL_METADATA[MODELS[DEFAULT_MODEL]]["description"]),
KeyError: 'chatglm2-6b'
```
Environment
- OS: Windows 11 23H2
- Browser: Edge
- Gradio version: 4.26.0
- Python version: 3.10.14
Additional notes
CUDA version 3.11
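To make the failure mode concrete, here is a minimal, self-contained sketch of the lookup that raises the KeyError at ChuanhuChatbot.py line 204. The MODELS, MODEL_METADATA, and DEFAULT_MODEL values below are hypothetical stand-ins, not the project's actual definitions; only the shape of the failing expression is taken from the traceback.

```python
# Hypothetical stand-in data -- NOT the project's real MODELS / MODEL_METADATA.
MODEL_METADATA = {
    "GPT3.5 Turbo": {"description": "hypothetical built-in entry"},
    # note: no "chatglm2-6b" entry here
}
MODELS = ["GPT3.5 Turbo", "chatglm2-6b"]  # the locally configured model name is in the list...
DEFAULT_MODEL = 1                         # ...and is selected as the default

model_name = MODELS[DEFAULT_MODEL]        # -> "chatglm2-6b"
try:
    # Same expression shape as the traceback:
    # value=i18n(MODEL_METADATA[MODELS[DEFAULT_MODEL]]["description"])
    description = MODEL_METADATA[model_name]["description"]
except KeyError as err:
    print(f"KeyError: {err}")             # reproduces: KeyError: 'chatglm2-6b'

# A defensive lookup that avoids the crash (a possible local workaround, not an official fix):
description = MODEL_METADATA.get(model_name, {}).get("description", "")
```

In other words, the configured default model name resolves correctly through MODELS, but MODEL_METADATA appears to have no entry under that exact key, so the dictionary access itself fails even though the string looks fine.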
Same problem here. Did you manage to solve it?
No, I switched to ollama instead.
I'm hitting the same problem as well.