Issues: wenda-LLM/wenda
When llm_type is llama, must the model weights file be model/stable-vicuna-13B.ggml.q4_2.bin? Can the weights be llama2? And does the strategy support fp16?
#534
opened Apr 30, 2024 by
15229684931
Running on a Mac with an Intel chip fails with RuntimeError: "addmm_impl_cpu_" not implemented for 'Half'
#518
opened Jan 6, 2024 by
devon-ye
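The error in #518 typically appears when a model cast to float16 is run on the CPU, where older PyTorch builds have no Half kernel for addmm (the op behind nn.Linear). A minimal sketch of the usual workaround, casting back to float32 when no GPU is available (this is a generic PyTorch pattern, not wenda's own code):

```python
import torch
import torch.nn as nn

# fp16 weights are fine on GPU, but CPU inference on older PyTorch
# raises: RuntimeError: "addmm_impl_cpu_" not implemented for 'Half'
model = nn.Linear(3, 4).half()
if not torch.cuda.is_available():
    # Workaround: fall back to fp32 on CPU
    model = model.float()

x = torch.randn(2, 3)
y = model(x)
print(tuple(y.shape))  # (2, 4)
```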
Building the knowledge base with python plugins/gen_data_st.py fails with ModuleNotFoundError: No module named 'exceptions'
#517
opened Dec 24, 2023 by
hopeforus
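The ModuleNotFoundError in #517 is characteristic of Python 2-era code: the built-in `exceptions` module was removed in Python 3, so any library that still does `import exceptions` (the obsolete `docx` package is a common culprit; the maintained one is `python-docx`) fails this way. A small sketch showing how to check importability without crashing, assuming the cause is an outdated dependency:

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if `name` can be imported in this interpreter."""
    return importlib.util.find_spec(name) is not None

# "exceptions" existed only in Python 2, so this is False on Python 3:
print(has_module("exceptions"))  # False
print(has_module("sys"))         # True
```

If this check returns False, the likely fix is replacing the Python 2-only dependency (e.g. uninstalling `docx` and installing `python-docx`).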
The one-click package downloaded from Quark fails on startup after model loading finishes: OSError: [WinError 193] %1 is not a valid Win32 application
#509
opened Nov 26, 2023 by
QTNiCheng
Large quality gap between baichuan2-13B-chat deployed from the official source code and baichuan2-13B-chat deployed with wenda
#506
opened Nov 1, 2023 by
ExpressGit