Run 'inference.py' and 'model parallel group is not initialized' #86
Comments
Me too. I was able to run it before, but when I took it out and ran it again today, I hit this problem.
Have you solved it? I have the same problem.
My solution was to compare it with the official code and make the corresponding change. Here is the official link: https://github.com/facebookresearch/llama
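For anyone who can't easily diff the two codebases: the error means fairscale's model parallel group was never created before the model was built. As a point of reference (a sketch paraphrased from the official repo's example script, not pyllama's exact code), the setup looks roughly like this and has to run before the checkpoint is loaded:

```python
import os
from typing import Tuple

import torch
from fairscale.nn.model_parallel.initialize import initialize_model_parallel


def setup_model_parallel() -> Tuple[int, int]:
    # torchrun / torch.distributed.launch export these for each worker process
    local_rank = int(os.environ.get("LOCAL_RANK", -1))
    world_size = int(os.environ.get("WORLD_SIZE", -1))

    # Create the distributed process group, then the model parallel group
    # that fairscale's parallel layers expect to exist.
    torch.distributed.init_process_group("nccl")
    initialize_model_parallel(world_size)
    torch.cuda.set_device(local_rank)

    # The seed must be identical in all processes so generation stays in sync.
    torch.manual_seed(1)
    return local_rank, world_size
```

If nothing like this runs before the model is constructed in inference.py, fairscale raises exactly this "model parallel group is not initialized" error.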
OK, thanks for the explanation. How do the results from the LLaMA model look on your end? The answers from the llama-7B model I ran are really strange. Is it like that for everyone?
The 7B model's answers really are strange; you could try running a larger model.
Have you tried models larger than 7B? How are the results?
Check the environment variable PYLLAMA_META_MP: if it is not set, inference should work without model parallelism.
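In case it helps, here is a minimal sketch of that workaround. How pyllama gates on this variable internally is an assumption on my part; I'm only going by the comment above that it controls the model-parallel path:

```python
import os

# Assumption (from the comment above): pyllama enables its model-parallel
# path only when PYLLAMA_META_MP is set. Clearing it before importing or
# running inference.py should force the plain single-process path.
os.environ.pop("PYLLAMA_META_MP", None)  # no-op if the variable is absent
```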
I am using 2 NVIDIA 1080 Ti GPUs and trying to start the 7B model.