[BUG]: How to install Colossal-AI on an NPU? The project mentions support, but no related tutorial was found #6205
Comments
We provide a Torch base image for Ascend:
@ver217 Hi ~ I tried to install coati in the NPU docker environment, but got an error that "NCCL" is not ready. How should we guide the install logic to recognize Ascend "HCCL"?
We want to run LoRA fine-tuning with the DeepSeek 671B model on 4 nodes, with 8 × 910B3 NPUs per node.
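One common pattern for this kind of issue is to select the communication backend at runtime rather than hardcoding "nccl". The sketch below is a hypothetical illustration, not Colossal-AI's actual install logic: it assumes the Ascend `torch_npu` adapter is installed in the NPU docker image, and falls back to NCCL otherwise. The function name `pick_backend` is invented for this example.

```python
def pick_backend() -> str:
    """Return the distributed backend name: HCCL on Ascend NPUs, NCCL otherwise.

    Hypothetical helper for illustration; Colossal-AI's real device-detection
    logic may differ.
    """
    try:
        # torch_npu is Ascend's PyTorch adapter; importing it registers
        # the NPU device and the "hccl" process-group backend.
        import torch_npu  # noqa: F401
        return "hccl"
    except ImportError:
        return "nccl"

backend = pick_backend()
print(backend)
# A launcher would then call, e.g.:
#   torch.distributed.init_process_group(backend=backend)
```

On a CUDA machine without `torch_npu` this prints `nccl`; inside the Ascend image it should print `hccl`.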
flash_attn is not available on NPU devices. Don't install flash_attn; instead, make a dummy package directory in your Python site-packages path, e.g.:

```shell
mkdir .conda/envs/myenv/lib/python3.10/site-packages/flash_attn
touch .conda/envs/myenv/lib/python3.10/site-packages/flash_attn/__init__.py
```
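If you don't want to hardcode the environment path, a variant of the same stub trick is to resolve site-packages from the active interpreter. This is a sketch, assuming the active environment's site-packages directory is writable (as in a conda env or docker image):

```shell
# Resolve the active environment's site-packages directory.
SITE_PKGS="$(python3 -c 'import sysconfig; print(sysconfig.get_paths()["purelib"])')"

# Create an empty flash_attn stub package so imports of flash_attn succeed
# without the real (GPU-only) library being installed.
mkdir -p "${SITE_PKGS}/flash_attn"
touch "${SITE_PKGS}/flash_attn/__init__.py"

# Sanity check: the stub should now import without error.
python3 -c 'import flash_attn'
```

Note that any code path that actually calls into flash_attn kernels will still fail; the stub only satisfies the import.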
Is there an existing issue for this bug?
The bug has not been fixed in the latest main branch
Do you feel comfortable sharing a concise (minimal) script that reproduces the error? :)
Yes, I will share a minimal reproducible script.
🐛 Describe the bug
I don't know how to install Colossal-AI on an NPU. I hope there can be a corresponding tutorial on how to use the extensions part to install Colossal-AI for NPU.
Environment
No response