ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (/root/LLaVA/llava/model/__init__.py) #1840
Comments
I also encountered this problem. Is there a missing file named LlavaLlamaForCausalLM?
Oh, I see the __init__ file. When you solve this problem, you'll run into another one.
I am facing this exact same issue. Was anyone able to resolve it successfully?
This problem is caused by an incompatibility between the flash-attn package and your PyTorch version. First, your PyTorch build should match CUDA 12.1. Then, if you just want to try the demo first, you can uninstall flash-attn with `pip uninstall flash-attn`. As long as everything else is set up correctly, this error will no longer appear.
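A minimal sketch of the likely mechanism behind the misleading error message (assuming, as the comments suggest, that `llava/model/__init__.py` wraps its flash-attn-dependent imports in a try/except): the failing import is swallowed, so `LlavaLlamaForCausalLM` is simply never defined, and the real problem only surfaces downstream as this ImportError. `fake_llava_model` below is a hypothetical stand-in module, not part of the LLaVA repo:

```python
import sys
import types

# Stand-in for llava.model: a package whose __init__ swallows a
# failing, flash-attn-dependent import.
fake_pkg = types.ModuleType("fake_llava_model")
try:
    # Simulates the import that actually fails at package load time.
    raise ImportError("flash_attn incompatible with this torch build")
except ImportError:
    pass  # swallowed: the class is silently missing from the package

sys.modules["fake_llava_model"] = fake_pkg

# Downstream code then hits the familiar error message.
try:
    from fake_llava_model import LlavaLlamaForCausalLM
except ImportError as exc:
    print(exc)
```

This is why uninstalling flash-attn (or matching it to the right PyTorch/CUDA build) makes the ImportError disappear: the swallowed import stops failing, and the class gets defined again.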
Question
I encountered an issue where I was unable to import the model while running LoRA fine-tuning again. This is the printed log:
Traceback (most recent call last):
  File "/root/LLaVA/llava/train/train_mem.py", line 1, in <module>
    from llava.train.train import train
  File "/root/LLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (/root/LLaVA/llava/model/__init__.py)
The script I am running is: sh ./scripts/v1_5/finetune_task_lora.sh