Special token not generated when running playground.py during demo #50
Comments
I have tried to debug this myself, and I found that the generated text is not understandable and the special token is None.
This is my /home/Intern1/Yun_SRP/MiniGPT-5/minigpt4/configs/models/minigpt4.yaml:

model:
  # vit encoder
  image_size: 224
  # Q-Former
  num_query_token: 32
  # Vicuna
  llama_model: "/home/Intern1/Yun_SRP/MiniGPT-5/VicunaV07B"
  # generation configs
  prompt: ""

preprocess:
Have you set the environment variable IS_STAGE2=True when running the demo? It seems the model weights are not correctly imported.
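For example, a minimal launcher sketch (hypothetical, not code from the repository; it simply puts IS_STAGE2=True into the demo process's environment and re-runs the same command used below):

import os
import subprocess

# Hypothetical wrapper: export the stage-2 flag and launch the demo,
# so playground.py inherits IS_STAGE2=True in its environment.
env = dict(os.environ, IS_STAGE2="True")
subprocess.run(
    [
        "python3", "playground.py",
        "--stage1_weight", "../WEIGHT_FOLDER/stage1_cc3m.ckpt",
        "--test_weight", "../WEIGHT_FOLDER/stage2_vist.ckpt",
    ],
    env=env,
    check=True,
)

Setting the variable directly in the shell session before invoking python3 playground.py achieves the same thing.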
When I try to run the demo, it says the following:
python3 playground.py --stage1_weight ../WEIGHT_FOLDER/stage1_cc3m.ckpt --test_weight ../WEIGHT_FOLDER/stage2_vist.ckpt
/home/Intern1/anaconda3/envs/minigpt5/lib/python3.9/site-packages/diffusers/utils/outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
torch.utils._pytree._register_pytree_node(
Seed set to 42
/home/Intern1/anaconda3/envs/minigpt5/lib/python3.9/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
warnings.warn(
Loading VIT
Loading VIT Done
Loading Q-Former
Loading Q-Former Done
Loading LLAMA
You are using the legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This means that tokens that come after special tokens will not be properly handled. We recommend you to read the related pull request available at huggingface/transformers#24565
Loading checkpoint shards: 100%|█| 2/2 [01:19<00:00, 39.9
Loading LLAMA Done
Load BLIP2-LLM Checkpoint: ../config/prerained_minigpt4_7b.pth
/home/Intern1/anaconda3/envs/minigpt5/lib/python3.9/site-packages/torch/nn/modules/transformer.py:306: UserWarning: enable_nested_tensor is True, but self.use_nested_tensor is False because encoder_layer.norm_first was True
warnings.warn(f"enable_nested_tensor is True, but self.use_nested_tensor is False because {why_not_sparsity_fast_path}")
Loading pipeline components...: 0%| | 0/6 [00:00<?, ?it/home/Intern1/anaconda3/envs/minigpt5/lib/python3.9/site-packages/diffusers/utils/outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
torch.utils._pytree._register_pytree_node(
Loading pipeline components...: 100%|█| 6/6 [00:00<00:00,
Traceback (most recent call last):
File "/home/Intern1/Yun_SRP/MiniGPT-5/examples/playground.py", line 76, in
ax.imshow(image_out)
File "/home/Intern1/anaconda3/envs/minigpt5/lib/python3.9/site-packages/matplotlib/init.py", line 1442, in inner
return func(ax, *map(sanitize_sequence, args), **kwargs)
File "/home/Intern1/anaconda3/envs/minigpt5/lib/python3.9/site-packages/matplotlib/axes/_axes.py", line 5665, in imshow
im.set_data(X)
File "/home/Intern1/anaconda3/envs/minigpt5/lib/python3.9/site-packages/matplotlib/image.py", line 701, in set_data
raise TypeError("Image data of dtype {} cannot be converted to "
TypeError: Image data of dtype object cannot be converted to float
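For context on why this particular TypeError appears: if the model never emits the image special token, image_out presumably comes back as None, and matplotlib cannot convert None (an object-dtype array) to float, which is exactly the failure in set_data above. A small self-contained sketch that reproduces and guards against that situation (the variable image_out and the guard are hypothetical, not code from playground.py):

import matplotlib.pyplot as plt

fig, ax = plt.subplots()
image_out = None  # hypothetical: what the demo may return when no image token is generated

if image_out is None:
    # Skipping imshow avoids "Image data of dtype object cannot be converted to float".
    print("No image generated: the model did not emit the image special token.")
else:
    ax.imshow(image_out)

So the imshow crash is likely a downstream symptom of the missing special token rather than a matplotlib problem.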