
Issue when running on a local laptop #10

Open
noorkhokhar99 opened this issue Jan 18, 2025 · 1 comment
@noorkhokhar99
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\autodistill_grounded_sam_2\__init__.py", line 1, in <module>
    from autodistill_grounded_sam_2.grounded_sam_2 import GroundedSAM2
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\autodistill_grounded_sam_2\grounded_sam_2.py", line 17, in <module>
    SamPredictor = load_SAM()
                   ^^^^^^^^^^
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\autodistill_grounded_sam_2\helpers.py", line 53, in load_SAM
    predictor = SAM2ImagePredictor(build_sam2(model_cfg, checkpoint))
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\ASUS\.cache\autodistill\segment_anything_2\segment-anything-2\sam2\build_sam.py", line 93, in build_sam2
    _load_checkpoint(model, ckpt_path)
  File "C:\Users\ASUS\.cache\autodistill\segment_anything_2\segment-anything-2\sam2\build_sam.py", line 167, in _load_checkpoint
    missing_keys, unexpected_keys = model.load_state_dict(sd)
                                    ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\dictconfig.py", line 355, in __getattr__
    self._format_and_raise(
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\base.py", line 231, in _format_and_raise
    format_and_raise(
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\_utils.py", line 899, in format_and_raise
    _raise(ex, cause)
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\_utils.py", line 797, in _raise
    raise ex.with_traceback(sys.exc_info()[2])  # set env var OC_CAUSE=1 for full trace
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\dictconfig.py", line 351, in __getattr__
    return self._get_impl(
           ^^^^^^^^^^^^^^^
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\dictconfig.py", line 442, in _get_impl
    node = self._get_child(
           ^^^^^^^^^^^^^^^^
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\basecontainer.py", line 73, in _get_child
    child = self._get_node(
            ^^^^^^^^^^^^^^^
  File "C:\Users\ASUS\anaconda3\envs\sam_env\Lib\site-packages\omegaconf\dictconfig.py", line 480, in _get_node
    raise ConfigKeyError(f"Missing key {key!s}")
omegaconf.errors.ConfigAttributeError: Missing key load_state_dict
    full_key: model.load_state_dict
    object_type=dict

@saurabheights

See Dao-AILab/flash-attention#955 (comment) for guidance on selecting which flash-attention release to install.

First, go to the flash-attention releases page. For example, for version 2.7.3, expand the full list of assets.

To find which whl package to install, you need to know your PyTorch version, CUDA version, CXX11 ABI setting, and CPython version. For example, if your PyTorch version is 2.1 with CUDA 11.x, the CXX11 ABI set to False, and CPython 3.10, use this package:

https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu11torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl

# Notice cu11 is for the CUDA version
# cp310 is for the CPython version
# The PyTorch version, CUDA version, and CXX11 ABI setting can be found via:
python -c "import torch; print(torch.__version__); print(torch.version.cuda); print(torch._C._GLIBCXX_USE_CXX11_ABI)"
