Describe the issue

Issue:
I followed the commands in the "Install Package" section:

Command:

However, there's an error at this stage:

Log:
Building wheels for collected packages: llava
Building editable for llava (pyproject.toml) ... done
Created wheel for llava: filename=llava-1.2.2.post1-0.editable-py3-none-any.whl size=17884 sha256=16641805b47f4eeb7a204b4da44a24d0bd1a1743fc45e91e9dd83ba1659910af
Stored in directory: /tmp/pip-ephem-wheel-cache-o_z3astm/wheels/86/af/85/d3b6e65da4a7eb2238de6d23fd5ef72ce07477a9c6f0a08d16
Successfully built llava
Installing collected packages: sentencepiece, pytz, pydub, mpmath, websockets, urllib3, tzdata, typing-extensions, tomlkit, threadpoolctl, sympy, svgwrite, sniffio, six, shortuuid, shellingham, semantic-version, ruff, rpds-py, pyyaml, python-multipart, pyparsing, pygments, psutil, pillow, packaging, orjson, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, numpy, networkx, narwhals, mdurl, markupsafe, markdown2, latex2mathml, kiwisolver, joblib, importlib-resources, idna, h11, fonttools, filelock, ffmpy, exceptiongroup, einops, cycler, click, charset-normalizer, certifi, attrs, annotated-types, aiofiles, wavedrom, uvicorn, triton, scipy, requests, referencing, python-dateutil, pydantic-core, nvidia-cusparse-cu12, nvidia-cudnn-cu12, markdown-it-py, jinja2, einops-exts, contourpy, anyio, starlette, scikit-learn, rich, pydantic, pandas, nvidia-cusolver-cu12, matplotlib, jsonschema-specifications, httpcore, typer, torch, tokenizers, jsonschema, httpx, fastapi, transformers, torchvision, gradio_client, bitsandbytes, altair, accelerate, timm, peft, gradio, llava
Attempting uninstall: tokenizers
Found existing installation: tokenizers 0.20.0
error: uninstall-no-record-file
× Cannot uninstall tokenizers 0.20.0
╰─> The package's contents are unknown: no RECORD file was found for tokenizers.
hint: You might be able to recover from this via: pip install --force-reinstall --no-deps tokenizers==0.20.0
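
For reference, the usual way out of a missing-RECORD uninstall error like this is to delete the orphaned package files by hand and then reinstall the pinned version. A rough sketch, assuming a standard site-packages layout (the exact paths will differ per environment, so check the Location field first):

$ pip show tokenizers                      # the "Location" field shows where the files live
$ rm -rf <location>/tokenizers <location>/tokenizers-*.dist-info   # remove the orphaned files
$ pip install --force-reinstall --no-deps tokenizers==0.20.0       # then reinstall cleanly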
I'm sure the code is up to date, and when I import llava or tokenizers, I get a segmentation fault:
$ python
Python 3.10.16 (main, Dec 11 2024, 16:24:50) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import llava
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'llava'
>>> import tokenizers
Segmentation fault (core dumped)
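
For anyone hitting the same crash: a segfault on import can usually be localized by re-running the import under faulthandler, or under gdb for the native stack. This is a general debugging step, not something from the original report:

$ python -X faulthandler -c "import tokenizers"            # prints the Python frame where the crash occurs
$ gdb -ex run -ex bt --args python -c "import tokenizers"  # native backtrace, if gdb is installed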
I tried reinstalling, and also the hinted "pip install --force-reinstall --no-deps tokenizers==0.20.0", but none of them worked.
Finally, I changed the Python version to 3.11, and then it installed correctly. The command is:
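
The exact command wasn't captured above; a minimal sketch, assuming a conda environment (the env name is illustrative), would be:

$ conda create -n llava python=3.11 -y    # fresh env on Python 3.11
$ conda activate llava
$ pip install --upgrade pip               # enable PEP 660 editable installs
$ pip install -e .                        # editable install of llava from the repo root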