sentence-transformers 2.2.2 pulling in nvidia packages #2637
Hello! Yes, these are requirements of torch. Note that if you don't have a GPU, then you may want to install the CPU-only build of torch instead.
Thank you for your reply!
If you're using the CPU only, then you won't need those CUDA packages. You can install it with:
(assuming that you're on Linux).
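The install command elided above is presumably along these lines (a sketch for Linux; `https://download.pytorch.org/whl/cpu` is PyTorch's documented CPU wheel index):

```shell
# Install a CPU-only torch build first, from PyTorch's CPU wheel index,
# then install sentence-transformers on top of it.
pip install torch --index-url https://download.pytorch.org/whl/cpu
pip install sentence-transformers
```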
So at the moment I have been running two pip commands; the first was installing a load of dependencies from a requirements.txt.
Maybe the order of installing sentence-transformers from the first requirements.txt and then installing torch was pulling in the 2.3.0 (with NVIDIA) version of torch as well?
If I do
So I'm not sure why/how we are getting the nvidia packages in our scans?
That is rather odd. Perhaps you can
I also think that if I were pulling those CUDA files, the Docker image would be a lot larger (it's only about 2.5 GB total; I think with the CUDA files it would be 8 GB+).
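Beyond image size, you can ask torch itself whether its build bundles CUDA (a sketch, not from the thread; `torch.version.cuda` is `None` on the CPU-only wheels):

```python
def torch_cuda_status():
    """Describe whether the installed torch build bundles CUDA support."""
    try:
        import torch
    except ImportError:
        return "torch is not installed"
    # torch.version.cuda is None on CPU-only wheels, e.g. those
    # installed from https://download.pytorch.org/whl/cpu
    return f"torch {torch.__version__}, bundled CUDA: {torch.version.cuda}"

print(torch_cuda_status())
```

Running this inside the container gives a direct answer without inspecting the filesystem.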
For extra info I also installed
I think that looks fine, then! In fact, if you increase from
Do you happen to know if there's a check I can make to know for certain whether those nvidia*.whl files got installed? I had a look in /usr/bin and /usr/lib/python3.9/site-packages and didn't find anything.
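One way to answer this definitively (a sketch, not from the thread; requires Python 3.8+) is to ask `importlib.metadata` for every installed distribution whose name starts with "nvidia":

```python
from importlib.metadata import distributions

# Collect the names of every installed distribution starting with "nvidia"
nvidia_dists = sorted(
    dist.metadata["Name"]
    for dist in distributions()
    if (dist.metadata["Name"] or "").lower().startswith("nvidia")
)
print(nvidia_dists)
```

An empty list means none of the nvidia-*.whl files were actually installed into that environment; `pip list | grep -i nvidia` inside the container gives the same answer.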
Searching for
I had a look and it found a load of related files from torch, torchgen, and transformers... most of them look like source files. I think these are just source-code files from those packages though, not the NVIDIA proprietary software.
I'm also facing the same issue, where the nvidia* packages are not getting downloaded and are not being used in our product (our application runs on Windows, where the inventory report shows the wheel packages).
Are you using an OSS scanning tool such as Mend? Our issue was that Mend, under the covers, was doing a pip download and ignoring the fact that we were using --no-deps when installing the package, so the full pip download was pulling in dependencies that we were not actually getting.
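If that is what is happening, the scanner's view can be reproduced by hand (a hypothetical reconstruction; Mend's exact invocation isn't documented in this thread):

```shell
# A full resolve, as a scanner that ignores --no-deps would do, pulls
# torch's CUDA-enabled wheel on Linux along with the nvidia-* wheels:
pip download sentence-transformers==2.2.2 -d /tmp/st-scan
ls /tmp/st-scan | grep -i nvidia
```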
Yes, we are using Mend, integrated with our GitHub repo, and the Mend inventory shows these nvidia* packages.
I'm a bit confused. So the packages are not being downloaded, but would you like to download them or not? In short, to use Sentence Transformers, you will have to use torch.
If you only want to run Sentence Transformers on CPU, then you don't need to install torch with CUDA, e.g.:
The latter should not install NVIDIA's CUDA packages, I believe.
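The contrast being drawn is presumably between the default and CPU-index installs (a sketch; check pytorch.org/get-started for the exact command for your platform):

```shell
# Default resolve: on Linux this brings the CUDA-enabled torch wheel
# plus the nvidia-* packages.
pip install sentence-transformers

# CPU-only: satisfy the torch dependency from PyTorch's CPU wheel
# index first, so sentence-transformers pulls no nvidia-* wheels.
pip install torch --index-url https://download.pytorch.org/whl/cpu
pip install sentence-transformers
```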
We do not use the nvidia* packages; our application doesn't need them. The problem is with the Mend inventory, as it shows the nvidia* packages.
If you are using it like our use case: we were installing the CPU version, which doesn't get the GPU-related stuff, but Mend looks at the packages installed, seems to just ignore the options (CPU-specific, no dependencies, etc.), downloads everything, and then concludes "ah, Sentence Transformers requires NVIDIA packages", which would be right if we weren't using the CPU-specific variant. It's an issue with Mend.io rather than with this library, though; it's how they do their checking that causes the NVIDIA packages to be detected when they aren't actually present. We are using it in a Docker image, and you can tell we don't get them: we looked through the system and can't find them, and the image is small. If we were pulling them, the image would be hundreds of MB larger than it is.
I ran into this issue and what worked for me was to install the torch cpu version outside of and before installing the requirements file. Example with Docker:
Then I would install sentence-transformers from my requirements file.
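The Docker example described above would look roughly like this (a sketch; the base image, file names, and versions are placeholders):

```dockerfile
FROM python:3.9-slim

# Install the CPU-only torch build BEFORE the requirements file, so pip
# does not resolve the default CUDA-enabled torch as a dependency.
RUN pip install torch --index-url https://download.pytorch.org/whl/cpu

# requirements.txt contains sentence-transformers; torch is already
# satisfied, so no nvidia-* wheels are pulled in.
COPY requirements.txt .
RUN pip install -r requirements.txt
```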
Below are the versions that are working fine for me
What would be the apt way to control the deps when one uses Poetry? There we cannot quite say "download torch first from the CPU index and then download sentence-transformers". Wouldn't it be cleaner if we had something like "sentence-transformers+cpu"? I'm on a Darwin M1 chip, and despite using torch+cpu on the Poetry side, sentence-transformers is causing it to bring in everything. I don't think it's a Poetry-only issue, as when the same lock file is executed on Jenkins (a Linux/Ubuntu environment) it doesn't bring in CUDA, but when done locally on the M1 it does. PS: I'm trying to double-verify things on my side. Thanks
@AdityaSoni19031997
@AleefBilal is exactly right. At least on Linux devices, installing |
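For the Poetry case specifically, one approach (a sketch of Poetry's explicit package sources, available since Poetry 1.5; version numbers are illustrative) is to pin torch to PyTorch's CPU wheel index in pyproject.toml:

```toml
[tool.poetry.dependencies]
python = "^3.9"
torch = { version = "^2.3.0", source = "pytorch-cpu" }
sentence-transformers = "^2.2.2"

[[tool.poetry.source]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
priority = "explicit"
```

Note that torch's nvidia-* dependencies carry Linux-only environment markers, I believe, so they can appear in a cross-platform lock file even when they would never be installed on macOS; that may explain part of the Darwin-vs-Jenkins discrepancy.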
I am using sentence-transformers-2.2.2.tar.gz, and it pulls in the following nvidia packages:
nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl
nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl
nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl
nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl
nvidia_cudnn_cu12-8.9.2.26-py3-none-manylinux1_x86_64.whl
nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl
nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl
nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl
nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl
nvidia_nvjitlink_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl
nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl
When I search them online, it shows they are under license: NVIDIA Proprietary Software.
Can I freely use sentence-transformers-2.2.2.tar.gz?
Thanks!