
CUDA 12.6 / cuDNN 9.3.0: I can't run it even though environment variables are correctly configured #359

Open
Felix3322 opened this issue Aug 28, 2024 · 6 comments

@Felix3322

RuntimeError: C:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:866 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
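Since the error says CUDA_PATH is set but CUDA still couldn't be loaded, a first sanity check is whether the CUDA `bin` directory is actually on PATH, which is where onnxruntime looks for the CUDA/cuDNN DLLs on Windows. A minimal stdlib-only sketch (the helper name is mine, not part of onnxruntime):

```python
import os

def cuda_bin_on_path(env=None):
    """Rough check that %CUDA_PATH%\\bin is listed on PATH."""
    env = os.environ if env is None else env
    cuda_path = env.get("CUDA_PATH")
    if not cuda_path:
        return False
    target = os.path.normcase(os.path.join(cuda_path, "bin"))
    return any(os.path.normcase(p.rstrip("\\/")) == target
               for p in env.get("PATH", "").split(os.pathsep))

print("CUDA bin on PATH:", cuda_bin_on_path())
```

If this prints False, fixing PATH is worth trying before anything else; note that a True result still doesn't guarantee the installed CUDA/cuDNN versions match what onnxruntime was built against.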

@zzzweakman
Collaborator

Hi, you can refer to this document to downgrade your CUDA version. It is very detailed, and we are very grateful for @dimitribarbot's contribution.

@BeimingCharles

I have the same problem.

@dimitribarbot

dimitribarbot commented Aug 29, 2024

Hi, you can refer to this document to downgrade your CUDA version. It is very detailed, and we are very grateful for @dimitribarbot's contribution.

Please note, however, that this document was written in the context of the LivePortrait extension for Automatic1111's Stable Diffusion WebUI. It should also work if you're not using Automatic1111's Stable Diffusion WebUI. The key point for Windows users is to not have "Visual Studio 2022 Build Tools" v17.10 or above installed, because those versions are not compatible with CUDA 11.8. Before proceeding with the XPose installation for CUDA 11.8, make sure no "Visual Studio 2022 Build Tools" 17.10 or above is installed, or uninstall it first if it is.

@BeimingCharles

BeimingCharles commented Aug 29, 2024 via email

@BeimingCharles

BeimingCharles commented Aug 30, 2024 via email

@dimitribarbot

dimitribarbot commented Aug 30, 2024

Actually, I don't think we're talking about the same issue. My documentation aims to solve an issue with animal model inference when you have CUDA 12.x and PyTorch v2.1.x and are following the instructions under the Fast hands-on (animals) / 快速上手（动物模型） section of the README.

Here, it seems to me that the problem is with the installation of onnxruntime, which is used during human model inference. The error message indicates that the installed version of onnxruntime is not compatible with the installed version of CUDA/cuDNN. Indeed, looking at the onnxruntime documentation, it seems that only version 1.18.1 is compatible with cuDNN 9.x, whereas the LivePortrait requirements file asks for version 1.18.0.
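One way to see which onnxruntime version actually got installed is to read the package metadata. A stdlib-only sketch (it doesn't import onnxruntime itself, so it works even when the package fails to load):

```python
from importlib import metadata

# Both distributions install the same "onnxruntime" module,
# so check each distribution name separately.
for name in ("onnxruntime-gpu", "onnxruntime"):
    try:
        print(name, "==", metadata.version(name))
    except metadata.PackageNotFoundError:
        print(name, "is not installed")
```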

In addition, the installation instructions for onnxruntime-gpu here say that we should add an --extra-index-url parameter.

For me, human model inference worked correctly after I ran this command (don't forget to activate your conda environment first via conda activate LivePortrait):

pip install onnxruntime-gpu==1.18.1 --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/

If the previous command doesn't work, note that you can roll back to the default LivePortrait installation by running:

pip install -r requirements.txt
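After either command, a quick way to check the result is to ask onnxruntime which execution providers it can use. The helper below is my own sketch (it returns None if onnxruntime isn't importable at all):

```python
def cuda_provider_available():
    """True/False if onnxruntime reports the CUDA provider; None if onnxruntime is missing."""
    try:
        import onnxruntime as ort
    except ImportError:
        return None
    return "CUDAExecutionProvider" in ort.get_available_providers()

print("CUDAExecutionProvider available:", cuda_provider_available())
```

If this prints False even after the reinstall, the RuntimeError above will typically reappear during human model inference, and the CUDA/cuDNN versions on PATH are the next thing to check.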
