
making video error #645

Open
hlafmanta opened this issue Sep 26, 2024 · 6 comments

hlafmanta commented Sep 26, 2024

onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Gemm node. Name:'fc1' Status Message: D:\a\_work\1\s\onnxruntime\core\providers\cuda\cuda_call.cc:121 onnxruntime::CudaCall D:\a\_work\1\s\onnxruntime\core\providers\cuda\cuda_call.cc:114 onnxruntime::CudaCall CUBLAS failure 3: CUBLAS_STATUS_ALLOC_FAILED ; GPU=0 ; hostname=TOBI ; file=D:\a\_work\1\s\onnxruntime\core\providers\cuda\cuda_execution_provider.cc ; line=168 ; expr=cublasCreate(&cublas_handle_);

onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Conv node. Name:'Conv_111' Status Message: D:\a\_work\1\s\onnxruntime\core\framework\bfc_arena.cc:376 onnxruntime::BFCArena::AllocateRawInternal Failed to allocate memory for requested buffer of size 13111296
When I add the source image and a target video, the process stops right after exporting the frames and shows the messages above.

@girish147

Try running the model on the CPU (though this might slow down execution). You can disable the CUDA execution provider in your ONNX Runtime session to force it to run on the CPU.
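A minimal sketch of what that looks like, assuming you can edit wherever the app builds its onnxruntime session (the model path below is just a placeholder):

```python
import onnxruntime as ort

# Placeholder model path - point this at whichever .onnx file the app loads.
session = ort.InferenceSession(
    "model.onnx",
    providers=["CPUExecutionProvider"],  # omit "CUDAExecutionProvider" to stay off the GPU
)

# Should print ['CPUExecutionProvider'], confirming CUDA is no longer used.
print(session.get_providers())
```

Running on the CPU sidesteps the CUBLAS_STATUS_ALLOC_FAILED error because nothing is allocated on the GPU, at the cost of slower inference.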

@hlafmanta (Author)

How do I do that, please?

@Updatedme

What's your computer configuration?

@hlafmanta (Author)

[Attached: Screenshot_4 and Screenshot_3 showing the system configuration]

@hacksider (Owner)

How are you running the application? Are you using the run_cuda.bat? If yes, just lower the memory and threads depending on your specs.
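For reference, run_cuda.bat just launches run.py; assuming the project's --max-memory and --execution-threads options (check python run.py --help to confirm on your version), a lowered-resource invocation might look like:

```bat
python run.py --execution-provider cuda --max-memory 4 --execution-threads 2
```

Lower the values step by step until the allocation failure stops; the exact numbers depend on your GPU memory and RAM.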

KRSHH (Collaborator) commented Oct 30, 2024

> How are you running the application? Are you using the run_cuda.bat? If yes, just lower the memory and threads depending on your specs.

Also, read this: https://www.reddit.com/r/CUDA/comments/z6r7aj/does_cuda_not_work_on_gtx_1650/
