❓ [Question] Is SAM2 supported when compiling with the Dynamo backend on JetPack 6.1 or 6.2? #3478
Comments
@AyanamiReiFan I don't know of any workarounds for upgrading TRT 10.3 on JetPack. That being said, you could give the 25.03-py3-igpu container a try. That container has TRT 10.9 and the corresponding Torch-TRT version, so it might work, although I haven't tested it yet. In the future, JetPack 7 will have TRT 10.6+, which could also fix this issue.
The iGPU container should also have a much more recent version of Torch-TRT.
Thanks very much! I will try it later.
@AyanamiReiFan meanwhile I will also give SAM2 a try on Jetson.
@AyanamiReiFan please follow the JetPack guide in https://github.com/pytorch/TensorRT/pull/3524/files, since it has not been merged to main yet.
Thank you very much. I've read this JetPack guide. Can I assume that this guide essentially installs PyTorch 2.7 and the not-yet-officially-released Torch-TensorRT 2.8 on JetPack 6.2-based Jetson devices? If so, once Torch-TensorRT 2.8 is officially released, will it be properly compatible with JetPack 6.2 Jetson devices? Additionally, I'm currently trying to follow these installation steps and will provide feedback as soon as possible.
I built torch-tensorrt from the main branch. Detailed log: xxxxxxx@ubuntu:~/Develops/env_prepare/TensorRT$ python setup.py bdist_wheel --jetpack
I have identified the reason for the "Loading: 0 packages loaded" issue mentioned above. When configuring the system proxy on Ubuntu, I forgot to set no_proxy, causing traffic that should have gone to the Bazel server to be incorrectly routed to the proxy server.
I got this error when installing the whl. Is torch 2.7 not supported by the main branch? xxxxxxx@ubuntu:~/Develops/env_prepare/TensorRT/dist$ python -m pip install torch_tensorrt-2.8.0.dev0+727cbd2e9-cp310-cp310-linux_aarch64.whl
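For reference, a quick way to confirm which versions the environment actually imports before digging into the install error (illustrative; assumes it is run with the same Python used for the pip install above):

```python
# Illustrative sanity check of the versions importable in this environment.
import torch
import tensorrt
import torch_tensorrt

print("torch:", torch.__version__)              # e.g. a 2.7.x JetPack wheel
print("tensorrt:", tensorrt.__version__)        # JetPack 6.x ships TensorRT 10.3 by default
print("torch_tensorrt:", torch_tensorrt.__version__)
```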
@AyanamiReiFan in the pyproject, if your environment is tegra, it should pull the dependency pinned at line 13 of pyproject.toml in commit 61b3480.
I use the main branch, not this branch. |
When using your branch, I could compile and run SAM2-Large, but the image quality is significantly worse than in the examples. Additionally, when attempting to use base+ or tiny models, compilation fails with errors:
The experimental code I used is from there, except that I removed line 38. Resulting images:
After making some modifications, I was able to successfully compile the Hiera-Tiny model (since I only need the Hiera part of SAM2). However, I'm not entirely sure which specific change(s) actually resolved the issue. Below are the modification details; I hope this might be helpful for your development work.
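For reference, a minimal sketch of compiling just the Hiera trunk with the Dynamo backend. This is illustrative only: the config/checkpoint paths and attribute names follow the upstream sam2 repo, and it does not reproduce the exact modifications described above.

```python
import torch
import torch_tensorrt
from sam2.build_sam import build_sam2  # assumes the sam2 package from the torch-trt fork is installed

# Illustrative config/checkpoint paths; adjust to the local files.
model = build_sam2("configs/sam2.1/sam2.1_hiera_t.yaml", "checkpoints/sam2.1_hiera_tiny.pt")
trunk = model.image_encoder.trunk.eval().cuda()  # Hiera backbone only; the FPN neck is left out

example = torch.randn(1, 3, 1024, 1024).cuda()   # SAM2's default 1024x1024 input resolution
trt_trunk = torch_tensorrt.compile(
    trunk,
    ir="dynamo",                                  # Dynamo backend, as discussed in this issue
    inputs=[example],
    enabled_precisions={torch.float16},
)

with torch.no_grad():
    multi_scale_feats = trt_trunk(example)        # Hiera returns a list of multi-scale feature maps
```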
❓ Question
Will SAM2 be compatible with the Dynamo backend on JetPack 6.1/6.2?
Are there any workarounds for the TensorRT version mismatch?
What you have already tried
Here are my attempts and the issues encountered. My device is a Jetson AGX Orin, and I only compile the ImageEncoder of SAM2 (the Hiera trunk and the FPN, with position_encoding removed); the SAM2 code is from https://github.com/chohk88/sam2/tree/torch-trt:
JetPack 6.1 + PyTorch 2.5 (from https://developer.download.nvidia.cn) + Torch-TensorRT 2.5
Tried compiling SAM2 but encountered errors.
Observed that the PyTorch 2.5 documentation does not mention SAM2 support, likely indicating SAM2 is not yet adapted for this version.
JetPack 6.1 + PyTorch 2.6 (from https://pypi.jetson-ai-lab.dev/jp6/cu126) + Torch-TensorRT 2.6
Installed PyTorch 2.6 from jp6/cu126 and Torch-TensorRT 2.6.
Importing torch_tensorrt failed with ModuleNotFoundError: No module named 'tensorrt.plugin'.
Root cause: Torch-TensorRT 2.6 requires TensorRT 10.7, but JetPack 6.1 provides only TensorRT 10.3.
Found no straightforward way to upgrade TensorRT within JetPack 6.1 due to dependency conflicts.
Cross-Platform Attempt: Compile on x86 + Run on JetPack 6.1
Compiled SAM2 on x86 with Torch-TensorRT 2.6 and exported the model.
Tried running it on JetPack 6.1 with Torch-TensorRT 2.5.
Failed, unsurprisingly, due to the serialization version incompatibility between 2.6 and 2.5.
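For reference, a minimal sketch of what this x86-to-Jetson flow looks like with the current Torch-TensorRT API (paths and config names are illustrative; torch_tensorrt.save and torch_tensorrt.load only round-trip cleanly when both machines run matching Torch-TensorRT and TensorRT versions, which is exactly the mismatch hit here):

```python
import torch
import torch_tensorrt
from sam2.build_sam import build_sam2  # assumes the sam2 fork above is installed

# --- On the x86 build machine: compile the image encoder and serialize it ---
model = build_sam2("configs/sam2.1/sam2.1_hiera_l.yaml", "checkpoints/sam2.1_hiera_large.pt")
encoder = model.image_encoder.eval().cuda()

example = torch.randn(1, 3, 1024, 1024).cuda()
trt_encoder = torch_tensorrt.compile(
    encoder,
    ir="dynamo",
    inputs=[example],
    enabled_precisions={torch.float16},
)
torch_tensorrt.save(trt_encoder, "image_encoder_trt.ep", inputs=[example])

# --- On the Jetson device: reload the exported program ---
# Deserialization is tied to the Torch-TensorRT / TensorRT versions used at
# compile time, which is why loading a 2.6 artifact under 2.5 fails.
loaded = torch_tensorrt.load("image_encoder_trt.ep").module()
with torch.no_grad():
    feats = loaded(example)
```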