
Install CUDA, PyTorch or any DNN Library on the duckiebot itself? #264

Open
bishoyroufael opened this issue Apr 24, 2022 · 2 comments

@bishoyroufael

I have a DB21M assembled and I'm confused about how to run neural networks on the bot itself. I SSHed into the bot but couldn't find any NVIDIA drivers or CUDA installed. Is there an easy way to set that up? I can't find anything useful in the docs about this.

From the NVIDIA docs, there should be a way to install JetPack and run some of the example neural networks presented here. I want to use that together with ROS on the Duckiebot.

@FelixMildon

Can this please be explained, @tanij? This understanding would help a lot.

@bishoyroufael
Author

After a lot of research and pain, I was able to make something work here as part of my thesis project. Feel free to use the Dockerfile in your own project.

It uses dustynv/jetson-inference as the base container, which ships with CUDA, PyTorch, and several DNN models ready to use out of the box. It also layers ROS Melodic on top, so you can set ROS_MASTER_URI to the Duckiebot's IP for communication to work.
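The setup described above can be sketched roughly as a Dockerfile like the following. This is a minimal sketch, not my actual Dockerfile: the image tag, the ROS package selection, and the Duckiebot hostname are assumptions you would need to adapt to your JetPack/L4T version and network.

```dockerfile
# Sketch: start from jetson-inference, which bundles CUDA, PyTorch, and
# pretrained DNN models for Jetson boards.
# The r32.7.1 tag is an assumption; pick the tag matching your L4T release.
FROM dustynv/jetson-inference:r32.7.1

# Install ROS Melodic on top (these images are Ubuntu 18.04 based).
RUN apt-get update && apt-get install -y curl gnupg && \
    echo "deb http://packages.ros.org/ros/ubuntu bionic main" \
        > /etc/apt/sources.list.d/ros-latest.list && \
    curl -s https://raw.githubusercontent.com/ros/rosdistro/master/ros.key | apt-key add - && \
    apt-get update && apt-get install -y ros-melodic-ros-base && \
    rm -rf /var/lib/apt/lists/*

# Point ROS at the roscore running on the Duckiebot.
# "duckiebot.local" is a placeholder; use your bot's hostname or IP.
ENV ROS_MASTER_URI=http://duckiebot.local:11311

CMD ["bash"]
```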

If you just want PyTorch, you can instead derive from l4t-pytorch. More details about that here.
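For a quick sanity check before writing your own Dockerfile, you can run the l4t-pytorch image directly on the Jetson and confirm CUDA is visible to PyTorch. The tag below is an assumption (it must match your JetPack/L4T release), and this only works on the Jetson itself with the NVIDIA container runtime:

```
# Run on the Duckiebot's Jetson; tag r32.7.1-pth1.10-py3 is an example, not canonical
sudo docker run -it --rm --runtime nvidia \
    nvcr.io/nvidia/l4t-pytorch:r32.7.1-pth1.10-py3 \
    python3 -c "import torch; print(torch.cuda.is_available())"
```

If this prints `True`, the container has GPU access and you can build your own image on top of it.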
