
Does it work with Jetson Xavier NX? #2

Open
shivamordanny opened this issue Jun 19, 2024 · 4 comments

Comments
@shivamordanny

What add-ons would I need to make it work with the Xavier NX?

@tokk-nv
Member

tokk-nv commented Jun 19, 2024

Hi shivamordanny,
There is a JetPack 5 version of the Docker container.
For now, you can edit this line in the launch script (replace jetrag:r36.3.0 with jetrag:r35.4.1) and give it a try, after making sure you are on JetPack 5.
https://github.com/NVIDIA-AI-IOT/jetson-copilot/blob/main/launch_jetson_copilot.sh#L40
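One way to make that substitution without opening an editor is a `sed` one-liner; a minimal sketch, assuming the launch script pins the container tag literally as `jetrag:r36.3.0` (the `CONTAINER_IMAGE` variable name below is hypothetical and only used for demonstration; the real line in the script may differ):

```shell
# The in-place edit would look like (keeps a .bak backup of the script):
#   sed -i.bak 's/jetrag:r36\.3\.0/jetrag:r35.4.1/' launch_jetson_copilot.sh
# Demonstrated here on a sample line instead of the real file:
echo 'CONTAINER_IMAGE="jetrag:r36.3.0"' \
  | sed 's/jetrag:r36\.3\.0/jetrag:r35.4.1/'
# → CONTAINER_IMAGE="jetrag:r35.4.1"
```

Note the dots in the version string are escaped in the pattern, since `.` matches any character in a regular expression.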

@tokk-nv
Member

tokk-nv commented Jun 19, 2024

Hi shivamordanny,
I just updated the launch scripts, so you don't need to edit them manually. Please go ahead and give it a try.
A quick test on my Jetson Xavier NX running JetPack 5.1.1 shows that Jetson Copilot runs; however, Ollama seemed to run without GPU acceleration, so it was very slow. I will look into this issue separately.

@tokk-nv
Member

tokk-nv commented Jun 19, 2024

So, the Ollama server in the jetrag container was configured correctly.
It was just that the tight memory of the Xavier NX allowed only a portion of the Llama3 model to be loaded into GPU memory, effectively making it spend a long time on the CPU.

Here are some workarounds you can try.

  • Disable Desktop GUI (sudo init 3)
  • Use smaller LLM (ollama pull phi3)
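Since the Xavier NX shares one memory pool between CPU and GPU, it can help to confirm how much memory each workaround actually frees; a minimal check using standard Linux tooling (exact numbers will vary by system):

```shell
# Report memory in human-readable units. On Xavier NX (8 GB shared
# CPU/GPU memory) the desktop GUI alone can consume a noticeable share.
free -h
# Re-run after `sudo init 3` and after switching to a smaller model
# (`ollama pull phi3`) to compare how much is left for GPU offload.
```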

@wyzeguyz44

Thank you, Tokk, for doing this. I was going to ask you the same question about the Xavier. Dustin successfully backported the Ai studio tool to r35.4 for the Xavier, and I will be testing it next week. I was definitely hoping to do the same for Copilot on my Xavier.
