Question about multiple GPUs #80
Replies: 1 comment
-
That's a good question. The short answer is yes, they go into the PCI Express slots, but you have to make sure you have the right motherboard, cooling, etc. Modern graphics cards for deep learning are also quite large, so in a standard workstation tower you probably have to go with a single GPU (@vmirly has a nice tutorial for that: https://www.vahidmirjalili.com/blog/posts/build-a-deep-learning-workstation/).

When I was at the university, I had a pre-built Lambda workstation with 4 GPUs in my office (https://lambdalabs.com/gpu-workstations/vector). I would not recommend it, though: it gets too loud and hot (not that it overheats, but my office got very warm). The way to go is a server, I'd say, but that requires a separate server room. For most people, a workstation with 1-2 GPUs at home (for side projects) plus cloud GPUs for work projects is probably the way to go.
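Once the cards are seated in the PCIe slots, a quick sanity check is to ask the NVIDIA driver which GPUs the system actually sees. A minimal sketch (assuming the NVIDIA driver and the `nvidia-smi` CLI are installed; `list_gpus` is just an illustrative helper name):

```python
import shutil
import subprocess

def list_gpus():
    """Return the GPU names reported by nvidia-smi, or [] if unavailable."""
    # If nvidia-smi is not on PATH (no NVIDIA driver installed), report no GPUs.
    if shutil.which("nvidia-smi") is None:
        return []
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
    except subprocess.CalledProcessError:
        # nvidia-smi exists but failed (e.g. driver/GPU mismatch).
        return []
    return [line.strip() for line in out.stdout.splitlines() if line.strip()]

print(list_gpus())
```

On a 4-GPU box like the Lambda workstation mentioned above, this would print four entries; on a machine without NVIDIA hardware it prints an empty list rather than raising.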
-
Thank you so much for your amazing lecture! I'm curious about how multiple GPUs can be equipped on a single PC. Did you insert 4 GPUs into 4 PCI-E slots on a motherboard?