I just bought a "new" homelab server and am considering adding in some used/refurbished NVIDIA Tesla K80s. They have 24 GB of VRAM and tons of compute power for very cheap if you get them used.

The issue is that these cards run super hot and require extra cooling setups. I was able to find this fan adapter kit on eBay, but I still worry that if I pop one or two of these bad boys in my server, the fan won't be enough to overcome the raw heat put off by the K80.

Have any of you run this kind of card in a home lab setting? What kind of temps do you get when running models? Would a fan like this actually be enough to cool the thing? I appreciate any insight you guys might have!

[-] plotting_homelab@lemmy.world 1 points 1 year ago

So, looking at your "server", it seems like a workstation. I have no experience with K80s, but from what I know, all server GPUs are designed to be passively cooled by some loud FingerRemover5000s, so I think if you upgrade the fans it should be fine since it's only 300 W. If cooling really is a problem, then maybe some shrouds might help, but I don't think a single K80 is difficult to cool, since in the data center they probably ran 4-6 of them in one 4U chassis.
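If you do end up installing one, a simple way to answer the "what temps do you get when running models" question yourself is to poll `nvidia-smi` in query mode while a model is loaded and flag anything running hot. A minimal sketch (the 85 °C threshold is my assumption for a comfort margin, not an NVIDIA spec; check the K80's documented slowdown temperature):

```python
import subprocess

TEMP_LIMIT_C = 85  # assumed comfort threshold, not an official K80 limit


def parse_gpu_stats(csv_text):
    """Parse 'temperature.gpu, power.draw' CSV lines (nounits format)."""
    stats = []
    for line in csv_text.strip().splitlines():
        temp_s, power_s = [field.strip() for field in line.split(",")]
        stats.append((int(temp_s), float(power_s)))
    return stats


def too_hot(stats, limit=TEMP_LIMIT_C):
    """Return indices of GPUs at or above the temperature limit."""
    return [i for i, (temp, _power) in enumerate(stats) if temp >= limit]


def query_gpus():
    # A K80 presents as two GPUs, so expect two output lines per card.
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_gpu_stats(out)
```

Run `query_gpus()` in a loop (or just `watch nvidia-smi`) while a model is generating; sustained temps near the limit would suggest the eBay fan adapter isn't keeping up.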

this post was submitted on 07 Jul 2023

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.
