Depends on the use case, I guess. If any larger-scale deep learning is going on, you can't afford to buy all the required GPUs anyway.
However, I found myself using my tower PC quite a lot during my Master's. Especially for uni projects my GPU came in very handy and was much appreciated by group members. Having your own GPU was often more convenient than using the resources provided by the lab.
Also, while I relied mostly on cloud resources in my last job, having a GPU in my work machine would have been very convenient at times. Very nice for EDA and playing around with models during the early phase of a project.
Apart from that, IMO a good CPU and >32 GB of RAM on your own machine are sufficient for EDA and related tasks, while I would rely on cloud resources for everything else, e.g., model training and large-scale analyses.
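
To give a sense of the local-prototyping workflow I mean, here's a minimal sketch (assuming PyTorch; the tiny linear model is just a placeholder) that uses the local GPU when one is present and falls back to CPU otherwise, so the same notebook code works on a laptop and on a GPU box:

```python
import torch

# Use the local GPU if available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy stand-in for a real model, just to illustrate quick local experimentation.
model = torch.nn.Linear(16, 1).to(device)
x = torch.randn(8, 16, device=device)

with torch.no_grad():
    print(model(x).shape)  # torch.Size([8, 1]) on either device
```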