Best GPUs for self-hosted AI?
(alien.top)
A 4090 (24 GB VRAM) is enough to run many models; for larger ones you'll probably want an A6000 (48 GB). And many models that don't fit in your VRAM can be quantized to lower precision (e.g. 8-bit or 4-bit weights) without much loss of quality.
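As a rough rule of thumb, the VRAM needed just to hold a model's weights is parameter count times bytes per weight, which is what makes quantization so effective. A quick back-of-the-envelope sketch (the function name is illustrative, and real usage adds overhead for the KV cache and activations on top of this):

```python
# Rough VRAM estimate for an LLM's weights alone -- a rule of thumb,
# not an exact figure; KV cache and activations add more on top.
def weights_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    bytes_per_weight = bits_per_weight / 8
    # 1e9 params * bytes/weight ~= GB of weight memory
    return params_billions * bytes_per_weight

# A 70B model: fp16 is far beyond a 24 GB 4090, but 4-bit gets close
# to fitting on a 48 GB A6000.
for bits in (16, 8, 4):
    print(f"70B @ {bits}-bit: ~{weights_vram_gb(70, bits):.0f} GB")
```

This is why a 7B model at 4-bit (~3.5 GB of weights) runs comfortably on consumer cards, while a 70B model needs either heavy quantization or workstation-class VRAM.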