SillySausage@lemmynsfw.com 9 points 4 days ago

I successfully ran a local Llama model with llama.cpp on an old AMD GPU. I'm not sure why you think there's no other option.
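
For anyone curious, here's roughly what the Python side can look like (a minimal sketch using the llama-cpp-python bindings; the model path is just a placeholder, and it assumes the package was built with a GPU backend that works on AMD cards, such as HIP/ROCm or Vulkan):

```python
# Minimal sketch: running a local Llama model through llama-cpp-python.
# Assumes the package was installed with an AMD-capable GPU backend
# (HIP/ROCm or Vulkan) and that a GGUF model file is already on disk;
# the path below is only a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-model.Q4_K_M.gguf",  # placeholder GGUF file
    n_gpu_layers=-1,  # offload all layers to the GPU (-1 = everything)
    n_ctx=4096,       # context window size
)

result = llm("Q: What kernel does GNU/Linux use? A:", max_tokens=64)
print(result["choices"][0]["text"])
```

If the card runs out of VRAM, lowering `n_gpu_layers` to offload only part of the model is usually enough to keep it running.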