[-] bi_tux@lemmy.world 6 points 2 months ago

you don't even need an officially supported GPU, I run Ollama on my RX 6700 XT
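
For anyone trying the same thing, here's a minimal launcher sketch. It assumes the stock ollama binary is on your PATH and uses the HSA_OVERRIDE_GFX_VERSION override that RDNA2 cards like the RX 6700 XT (gfx1031) usually need, since they aren't on the official ROCm support list:

```python
# Sketch: start `ollama serve` with the common ROCm compatibility override for
# RDNA2 cards that aren't officially supported (e.g. RX 6700 XT / gfx1031).
import os
import subprocess

env = dict(os.environ)
# Pretend to be gfx1030 (RX 6800/6900 class), which ROCm does support.
env["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"

subprocess.run(["ollama", "serve"], env=env)
```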

[-] BaroqueInMind@lemmy.one 3 points 2 months ago

You don't even need a GPU, I can run Ollama with Open WebUI on my CPU with an 8B model fast af
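
If anyone wants to poke at it from a script, here's a minimal sketch against a locally running Ollama server on the default port 11434 (the model tag is just an example, use whatever you've pulled):

```python
# Sketch: send a one-off prompt to a local Ollama server.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3.1:8b", "prompt": "Why is the sky blue?", "stream": False},
    timeout=600,
)
print(resp.json()["response"])
```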

[-] bi_tux@lemmy.world 2 points 2 months ago

I tried it on my CPU (with Llama 3 7B), but unfortunately it ran really slow (I have a Ryzen 5700X)

[-] tomjuggler@lemmy.world 2 points 2 months ago

I ran it on my dual-core Celeron and... just kidding. Try the mini Llama 1B. I'm in the same boat with a Ryzen 5000-something CPU
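
If you want actual numbers before deciding which model your CPU can handle, Ollama's non-streaming response includes eval_count (generated tokens) and eval_duration (nanoseconds), so a rough tokens-per-second comparison is easy. Minimal sketch, assuming a local server and example model tags:

```python
# Sketch: compare rough generation speed of two local models on this machine.
import requests

def tokens_per_second(model: str, prompt: str = "Explain recursion in one paragraph.") -> float:
    data = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=600,
    ).json()
    # eval_duration is reported in nanoseconds.
    return data["eval_count"] / (data["eval_duration"] / 1e9)

# Model tags are examples; substitute whatever you've pulled locally.
for tag in ("llama3.2:1b", "llama3.1:8b"):
    print(tag, round(tokens_per_second(tag), 1), "tok/s")
```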

[-] passepartout@feddit.org 2 points 2 months ago

I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D
