this post was submitted on 01 Mar 2026
United States | News & Politics
It runs pretty well. I didn't notice a speed difference between it and DeepSeek's web chat. I haven't used it for anything big; I'm mainly trying to stay current with the technology so I know what I'm talking about in job interviews.
Cool. I'm looking forward to getting the hardware to run it locally. Do you know of any alternatives where I can borrow compute to run my own model?