This post was submitted on 30 Aug 2025.
88 points (97.8% liked)

Technology


This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and for civil, meaningful discussion around it.


Ask in a DM before posting product reviews or ads; otherwise, such posts are subject to removal.


Rules:

1: All Lemmy rules apply

2: No low-effort posts

3: NEVER post naziped*gore stuff

4: Always post article URLs or their archived versions as sources, NOT screenshots. This helps blind users.

5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)

6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist

7: Crypto-related posts, unless essential, are disallowed

founded 6 years ago
[–] slacktoid@lemmy.ml 14 points 5 months ago* (last edited 5 months ago) (3 children)

Where can I buy this?

Edit: I realized after commenting that this was the product page. My bad. It was more of a "take my money now" scenario.

[–] frongt@lemmy.zip 11 points 5 months ago

This is literally a product page to buy them

[–] eldavi@lemmy.ml 6 points 5 months ago (1 child)

I wonder if the driver needed to run it is compatible with Linux.

[–] slacktoid@lemmy.ml 5 points 5 months ago (1 child)

Why wouldn't it be? (I mean, why would they support Microsoft, when the only other viable option is FreeBSD?)

[–] eldavi@lemmy.ml 1 point 5 months ago

The world still uses Windows heavily, so adoption by end consumers relies on it.

[–] locuester@lemmy.zip 2 points 5 months ago

Try the link of the post you’re responding to.

[–] uberstar@lemmy.ml 8 points 5 months ago

I kind of want a consumer-friendly, low-end to mid-range alternative that can run my games and video-editing software for very small projects. So far I'm only eyeing the Lisuan G100, which seems to fit that bill.

This seems cool though. Beyond AI, it could be used for distributed cloud computing or something of that sort.

[–] geneva_convenience@lemmy.ml 7 points 5 months ago (2 children)

For inference only. NVIDIA GPUs are so dominant because they can train models, not just run them. All other GPUs seem to lack that capacity.
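
A minimal PyTorch sketch of the distinction being drawn here: inference only needs a forward pass, while training also needs gradients and an optimizer step, which is where the extra compute and memory go. The toy model and sizes are made up purely for illustration.

```python
import torch
import torch.nn as nn

# Toy model; real workloads are far larger, but the structure is the same.
model = nn.Linear(1024, 1024)
x = torch.randn(8, 1024)
target = torch.randn(8, 1024)

# Inference: a forward pass with no gradient bookkeeping.
with torch.no_grad():
    y = model(x)

# Training: forward, loss, backward, and an optimizer step.
# The backward pass and optimizer state are what demand the extra memory and compute.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = nn.functional.mse_loss(model(x), target)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```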

[–] lorty@lemmygrad.ml 6 points 5 months ago

And training them requires a LOT of VRAM, which is why they do as much as they can to limit VRAM on their gaming cards: better market segmentation.
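
A rough back-of-the-envelope sketch of why training is so much hungrier for VRAM: a common rule of thumb for Adam with mixed precision is about 16 bytes per parameter (weights, gradients, fp32 master weights, and optimizer states) versus about 2 bytes per parameter just to hold fp16 weights for inference, before counting activations or KV cache. These are estimates, not measurements.

```python
def vram_estimate_gb(n_params_billion: float) -> dict:
    """Very rough per-parameter memory rule of thumb (excludes activations and KV cache)."""
    n = n_params_billion * 1e9
    return {
        "inference_fp16_weights_gb": n * 2 / 1e9,           # ~2 bytes/param
        "training_adam_mixed_precision_gb": n * 16 / 1e9,   # ~16 bytes/param
    }

print(vram_estimate_gb(7))  # ~14 GB just to hold fp16 weights vs ~112 GB to train a 7B model
```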

[–] nutbutter@discuss.tchncs.de 6 points 5 months ago (12 children)

You can train or fine-tune a model on any GPU. Sure, it will be slower, but more VRAM is better.
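
A minimal sketch of the "any GPU" point: a PyTorch training step is written against an abstract device, so the same loop runs on CUDA, Apple's MPS, or the CPU, just at very different speeds. The device strings below are the standard PyTorch ones; whatever backend a card like this ships with would presumably be selected the same way.

```python
import torch
import torch.nn as nn

# Pick whichever accelerator PyTorch can see; the training step itself doesn't change.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

model = nn.Linear(512, 512).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

x = torch.randn(4, 512, device=device)
target = torch.randn(4, 512, device=device)

# One fine-tuning step; slower hardware simply takes longer per step.
loss = nn.functional.mse_loss(model(x), target)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```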

[–] WalnutLum@lemmy.ml 6 points 5 months ago

These only work with ARM CPUs, I think.

[–] ICastFist@programming.dev 2 points 5 months ago (1 child)

Does anyone know if it can run CUDA code? That's the silver bullet ensuring Nvidia's dominance in the planet-wrecking servers.

[–] peppers_ghost@lemmy.ml 5 points 5 months ago

llama and PyTorch support it right now; CUDA itself isn't available as far as I can tell. I'd like to try one out, but the bandwidth seems to be pretty bad: about 25% as fast as a 3090. It's a really good start for them, though.
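
For a sense of what that bandwidth figure implies for local inference: batch-1 LLM decoding is usually memory-bound, so a rough upper bound on tokens per second is memory bandwidth divided by the bytes read per token (roughly the size of the weights). The sketch below plugs in the RTX 3090's ~936 GB/s spec and the ~25% figure from the comment above; the model size is illustrative, not a benchmark.

```python
def decode_tokens_per_s(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound for memory-bound, batch-1 decoding: one full read of the weights per token."""
    return bandwidth_gb_s / model_size_gb

rtx_3090 = 936.0             # GB/s, published spec
this_card = 0.25 * rtx_3090  # ~25% of a 3090, per the comment above

model_size = 4.1             # GB, e.g. a 7B model at 4-bit quantization (illustrative)
print(decode_tokens_per_s(rtx_3090, model_size))   # ~228 tok/s upper bound
print(decode_tokens_per_s(this_card, model_size))  # ~57 tok/s upper bound
```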
