this post was submitted on 25 Mar 2026
27 points (96.6% liked)

Hardware


Intel brings 32GB of VRAM and plenty of bandwidth to the local AI inference party

[–] renegadespork@lemmy.jelliefrontier.net 10 points 3 days ago* (last edited 3 days ago) (1 children)

"Corporation Pivots to Follow the Money" I suppose.

I'm so tired of this bubble. The diminishing returns of throwing more compute at AI started rearing their head in 2024. Can we start allocating tech to stuff that people actually want again?

[–] Die4Ever@retrolemmy.com 3 points 2 days ago* (last edited 2 days ago)

At least this is for consumers/prosumers to buy and OWN, instead of data center products that we can only rent. And we desperately need Intel (or any third player) in the GPU market.

Although IDK how useful these cards are outside of AI
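
As a rough back-of-the-envelope sketch of why 32GB matters for local inference (the overhead factor and these example model sizes are assumptions for illustration, not from the article): weight memory is roughly parameter count times bytes per weight, plus extra for the KV cache and activations.

```python
def fits_in_vram(params_billion: float, bits_per_weight: int,
                 vram_gb: float = 32.0, overhead: float = 1.2) -> bool:
    """Rough estimate: weight memory times an assumed overhead factor
    (KV cache, activations, runtime buffers) vs. available VRAM."""
    # 1B params at 8 bits/weight = 1 GB, so scale by bits/8
    weight_gb = params_billion * bits_per_weight / 8
    return weight_gb * overhead <= vram_gb

# A 70B model at 4-bit quantization: 35 GB of weights alone -> doesn't fit
print(fits_in_vram(70, 4))   # False
# A 32B model at 4-bit: 16 GB of weights, ~19 GB with overhead -> fits
print(fits_in_vram(32, 4))   # True
```

With the assumed 1.2x overhead, 32GB comfortably holds 4-bit models up to roughly the low-30B parameter range, which is exactly the prosumer niche this card targets.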