
[–] Zetta@mander.xyz 3 points 2 months ago

Last I heard (I don't have a source for this, so I could be completely wrong), some of the big LLM providers do make enough money from their users to cover inference and infrastructure. The only reason they aren't profitable is the enormous amount they spend on developing new models.