this post was submitted on 10 Mar 2026
59 points (96.8% liked)

Books

[–] riskable@programming.dev 1 points 4 days ago

training models requires expensive hardware

Right now it does. In ten to twenty years that won't be the case. Also consider the diminishing returns from throwing more hardware at training: despite AI companies buying up the world's supply of DRAM, models are only improving marginally.
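A rough sketch of what "diminishing returns" looks like, assuming loss follows a power law in training compute (the general shape reported in neural scaling-law work; the constants here are made up purely for illustration):

```python
# Hypothetical illustration of diminishing returns from scaling compute.
# The power-law form loss = a * C**-alpha mirrors the shape found in
# scaling-law studies; a and alpha below are invented, not measured.

def loss(compute: float, a: float = 10.0, alpha: float = 0.05) -> float:
    """Model loss as a power law in training compute C (arbitrary units)."""
    return a * compute ** -alpha

# Each 10x increase in compute buys a smaller absolute improvement:
for exponent in range(4):
    c = 10 ** exponent
    print(f"compute={c:>5}  loss={loss(c):.3f}")
```

Each tenfold jump in compute shaves off less loss than the one before it, which is why simply buying more hardware stops paying for itself.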

The curves of hardware cost and model capability are on track to intersect unless something drastic changes. Big AI needs a huge breakthrough to stay ahead of that curve, and I don't see it happening: the big breakthroughs right now are all about efficiency, which works against them by making training cheaper and faster for everyone.