
[–] yogthos@lemmy.ml 5 points 3 days ago (1 children)

I'd argue it's inevitable for the simple reason that the whole AI-as-a-service business model is a catch-22. Current frontier models aren't profitable, and all the current service providers live off VC funding. But if models ever become cheap enough to run profitably, then they're cheap enough to run locally too. There's also little reason to expect model optimization to stop, so we're going to hit an inflection point where local becomes the dominant paradigm.
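The "cheap enough to run locally" part is already practical today. Here's a minimal sketch of local inference, assuming the llama-cpp-python bindings and a quantized GGUF model file you've already downloaded (the model path here is hypothetical):

```python
# Minimal local-inference sketch using llama-cpp-python
# (pip install llama-cpp-python). No API key, no server bill.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/qwen2.5-7b-instruct-q4_k_m.gguf",  # hypothetical local file
    n_ctx=4096,    # context window size
    n_threads=8,   # CPU threads; adjust to your machine
)

out = llm(
    "Explain why local LLM inference keeps getting cheaper:",
    max_tokens=128,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```

A 4-bit quantized model in that size class runs at usable speeds on ordinary consumer hardware, which is the inflection-point argument in miniature.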

We've seen the pendulum swing between mainframe and personal computer many times before. I expect this will be no different.

[–] biggerbogboy@sh.itjust.works 2 points 2 days ago (1 children)

Actually, I agree. So far, small local models are really solid and can punch above their weight even compared to frontier models.

I think what I meant when I said I doubted it was that these AI corpos give no indication that local is even an option, so most people assume they can only access an LLM through the web. That would keep the SaaS ecosystem dominant over local AI, although local will keep growing as the more favourable option.

Although I do agree that the industry will shift from server-based to PC-based inference, I don't see that shift being large enough to make these companies change their training paradigms to include telemetry from local AI, though I'm sure some will.

[–] yogthos@lemmy.ml 3 points 2 days ago

Oh yeah, corps will absolutely do that. We can kinda see the same thing happening with everything moving to streaming services too.