this post was submitted on 25 Jan 2026
67 points (97.2% liked)

[–] Ascend910@lemmy.ml 8 points 16 hours ago* (last edited 16 hours ago)

Wow! It's like open sourcing the technology and letting everyone contribute and modify it prevents said software from enshittifying in the future.
Cough
Cough
Microsoft Windows

[–] PanArab@lemmy.ml 15 points 21 hours ago (1 children)

Airbnb boss Brian Chesky told Bloomberg in October his company relied "a lot" on Alibaba's Qwen to power its AI customer service agent.

He gave three simple reasons - it's "very good", "fast" and "cheap".

Going into 2025, the consensus was that despite billions of dollars being spent by US tech firms, Chinese companies were threatening to pull ahead.

"That's not the story anymore," Boudier said. "Now, the best model is an open-source model."

A report published last month by Stanford University found Chinese AI models "seem to have caught up or even pulled ahead" of their global counterparts - both in terms of what they're capable of, and how many people are using them.

Interesting stuff. I hope Sam Altman loses everything.

[–] yogthos@lemmy.ml 12 points 20 hours ago (1 children)

The whole AI-as-a-service business model is cooked now. My prediction is that even stuff like coding will soon work well enough with local models. There are going to be very few cases that justify paying a subscription for AI services, whether for companies or individuals. And this stuff is moving incredibly fast. For example https://dev.to/yakhilesh/china-just-released-the-first-coding-ai-of-2026-and-its-crushing-everything-we-know-3bbj
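To make the "local models instead of subscriptions" point concrete: open-weight models are typically served locally behind an OpenAI-compatible HTTP endpoint (e.g. by llama.cpp's server or Ollama), so swapping a paid cloud API for a local one is mostly a matter of changing the URL. A minimal sketch of building such a request, assuming a hypothetical local endpoint and an example model name ("qwen2.5-coder" is just an illustration):

```python
import json

def build_chat_request(prompt: str, model: str = "qwen2.5-coder") -> str:
    """Build an OpenAI-compatible chat-completion request body.

    The same JSON shape is accepted by local servers such as Ollama
    (http://localhost:11434/v1/chat/completions) or llama.cpp's server;
    the endpoint URL and model name here are assumptions, not fixed APIs.
    """
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            {"role": "user", "content": prompt},
        ],
        # Low temperature keeps code generation closer to deterministic.
        "temperature": 0.2,
    }
    return json.dumps(payload)

body = build_chat_request("Write a function that reverses a string.")
# POST `body` to the local endpoint with any HTTP client; no cloud
# subscription or API key is involved.
```

The point of the sketch is that the request format is identical to the hosted services, so tooling built against a commercial API can usually be pointed at a local model unchanged.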

[–] Grapho@lemmy.ml 6 points 16 hours ago (1 children)

Can't wait for the bargain bin data center hdds

[–] yogthos@lemmy.ml 2 points 10 hours ago

and cheap GPUs :)

[–] herseycokguzelolacak@lemmy.ml 8 points 21 hours ago

The latest GLM 4.7 Flash is insanely good. Just sayin...

[–] ReallyCoolDude@lemmy.ml 3 points 17 hours ago

I use a lot of models, and the Chinese ones are the best. They are a bit restricted in certain fields, but no more so than Microsoft's Phi models. And they are incredibly cheap, even when using inference APIs.