I just don't see how consoles can be "shielded" from the AI onslaught.
If Nvidia/AMD/Intel are going to abandon certain sectors and product types (discrete GPUs, desktop parts), I don't see why they'd keep making custom SoCs either. I don't like "all or nothing" scenarios, but if these higher-end chips are going away for desktop, I'm assuming it will happen to all consumer sectors as well. So the next consoles would be cloud/streaming consoles only.
But, I don't believe discrete GPUs and desktop parts are going away.
I think you're being quite a bit disingenuous here. AMD hasn't made a "highest end GPU variant" in a literal decade. They've never had a competitor to the Titan cards or the *90 variants, and the *80 variant has slowly taken over as the top-end consumer spec (because the *90 took over the Titan classification). None of that is because of AI; it's just AMD lagging behind the entire time. And I love AMD, but they've never been known for the highest end.
And Intel has NEVER made a highest end GPU variant. So not sure where that claim is coming from.
Also, I genuinely believe a lot of people's perspectives are skewed on all of this. Hardware has gotten more powerful and more efficient, we've gone through a hyper-inflation period, and AI is gobbling everything up, so yeah, prices and hardware availability suck. But a great mid-range build I just spec'd out (9600X, 9060 XT, 1TB NVMe, 32GB of DDR5) comes to around $1400 USD. Adjusting for inflation, that'd be a little over $1000 USD in January 2016, and Austin Evans has a video of a $1000 USD PC build from Jan 2016 (i5-6500, R9 390, 250GB SSD + 1TB spinner, 8GB DDR4).
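If anyone wants to check my math, here's a minimal sketch of that inflation adjustment. The ~35% cumulative US CPI figure for Jan 2016 through early 2026 is my own ballpark assumption, not an official number:

```python
# Back-of-the-envelope inflation adjustment for the build price above.
# ASSUMPTION: roughly 35% cumulative US CPI inflation from Jan 2016 to early 2026.
CUMULATIVE_INFLATION = 0.35
price_2026 = 1400  # mid-range build price in 2026 USD

price_in_2016_dollars = price_2026 / (1 + CUMULATIVE_INFLATION)
print(f"~${price_in_2016_dollars:.0f} in Jan 2016 dollars")  # prints ~$1037
```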
Obviously, and hopefully, a build from 2026 beats a build from 2016. But looking at the pricing and adjusting for inflation and current market conditions, I'd say we aren't doing bad at all in 2026. Yeah, shit's expensive, but I don't think we're at a "doomsday level of high end desktop parts going extinct" situation.
They very well could be.
The hardware is near-identical though, or at least it was for PS Now, so the barrier to reusing game streaming hardware for a physical console is fairly low.
It's about silicon size to me. Even if a bit behind Nvidia's mega dies, AMD made "big die" cards consistently, like the 6970, 7970, 290, Fiji, Vega 64, the 6900, and 7900 XTX. But the 9000 series is different. The top-end 9070 XT is "only" 356.5 mm² with a 256-bit bus; that's a mid-range size. The only recent precedent for that is the RX 480, but those were cheaper and sold alongside higher-end GPUs.
And with Arc Battlemage, Intel allegedly had a bigger die in the works but canceled it, presumably because they didn't think it was financially viable.
You make fair points. I'm probably panicking and being a little dramatic here... Custom SoCs would probably only be in question if regular consumer graphics parts were too.
But I still don't like the trajectory. It feels like AMD/Intel are struggling to even stay alive in the space, while Nvidia seems to think it's not so important, and I don't like where that goes.