this post was submitted on 17 Mar 2026
16 points (90.0% liked)
Games
█▓▒░📀☭ g a m e s 💾⚧░▒▓█
Tag game recommendations with [rec]. Tag your critique or commentary threads with [discussion]. Both table-top and video game content are welcome! Original content and indie/DRM-free material are encouraged!
Not a place for gamer gate talk or other reactionary behavior. TERFs and incels get the wall.
founded 6 years ago
Nothing "cheap" about this mileage.
They were using two(!) RTX 5090s, with one of them dedicated just to AI rendering.
Furthermore, these AI (let's call them) "improvements" come at the cost of wrecking the consumer PC market with skyrocketing prices. That's the very market this tech is supposed to serve.
I have to assume that setup was just for the showcase video. DLSS offers options, and devs can tailor it to their game, if they use it correctly. It's way too easy nowadays to just say "Unreal Engine will take care of it, just ship", which is a problem in software in general, not entirely Nvidia's fault.
It's not a great showcase video, but having had DLSS in some games, it does make a pretty good difference. I tried it with a game I own just now: I go from 40 FPS with basic antialiasing, to 50 with DLSS, to 90 with frame generation enabled. Quality is just as good as without DLSS, aside from some flickering on snow in this game, and best of all it doesn't make distant objects blurry the way 'traditional' antialiasing does.
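To put those numbers in perspective, here's a rough sketch using the FPS figures from my test above. The frame-time arithmetic is standard; the half-rendered/half-interpolated split is the usual description of single-frame interpolation, not anything from Nvidia's actual implementation:

```python
# Convert the FPS figures above into per-frame time budgets, and show why
# frame generation raises apparent FPS: for every fully rendered frame,
# one interpolated frame is inserted in between (assumed 1:1 split).

def frame_time_ms(fps: float) -> float:
    """Milliseconds available to produce one frame at a given FPS."""
    return 1000.0 / fps

for label, fps in [("basic AA", 40), ("DLSS upscaling", 50), ("DLSS + frame gen", 90)]:
    print(f"{label:>16}: {fps} FPS = {frame_time_ms(fps):.1f} ms/frame")

# At 90 FPS displayed with a 1:1 interpolation split, only ~45 frames/s
# are fully rendered, which is why the rendering cost stays close to the
# plain-DLSS 50 FPS case even though motion looks nearly twice as smooth.
rendered_per_second = 90 / 2
print(f"rendered frames at 90 FPS displayed: ~{rendered_per_second:.0f}/s")
```

So the jump from 50 to 90 isn't the GPU rendering faster; it's roughly the same rendering work with synthesized frames filling the gaps, which is also why latency doesn't improve the way raw FPS numbers suggest.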
A lot of it has to do with how it's implemented, and in the comments Nvidia themselves said devs will be able to tailor it and choose how and where the filter applies. I believe you that they used two 5090s for this showcase, but I have to assume that's because they were trying to squeeze the most out of it, and they didn't even seem to get the game's devs involved in deciding on the filter. We also won't necessarily have to turn it on at all.
It's not a great showcase video, lol, I agree. But it raises bigger questions, like why we 'need' games to look as photorealistic as possible when they're neither movies nor real life, and when elements in a scene exist to communicate things to the player, because it's a game. Still, something like frame generation has a real, positive impact and works great, so I can't find any problem with having this available in games now.
As for GPUs being out of price range (and out of stock): at the core it comes down to Project Stargate and the half trillion dollars US tech companies are receiving through it. OpenAI bought up 40% of the world's supply of wafers, a precursor component in memory chips (the GPUs, RAM, SSDs, etc. that consumers buy), not because they needed them but because they didn't want the competition to have them. They don't even do anything with raw wafers themselves; what they actually need is the finished memory.
Nvidia participates in this scheme too, of course; they've announced a shift away from consumer GPUs, so it's fair to ask who DLSS 5 is even for when nobody will be able to use it for years to come. But my broader point about the internet response to this video is that it's not some mystical "AI" pulling the strings from behind the scenes.

I have no doubt we both agree that components matter for more than just video games and that it's important for consumer components to exist. But the gamer reaction is predictably "how will this affect my treats", without ever asking how much the hobby already costs in electricity and computing power. Suddenly, when the AI buzzword shows up, they're very concerned about the environment (and about graphics looking bad, as if they'll be forced to turn DLSS 5 on in their game, as if the market hasn't been chasing the dragon of hyperrealistic graphics for decades, and as if consumers at large don't base their purchase decisions on graphics).