this post was submitted on 16 Mar 2026
227 points (94.5% liked)


[Promo image: Jyk0L8eLs7jd7es.png]

I'm completely speechless. This looks so terrible I thought it was a joke, but apparently Nvidia released these demos to impress people. DLSS 5 runs the entire game through an AI filter, making every character look as if they're being run through an ultra-realistic beauty filter.

The photo above is used as the promo image for the official blog post, by the way. It completely ignores artistic intent and makes Grace's face look "sexier", because apparently that's what realism looks like now.

I wouldn't be so baffled if this were some experimental setting they were testing, but they're advertising this as next-gen DLSS. As in, this is their image of what the future of gaming should be. A massive F U to every artist in the industry. Well done, Nvidia.

[–] bridgeenjoyer@sh.itjust.works 66 points 15 hours ago* (last edited 15 hours ago) (4 children)

Looks horrid.

This will be the new Motion Plus shit that ruined all TVs. Now the kids think it IS good.

I can't express how much I hate Motion Plus and the fact that YOU CAN'T TURN IT OFF on a lot of TVs.

Much like severely compressed, limited music. People today hear a dynamic song and don't like it because it's too peaky or hurts their ears. They want a sausage waveform.

I'm not old, and I'm already yelling at clouds, ha. I just can't stand these corpos brainwashing people into thinking their shit is good. It's not.

[–] Tollana1234567@lemmy.today 1 points 6 hours ago

they could never seem to get past the cartoonish look.

[–] tomalley8342@lemmy.world 3 points 8 hours ago

I always wondered why Samsung phones and TVs had that "vivid" high-contrast color tuning turned on by default, the one that just blows out the contrast and saturation. I thought surely no one actually prefers this kind of look. Reading some of the comments on here and on YouTube, now I understand.

[–] Doc_Crankenstein@slrpnk.net 11 points 13 hours ago (1 children)

"A bird who has lived its life in a cage learns to fear the sky"

I hate that this is our reality. People growing up without ever knowing what true freedom feels like. Even we never truly knew it; we just had a bigger cage.

[–] bridgeenjoyer@sh.itjust.works 8 points 13 hours ago

Damn. That is exactly how I've been feeling lately. I think a lot about how sad a childhood would be right now, unless you have a really good parent teaching you that everything today sucks and is corporate trash that needs to be destroyed, like capitalism.

[–] brucethemoose@lemmy.world 0 points 15 hours ago* (last edited 15 hours ago) (3 children)

Hard disagree on motion interpolation. Bad interpolation looks awful, of course, but when it's good, it's like night and day to my eyes, and every TV I've ever used can disable it.

Sometimes you can't disable "jitter reduction" or whatever that's branded as, but that's not the same thing.
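[Editor's aside on why crude interpolation looks bad: the simplest possible interpolator just blends adjacent frames per pixel, with no motion estimation. A minimal sketch in plain Python — the function name is illustrative, and real TV ASICs instead do motion-compensated interpolation, which is far more involved:]

```python
# Naive frame interpolation by linear blending: the midpoint frame is the
# per-pixel average of its two neighbors. Because no motion is estimated,
# a moving object "ghosts" -- it appears twice, half-brightness, at both
# its old and new positions, instead of once at the midpoint.

def blend_midframe(frame_a, frame_b):
    """Return the frame halfway between two frames (lists of 0-255 pixels)."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

# A bright "object" (255) moving two pixels to the right between frames:
f0 = [0, 255, 0, 0, 0]
f1 = [0, 0, 0, 255, 0]

mid = blend_midframe(f0, f1)
# Blending yields two faint copies rather than one moved object:
print(mid)  # [0, 127, 0, 127, 0]
```

[Motion-compensated interpolators estimate per-block motion vectors and shift pixels along them before blending; the quality of that estimation is where the gap between cheap and good ASICs comes from.]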

[–] joelfromaus@aussie.zone 5 points 11 hours ago (1 children)

First off: downvoted for a lukewarm opinion? Come on, Lemmy, be better.

I’ve thought about this subject a lot, and my take is that it boils down to whether someone was raised on movies (specifically 24fps) or video games (specifically 60fps).

For me, movies look like a jittery mess. I have two TVs; the motion smoothing on one is very good, but I’ve never been able to get it just right on the other. They’re the same brand of TV, just a decade apart.

[–] brucethemoose@lemmy.world 3 points 11 hours ago* (last edited 11 hours ago) (1 children)

Yeah, the ASICs in newer TVs are crazy powerful, and crazy good at it. They're nothing like what you'd find in a phone or even a PC, and even a one-generation jump for our Sony TVs was an improvement.

That's what I was trying to emphasize. I think interpolation on old TVs, and maybe early versions of SVP, left a bad taste in people's mouths. Kind of like fake HDR.


...But I also think there's a lot more sentiment against any kind of "processing" since the rise of AI slop.

As an example I often cite, there was this old TV show I helped touch up for a "fan" release, a long time ago. One small component in a very long pipeline was a GAN upscaler... It worked fine. The original TV release was broken as hell, and people loved the improvement.

Fast forward many years later, and I mention this was used in the "remaster" still floating around, and the same subreddit goes ballistic. They literally did not believe me, or cooed about the "flaws" of the original, or called it slop and against the rules and wanted me banned.

And I suspect frame interpolation and resolution scaling in other contexts get tossed in that same bucket. Not that I blame anyone. AI does suck.

[–] joelfromaus@aussie.zone 2 points 10 hours ago (1 children)

Funny enough, it’s actually the older of my two TVs that does it well. I think that marks a noticeable drop in product quality for that particular manufacturer. So it’s still the same idea, that worse hardware gives bad results, but it’s not limited to the age of the TV, just its component quality.

[–] brucethemoose@lemmy.world 2 points 10 hours ago* (last edited 10 hours ago)

Oh yeah, definitely. Product lines enshittify.

I just mean that, generally, if you compare a 2014 TV and a 2025 one, the experience of the old one is likely not representative of the new.

[–] gravitas_deficiency@sh.itjust.works 7 points 14 hours ago* (last edited 14 hours ago)

It’s great for sports. And some sitcoms. And maybe news (but why are you even watching cable news these days). That’s it.

Persistence of vision serves a real purpose in filmography. “Optimizing” it away is very literally a corruption of the art and a betrayal of the director and cameraman’s skills and intent.

I’ll stick with my vintage 2010 Philips 55” plasma, thank you very much.

[–] bridgeenjoyer@sh.itjust.works 3 points 14 hours ago (1 children)

Yeah, sorry, I'm not into high-def TV myself. It looks awful unless all you watch is sports and brand-new Marvel movies (hard no).

You may think you're disabling it, until you compare it with another TV that actually does zero processing. Night and day.

Same effect as me thinking "huh, I guess the lag on my flat screen isn't too bad for gaming", then plugging into my CRT and, holy snap, the clarity and precision response. (Clarifying: this is with old and new consoles. Obviously anything with an analog output into a new TV is horrible without an upscaler, but even with a RetroTINK 2X upscaler, it still sucks. You need to spend over $700 to make it look decent enough.)

People don't know what they took from us.

[–] brucethemoose@lemmy.world 2 points 14 hours ago* (last edited 14 hours ago) (1 children)

I have. I A/B test it all the time. I pause and pixel peep.

And I don't watch any sports, nor any marvel movies.

"huh, I guess the lag on my flat screen isn’t too bad for gaming"

I've had CRTs. And I have one of those "zero latency" overclocked LCD monitors with no internal scaler. As much as I like them, they feel sluggish compared to something newer.

Yeah sorry I’m not into high def TV myself.

In that case, I suspect you haven't tried it on more modern displays, or when it's baked into transcoded footage with one of the better filters.

Yes, it looks awful and artifacty processed by older LCDs. But it looks really good these days.

[–] bridgeenjoyer@sh.itjust.works 3 points 13 hours ago* (last edited 13 hours ago) (1 children)

Yeah, I'm not one to pay a lot for TVs. I'd like an OLED, but with the prices, I really have no need for one for gaming, and the TV I have is fine for normal watching.

Also, isn't it crazy how it's taken this long for a display to be as good as a CRT (blacks- and response-time-wise)? Kind of the same thing with audio: how bad digital sounded originally, and how we're only now fixing that with great DACs. Humans got it right the first time with tube amps and CRTs! Not to mention they're repairable.

[–] brucethemoose@lemmy.world 3 points 11 hours ago

I’d like an oled, but with the prices, I really have no need for it for gaming and the TV I have is fine for normal watching.

That is entirely fair. Electronics are all crazy expensive, really.

Yeah, LCDs went from bad to “mixed” and stayed that way for a long time. Granted, some things like absolute sharpness are not great on a CRT, but still.