submitted 6 months ago by alessandro@lemmy.ca to c/pcgaming@lemmy.ca
[-] dangblingus@lemmy.dbzer0.com 60 points 6 months ago

Okay, so all they're saying is that they won't certify any monitor under 144Hz for FreeSync. Okay? That basically changes nothing. If you are using a 60Hz monitor, basic vsync is all you need.

[-] Perfide@reddthat.com 21 points 6 months ago

Basic vsync worsens response time, often by a lot. I'd take screen tearing over vsync.
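A rough back-of-envelope sketch of the latency point (the numbers are illustrative, not measurements from the thread): with vsync on, a finished frame can sit in the back buffer for up to a full refresh interval before scanout, and longer if triple buffering queues an extra frame.

```python
# Worst-case added display latency from vsync: a completed frame may
# wait up to one full refresh interval before scanout (more if triple
# buffering queues an additional frame ahead of it).
def vsync_worst_case_ms(refresh_hz: float, queued_frames: int = 1) -> float:
    frame_interval_ms = 1000.0 / refresh_hz
    return queued_frames * frame_interval_ms

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz: up to {vsync_worst_case_ms(hz):.1f} ms added latency")
```

At 60Hz that's up to ~16.7ms of extra input-to-photon delay per queued frame, which is why the tradeoff against tearing feels so different at high refresh rates.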

[-] Sibbo@sopuli.xyz 10 points 6 months ago* (last edited 6 months ago)

True. For FPS I often do that too. For top-down view games it's often nicer with vsync. Even triple buffered.

[-] snugglesthefalse@sh.itjust.works 5 points 6 months ago

I'd take response time over screen tearing.

[-] And009@lemmynsfw.com 1 points 6 months ago* (last edited 6 months ago)

That's news to me. I haven't had any gaming machine for some time now; my last PC was an AMD FX6300 and Radeon 285 slotted into an ASUS board. I didn't have the budget to buy any monitor beyond 1080p 60Hz. Ah, lots of Skyrim, GTA V and Witcher 3 memories..

I really miss gaming on MacOS, the Switch is fun but nowhere close. A Steam Deck might scratch the itch?

[-] Strykker@programming.dev 1 points 6 months ago

The biggest reason for all of the FreeSync/G-Sync stuff is that vsync sucks and makes latency worse.

[-] Sibbo@sopuli.xyz 14 points 6 months ago

Probably their FreeSync is mostly ineffective at 60Hz but better at higher refresh rates, and the marketing department just made a good spin out of that.

[-] Still@programming.dev 12 points 6 months ago

I have a monitor that locks to 33-92Hz when FreeSync is enabled (144Hz otherwise). It's way more useful at lower fps values than at higher ones.

[-] Sibbo@sopuli.xyz 4 points 6 months ago

Interesting. Yeah my comment was just a shitpost. But what you are saying makes it seem completely nonsensical what AMD is doing.

But then maybe they just want to push faster video cards, and freesync is something that people care about. Then they can still sell better cards to those people that don't know that freesync is less useful at higher refresh rates.

[-] Kyrgizion@lemmy.world 31 points 6 months ago

4K 60Hz until I die. Or until I can afford 4K 144Hz. Which'll probably be around the same date, give or take.

[-] CanadianCorhen@lemmy.ca 39 points 6 months ago

I'm very much a 1440p 144hz guy.

I'd like 4k, but would take this compromise for now.

I want my next screen to be 4k, 144hz oled

[-] domi@lemmy.secnd.me 5 points 6 months ago

1440p high refresh rate gamers unite. I have an Alienware AW3423DWF and boy are those new OLED panels beautiful. Expensive but beautiful. I still remember playing Left 4 Dead right after I got it and even without HDR I was baffled by the credits at the end of the match. Just white text floating in nothingness.

They also recently released the AW3225QF which is 4k@240.

[-] Kolanaki@yiffit.net 24 points 6 months ago

Buck up, chum. Maybe the world will go to shit and you can experience 144hz @ 4K after the looting starts.

[-] noobnarski@feddit.de 5 points 6 months ago

I just bought a 4k 144hz screen, and let me say, it is worth the price.

[-] cyberpunk007@lemmy.ca 1 points 6 months ago

I have a 2.5K ultrawide 144Hz. Even when this PC was new it struggled on that era of games :(. We need better graphics cards that don't cost the price of a mortgage.

[-] noobnarski@feddit.de 1 points 6 months ago

Yeah, I invested in a 4070, I wish I could get a 4090 for that price.

[-] mox 18 points 6 months ago* (last edited 6 months ago)

FreeSync is for variable refresh rates, which 60Hz monitors generally don't support anyway. So this headline is nothing but clickbait.

Also, I don't know of any sub-120Hz VRR monitors that are still being made, but if they exist, they're not aimed at anyone who cares about FreeSync branding.

So this whole article is a pointless waste of time.

[-] notfromhere@lemmy.ml 3 points 6 months ago

I recently got a 75hz 1080p monitor with FreeSync branding from Costco, so yea they are still made.

[-] Fiivemacs@lemmy.ca 15 points 6 months ago

Provided the 60Hz monitors still work, who cares.. if they do some arbitrary bullshit to prevent stuff from working just for profit, then get fucked. I personally don't care about their certification or claims.

[-] SchmidtGenetics@lemmy.world 15 points 6 months ago

Freesync is open source, so wouldn’t be profit motivated.

[-] ArbitraryValue@sh.itjust.works 4 points 6 months ago

Maybe I'm weird because anything over 20 FPS looks smooth to me (and I know it doesn't to other people) but what's the point of going over 60 FPS? Can anyone actually see the difference or is this just a matter of "bigger numbers must be better"?

[-] CaptainEffort@sh.itjust.works 13 points 6 months ago

There's a huge difference once you use it for long enough. I have a 144Hz monitor and love getting to play games that high; they're so smooth! If you play long enough the difference becomes night and day.

[-] glimse@lemmy.world 12 points 6 months ago

Lucky you. Seriously. I wish I didn't care because it means displays are more expensive for me.

I definitely thought it was all hype, but once I saw games at 120+ fps, even 60fps looks choppy to me. I also very much notice the difference between 30fps and 60fps video, but 120fps (at full speed) didn't do much for me.

For what it's worth, I was a professional video editor for years so I'm a bit more inclined to notice than the average person

[-] Formes@lemmy.ca 1 points 6 months ago

I'm kind of in that boat - digital art and so on. I never understood buying a computer monitor over about 22" that was 1080p. I want decent colour reproduction - I get it, it won't be perfect unless you spend a fortune, but it should be at least decent.

120hz w/ good HDR support is fantastic for content that supports it, and 240hz is just buttery smooth. Variable refresh is pretty much a must for modern gaming.

[-] TheOneCurly@lemm.ee 11 points 6 months ago

There are diminishing returns but I can absolutely tell the difference between my 165Hz display and my wife's 240Hz.

[-] RaoulDook@lemmy.world 9 points 6 months ago

It's very easy to tell the difference when you see them in person. I have a 60Hz monitor and a 144Hz monitor on the same PC, and you can drag a window across the desktop from one to the other; the lack of animation frames going from 144 to 60 makes the movement look choppy on the slower one. In games, the animation becomes smooth to the point of being lifelike and visually vibrant when your framerate is able to go up to 90 to 100 or more FPS.
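The window-drag effect is just frame-interval arithmetic; a quick illustrative sketch (the 500ms drag duration is a made-up example):

```python
# A short window drag delivers far fewer animation frames at 60 Hz
# than at 144 Hz, which is why the same motion looks choppier on
# the slower display.
def frames_during(duration_ms: float, refresh_hz: float) -> int:
    return int(duration_ms * refresh_hz / 1000.0)

for hz in (60, 144):
    print(f"{hz} Hz: {1000 / hz:.1f} ms per frame, "
          f"{frames_during(500, hz)} frames in a 500 ms drag")
```

Less than half the frames for the same motion means the eye sees the window "jump" further between positions on the 60Hz panel.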

[-] Fermion@mander.xyz 8 points 6 months ago* (last edited 6 months ago)

It really depends on what and how you play. If reaction time is important then you'll feel more than see the difference in refresh rates. If none of your games require sub second reaction time accuracy, then it's much more of a nice to have luxury than a game changer.

Also, frametime pacing matters a lot. If your system very consistently puts out 30 fps, you'll have more accurate keypresses than if you normally get 50 and it gets hung on a few frames and it dips to 30fps. Your nervous system adapts pretty well to consistent delay, but it's much more difficult to compensate for delay that varies a lot.

I don't really play first person shooters so resolution matters more to me than framerate.
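The pacing point above can be made concrete with a toy comparison (the frametime traces are invented for illustration, not measured data): two sequences with broadly similar average fps but very different consistency.

```python
import statistics

# Two toy frametime traces in milliseconds: one locked near 33 ms
# (~30 fps), one that normally runs ~20 ms but hitches badly. The
# hitchy trace has a much larger spread, which is what you feel as
# stutter even though its average fps is no worse.
steady = [33.3] * 12
hitchy = [20.0] * 10 + [80.0, 53.3]

for name, trace in (("steady", steady), ("hitchy", hitchy)):
    avg_fps = 1000.0 / statistics.mean(trace)
    print(f"{name}: {avg_fps:.0f} avg fps, "
          f"stdev {statistics.stdev(trace):.1f} ms, "
          f"worst frame {max(trace):.0f} ms")
```

The steady trace has essentially zero spread, so your timing adapts to it; the hitchy one averages out fine on a benchmark graph but throws off anything reaction-dependent.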

[-] TheSambassador@lemmy.world 7 points 6 months ago

For 3d games where the whole screen is moving and changing as the camera moves, I've noticed a big difference between 60 and 144. It just makes the game feel absurdly smooth.

For smaller games with more static views it doesn't really make much difference.

It mostly depends on the speed of the game.

[-] ramjambamalam@lemmy.ca 6 points 6 months ago

Here's a site which nicely demonstrates the effect: https://www.testufo.com/

[-] cevn@lemmy.world 2 points 6 months ago

Occasionally my fps gets set to 60, and as soon as I start playing Rocket League I can tell it is off. I went to a friend's house and asked why everything was so choppy, checked his monitor settings, and it was set to 60 instead of 144. There are people that can see the difference.

[-] Incandemon@lemmy.ca 2 points 6 months ago

I don't play competitive games, so I don't need the extra shooting accuracy. What I have found is that the higher refresh rate has made panning maps in RTS games or looking around quickly in FPS games much smoother. It's an overall nicer experience, but not really any better for gaming than 60Hz.

[-] someguy3@lemmy.ca 3 points 6 months ago* (last edited 6 months ago)

What does "certification" mean? It won't work?

[-] gravitas_deficiency@sh.itjust.works 5 points 6 months ago* (last edited 6 months ago)

It’s basically ~~ATI~~ AMD (lol) saying “this product meets or exceeds the required hardware standards to be granted this label”.

[-] uninvitedguest@lemmy.ca 14 points 6 months ago

ATI is a name I haven't seen in awhile 🙂

[-] BradleyUffner@lemmy.world 4 points 6 months ago* (last edited 6 months ago)

It means that they are allowed to put another sticker on the monitor.

[-] CountVon@sh.itjust.works 2 points 6 months ago

To be "FreeSync certified", a monitor has to have certain minimum specs and must pass some tests regarding its ability to handle Variable Refresh Rate (VRR). In exchange for meeting the minimum spec and passing the tests, the monitor manufacturer gets to put the FreeSync logo on the box and include FreeSync support in its marketing. If a consumer buys an AMD graphics card and a FreeSync certified monitor then FreeSync (AMD's implementation of VRR) should work out of the box. The monitor might also be certified by Nvidia as GSync compatible, in which case another customer with an Nvidia graphics card should have the same experience with Gsync.

[-] stanka@lemmy.ml 1 points 6 months ago

What does this mean for standard TVs that people use for gaming? LG/Sony/Samsung OLEDs tend to be able to do 4K@120, having native 120Hz panels. Maybe this only covers "monitors" getting FreeSync certified.

[-] Dudewitbow@lemmy.zip 2 points 6 months ago

Those handle VRR with the HDMI 2.1 hardware spec, which is a little bit different from the traditional method of VRR.

It's the main reason current gen consoles have VRR (through the HDMI 2.1 spec).

[-] stanka@lemmy.ml 1 points 6 months ago

Rtings says that the LG TVs (B2 at least) support VRR via several standards: HDMI 2.1, FreeSync, and G-Sync. I have a console hooked up, but no GPU good enough in a PC.

[-] Dudewitbow@lemmy.zip 1 points 6 months ago* (last edited 6 months ago)

It's FreeSync/G-Sync over the HDMI 2.1 standard. Nvidia does not have G-Sync over HDMI in the standard HDMI connection; no non-2.1 HDMI monitor/TV will accept VRR over HDMI on Nvidia. Only AMD had FreeSync over HDMI (on very low end budget monitors).

G-Sync Compatible is basically G-Sync over the DisplayPort standard. G-Sync Ultimate runs on the dedicated FPGA module, which uses DisplayPort as its medium.

[-] stanka@lemmy.ml 1 points 6 months ago

I would love to learn more about this. Know of any technical papers or references?

[-] Dudewitbow@lemmy.zip 2 points 6 months ago

Idk about technical documents per se, but here's a news article from when AMD introduced VRR over HDMI way back, noting how VRR on HDMI wasn't a thing yet, so AMD partnered with monitor makers to use a different scaler that would make it compatible with FreeSync.

VRR over DisplayPort is in the DisplayPort 1.2a specification sheet. VRR over HDMI (officially) is under the HDMI 2.1b sheet.

this post was submitted on 07 Mar 2024
109 points (93.6% liked)