That's okay. Probably going to take 10 years before I'll even buy my first 4K display.
I bought a bunch of sub-$400 (Australian dollaroos at that) 4k monitors (Samsung, ViewSonic, etc.). They're not the greatest monitors, but this was like 5 years ago. Some were on special.
You can play a lot of older games in 4k and it makes a big difference.
4k gaming is more than accessible and, in my view, far better than high refresh rate gaming at 1080p or 1440p.
I think it's just "to each their own".
My wife and I have a 4K TV and an Apple TV. Initially, the Apple TV was automatically set to 4K, but we were getting glitches in the picture every now and then. After a bit of troubleshooting, I wondered if it might be either the cable or the TV struggling, given that the cable is old and the TV, which has frame generation, is also old and thus underpowered.
Lo and behold, manually setting the Apple TV down to 1080p solved the issue.
I told my wife and we tested back and forth between 4K and 1080p, and frame gen on vs off for each. Neither of us could tell the difference between 4K and 1080p while sitting on the couch (though we could if we went up close to the TV), but both of us immediately noticed and preferred frame gen on. And yes, we've had our eyes tested and have at least decent vision.
For me, if the downsides to 4K were much lower, then of course I'd turn it on and never look back. But we don't notice it on the TV, and while I'd probably notice it on my PC monitor, upgrading it and my gaming rigs would cost many thousands (also dollarydoos for me) for what would be, for me, a pretty mild improvement.
We've reached the point where FPS is far more impactful to the feel than pixel count, imo. The visual difference between 4k and 8k isn't as big as the drop in performance.
We're past that point as well. 4k @ 240 Hz is so good that most people won't be able to tell the difference from an 8k, 480 Hz monitor, even if they pay special attention to it. Probably not even in A/B testing.
There is still room for improvement in the area of HDR, but monitors are almost as good as they will ever get.
That's what everyone always thinks
It's because we're at the limits of the human visual system. The difference in pixel pitch between 4k and 8k at the distances we watch TV is literally imperceptible.
It also doesn't help that there's not much content authored and distributed for higher resolutions. It's vastly more expensive to produce, store, and deliver.
Home Internet connections on average aren't any better than they were ten years ago, either, at least not in the US. I doubt a lot of them can even support 8k streaming, let alone with anyone else using it at the same time.
Most of them can't really even support 4k streaming at a bitrate that is significantly better than 1080p.
They've been saying that since 30 Hz, and they still say it at 240 Hz.
Yeah but we're talking diminishing returns here. Doubling the resolution to 8k makes about as much sense as doubling refresh rates to 480hz. At that point it's going to be mostly dependent on the individual, and likely heavily subject to the placebo effect.
By my math, a 55" 8k screen has pixels that are about 0.0062" (6.2 thou) wide.
At ten feet, that subtends an angle of about 0.003 degrees, or roughly 0.18 arcminutes.
There's obviously a lot of variation and it depends on exactly what you're measuring, but normal human visual acuity struggles to distinguish details less than about 5 arcminutes, maybe 1-2 arcminutes depending on the test, so an individual 8k pixel at that distance is well below the threshold.
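If you want to check that arithmetic yourself, here's a quick back-of-the-envelope sketch (the 55", 16:9, and 10-foot figures are just the assumptions from the comment above):

```python
import math

def pixel_subtense_arcmin(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    """Angular size of one pixel, in arcminutes, for a given screen and viewing distance."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from the diagonal
    pixel_pitch_in = width_in / horizontal_px       # physical width of one pixel
    angle_rad = math.atan2(pixel_pitch_in, distance_in)
    return math.degrees(angle_rad) * 60             # degrees -> arcminutes

# 55" 8K panel viewed from 10 feet (120 inches)
print(round(pixel_subtense_arcmin(55, 7680, 120), 2))   # ~0.18 arcmin
# Same screen at 4K, for comparison
print(round(pixel_subtense_arcmin(55, 3840, 120), 2))   # ~0.36 arcmin
```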
To add... It would only matter in large format displays anyway. Pixel density is only going to matter so much.
I remember when Sharp put out their Aquos 70" FHD TV and I thought, "eww, so grainy"! But now I've got an 85" UHD with about the same pixel density as a ~42" FHD (quick check below), which helps with clarity since my viewing distance hasn't really changed (~10ft).
FPS is great and all, but not when most content is 24-60fps. 120 Hz is an awesome sweet spot for 24fps content since it's an even 5 refresh cycles per frame.
IMO UHD still has room for growth and adoption before another tech hits. Not to mention the financial strain everyone's under due to the fucking billionaire squeeze... And they wonder why people are tight on money?! Fucking idiots!
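Both of those numbers check out if you run the arithmetic; here's a quick sanity-check sketch using the sizes and rates mentioned above (nothing beyond basic geometry and division):

```python
import math

def ppi(diagonal_in, px_w, px_h):
    """Pixels per inch, measured along the diagonal."""
    return math.hypot(px_w, px_h) / diagonal_in

# 85" UHD vs ~42" FHD: nearly identical pixel density
print(round(ppi(85, 3840, 2160), 1))   # ~51.8 PPI
print(round(ppi(42, 1920, 1080), 1))   # ~52.4 PPI

# 120 Hz panel showing common frame rates: whole-number refreshes per frame, so no pulldown judder
for fps in (24, 30, 60):
    print(fps, "fps ->", 120 / fps, "refreshes per frame")
```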
I doubt the streaming model is going to support 8k content anytime soon. Actual 4k is already more data than anyone wants to be pushing around every time they watch something, to the point that what most people actually watch as "4k" in streaming is at bitrates that make it almost indistinguishable from 1080p.
Well, yes and no. H.265 (HEVC) made UHD streaming far more workable, to an extent: roughly half the bandwidth of H.264 for the same quality, but 4x the pixels, so you only go up about 2x in bandwidth.
Now we have AV1 and H.266 (VVC), which need adoption first before we can really push 8K content. Again, not 100% accurate, but it works out to roughly 3-5x the bandwidth of 1080p H.264 for ~16x the pixels.
We've come a long way!
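Here's the rough scaling that comment is describing, as a toy calculation. The efficiency factors are just the ballpark figures quoted in this thread (and the assumption that bitrate scales linearly with pixel count is itself a simplification), so treat the outputs as illustrative only:

```python
# Relative bitrate for "similar perceived quality", using the ballpark codec-efficiency
# figures quoted in this thread (~2x for HEVC, ~3-4x for AV1/VVC, both vs H.264).
BASELINE_PIXELS = 1920 * 1080   # 1080p H.264 as the reference point

def relative_bitrate(pixels, codec_efficiency_vs_h264):
    pixel_ratio = pixels / BASELINE_PIXELS
    return pixel_ratio / codec_efficiency_vs_h264

print(round(relative_bitrate(3840 * 2160, 2.0), 1))   # UHD with HEVC    -> ~2x 1080p bandwidth
print(round(relative_bitrate(7680 * 4320, 3.5), 1))   # 8K with AV1/VVC  -> ~4.6x 1080p bandwidth
```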
The screen size needed for 8K to make a difference doesn't fit in a typical living room.
That's kind of what I'm getting at. Once you hit a certain size, it only makes sense to have a certain resolution. I know jumping from 65" to 85" made all my Plex content "blurry" because it wasn't good enough quality/bitrate. Reripping my BDs and 4K BDs with h.265 at 12-15GB/hr per UHD file was way better (quick Mbps conversion below)!
Idk what 8K looks like, but for those new 98"+ displays, I wouldn't go any bigger unless it's 8K. I'd say FHD maxes out around 42-50" and UHD around 85". You can't really sit much further back in a living room, so at that distance that's how I'd want it. Plus, it'd take a faster refresh rate to not look so bad with motion across that much surface area.
I'm just excited for PeLED or PeNC (Perovskite LED / Nano Crystal). 😎🤯 sorry, off topic...
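For context on the rip sizes mentioned above, the GB-per-hour figures translate to average bitrates like this (a straight unit conversion, nothing more):

```python
def gb_per_hour_to_mbps(gb_per_hour):
    """Average bitrate in Mbps for a video file of a given size per hour (GB = 10^9 bytes)."""
    bits = gb_per_hour * 1e9 * 8
    return bits / 3600 / 1e6

print(round(gb_per_hour_to_mbps(12), 1))   # ~26.7 Mbps
print(round(gb_per_hour_to_mbps(15), 1))   # ~33.3 Mbps
```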
In regular use I prefer 4k, but with games I legit can't tell the difference between 4k and 1440p. I can tell the difference between 60 and 120; 180, not really.
The main difference to me is that I can completely disable anti-aliasing at higher resolutions; jagged pixels just aren't as much of an issue when there are so many more of them.
The future is in round pixels.
Or just one pixel that moves around really quickly.
Maybe if we had some kind of electron beam we could use to draw on a screen, and the brain just does its 'persistence of vision' shenanigans?
Yeah, maybe guide that beam with electromagnetic fields. Maybe we can dye phosphors on the front of wherever it's projecting onto so that we can make it a color display.
Yeah and what's up with this flat/concave boring shape we give to TVs. We should make them convex and give them a hump so I can move them easily.
Also who decided to make TVs lightweight? Real TVs have curves.
my 40 inch curved monitor is like a foot from my head too
I'm surprised this didn't come up within a month of a certain sporting event.
Still on a 1080p plasma TV. Those old charts showing whether you can tell the difference in quality suggest that at my distance I'd be fine with 720p. Do people sit inches from their 80 inch TVs?
I can 100% tell the difference between 1080p and 720p. I can tell between 1080p and UHD as well, but I honestly think that has more to do with the size of the compression artifacts relative to the image than with the resolution itself.
I mean, if you have compression artifacts, wouldn't that mean the codec/delivery of the content is the issue and not the resolution?
I'm pretty sure that most 4k content isn't actually 4k (especially when streaming). I'd link a source talking about it, but they're all ad garbage.
Most of the content I view is downloaded and encoded from Blu-ray.
PC gaming should head towards 21:9 for ubiquitous support in games. 1680x720, 1920x800, 2560x1080, 3440x1440, ...
Also OLED or higher-density dimming zones, and full coverage of DCI-P3. Then color reproduction and brightness highlights will also be hitting a point of diminishing returns. After that it'll be on to VR/head-mounted displays, where density and brightness/contrast improvements will actually show.
I early-adopted 3840x2160 way back, then in 2024 went with a no-name $200 3440x1440 monitor, and that was a way better upgrade than 1080p to 2160p was. I'd take 2560x1080 over 3840x2160. 8k has no relevance until it's the best value at up to $1000 for a 65" TV.
Check out the next evolution, PeLED or PeNC. Perovskite LED or Perovskite Nano Crystal.
Should outlast LED 3x+, with better brightness, better contrast, way better refresh rates with less ghosting, smaller pixels/higher density, etc.
Honestly, 4K is more than enough for me.
Most people aren't even at 4k and can't see the difference past that.
8k is just there to sell shit.
I would like to see 8K for movies in cinema, especially for remasters/digitizations from existing film negatives and archival purposes.
While I wouldn't say no to it, 8K on a TV under 75" is not going to add much value to the average consumer.
4320p (8k) video makes as much sense as 96kHz audio. Both have a legitimate role in capture and creation, but a vanishingly minuscule role at the point of playback and consumption.
5k and 6k is where it’s at. Get hip.
I didn't even say hello...
My eyesight isn't perfect, but I honestly struggle to tell the difference between a 1440p monitor and a 4K television, so I'd much rather we stop the resolution nonsense and get back to tech that's at least mildly more interesting, like Q-dot or 3D.