AMD processors have literally always been a better value and have rarely been surpassed by much for long. The only problem they ever had was that, back in the day, they overheated easily. But I will never, ever buy an Intel processor on purpose, especially after this.
That's not true. It was just last year that some of the Ryzen 7000 models were burning themselves out from the insides at default settings (within AMD specs) due to excessive SoC voltage. They fixed it through new specs and working with board manufacturers to issue new BIOS, and I think they eventually gave in to pressure to cover the damaged units. I guess we'll see if Intel ends up doing the same.
I generally agree with your sentiment, though. :)
I just wish both brands would chill. Pushing the hardware so hard for such slim gains is wasting power and costing customers.
I think he was referring to "back-in-the-day" when Athlons, unlike the competing Pentium 3 and 4 CPUs of the day, didn't have any thermal protections and would literally go up in smoke if you ran them without cooling.
https://www.youtube.com/watch?v=yRn8ri9tKf8
Some motherboards did have overheating protection back then, though. Personally, I had my Athlon XP computer randomly shut down several times because the system had an issue where the fans would randomly slow down and eventually stop completely. That triggered the motherboard's overheat protection, which simply cut the power as soon as the temperature got too high.
When I started using computers, I wasn't aware of any thermal protections in popular CPUs. Do you happen to know when they first appeared in Intel chips?
The Pentium II and III had rudimentary protection: they would simply shut down if they got too hot. The Pentium 4 was the first one that would throttle down its clock speed instead.
Anything before that didn't have any protection, as far as I'm aware.
Yeah. I just meant that AMD CPUs used to overheat easily if your cooling system had an issue. My Ryzen 7 3700X has been freaking awesome, though. It feels more solid than any PC I've built, and it's fast AF. I think I saved over $150 compared to a similarly rated Intel CPU, and the motherboards generally seem cheaper for AMD too. I would feel ripped off with Intel even without the crashing issues.
The problem is that it's getting extremely hard to get more single-threaded performance out of a chip, and this is one of the few ways left to do it. And a lot of software is not going to be rewritten to use multiple cores; in some cases, it's fundamentally impossible to parallelize a particular algorithm.
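To illustrate what I mean (my own sketch, not anything from the article): a loop whose next step depends on the previous result can't be split across cores, while a loop of independent operations can.

```c
/* Rough sketch of why some work can't be parallelized.
 * Names and numbers here are just for illustration. */
#include <stdio.h>

#define N 1000000

int main(void) {
    /* Inherently sequential: each iteration needs the result of the
     * previous one (a loop-carried dependency), so adding cores
     * doesn't help without changing the algorithm itself. */
    double x = 0.5;
    for (int i = 0; i < N; i++) {
        x = 3.9 * x * (1.0 - x);   /* logistic map: x depends on the previous x */
    }

    /* Trivially parallel: every element is independent, so this loop
     * could be split across cores (threads, OpenMP, etc.). */
    static double a[N];
    double sum = 0.0;
    for (int i = 0; i < N; i++) {
        a[i] = (double)i * 0.001;
        sum += a[i];               /* a reduction, which also parallelizes well */
    }

    printf("x = %f, sum = %f\n", x, sum);
    return 0;
}
```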
That was Asus applying too much voltage to the X3D SKUs.
Where do you think Asus got the specs for that voltage?
Then why were there essentially no blow-ups from other motherboard manufacturers? Tell me if my information on this is wrong, but when only one brand is causing issues, they're the ones to blame for it.
There were, including MSI, who also released corrected BIOS versions.
(But even if that were not the case, it could be explained by Asus being the only board maker to use the high end of a voltage range allowed by AMD, or by Asus having a significantly larger share of users who are vocal about such problems.)
Not from AMD. From the autogenerated transcript (with minor edits where it messed up the names of things):
This was pretty much all on motherboard manufacturers, and ASUS was particularly bad (out-scumbagging MSI; good job, guys).
At the start of this Intel mess, it was thought they had a similar issue on their hands and the motherboard manufacturers just needed to get in line, but it ended up going a lot deeper.
That doesn't contradict anything I wrote. Note that it says AMD's recommended cutoff is now 1.3 volts, implying that it wasn't before this mess began. Note also that the problem was worse on Asus boards because their components' tolerance was a bit too loose for a target voltage this high, not because they used a voltage target beyond AMD's specified cutoff. If the cutoff hadn't been pushed so high for this generation in the first place, that same tolerance probably would have been okay.
In any case, there's no sense in bickering about it. Asus was not without blame (I was upset with them myself) but also not the only affected brand, so it's not possible that they were the cause of the underlying problem, now is it?
AMD and Intel have been pushing their CPUs to very high voltages and temperatures for small performance gains recently. 95°C as the new "normal" was unheard of just a few years ago. It's no surprise that it led to damage in some cases, especially for early adopters. It's a thin line to walk.
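For anyone who wants to keep an eye on those temperatures themselves, here's a minimal sketch (my own, assuming a Linux box with the standard hwmon sysfs interface; which sensor is actually the CPU depends on your board and driver, e.g. k10temp on AMD or coretemp on Intel):

```c
/* Sketch: print temperature readings from the Linux hwmon sysfs interface.
 * Assumes /sys/class/hwmon/hwmonN/temp1_input exists; values are in
 * millidegrees Celsius. Which device corresponds to the CPU varies. */
#include <stdio.h>

int main(void) {
    char path[128];
    for (int i = 0; i < 16; i++) {
        snprintf(path, sizeof path, "/sys/class/hwmon/hwmon%d/temp1_input", i);
        FILE *f = fopen(path, "r");
        if (!f)
            continue;   /* no such hwmon device, skip it */
        long millideg;
        if (fscanf(f, "%ld", &millideg) == 1)
            printf("hwmon%d: %.1f °C\n", i, millideg / 1000.0);
        fclose(f);
    }
    return 0;
}
```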
I've been on team AMD for over 20 years now, but that's not true. The Core Duo and the first couple of Core i generations were better than what AMD was offering, and they stayed ahead for about a decade. The Athlons were much better than the Pentium III and Pentium 4, and the current Ryzens are better than the current Core i series, but the Phenoms weren't. Don't get me wrong, I like my Phenom II X4, but it objectively wasn't as good as Intel's offerings back in the day.
My i5-4690 and i7-4770 machines remain competitive to this day, even with the Spectre patches in place. I saw no reason to 'upgrade' to 6th/7th/8th-gen CPUs.
I'm looking for a new desktop now, but for the costs involved I might just end up parting together an HP Z6 G4 with surplus server CPU/RAM. The cost of going to 11th-gen-or-newer desktop Intel doesn't seem worth it.
I'm going to look at the more recent AMD offerings, but I'm not sure they'll compete with surplus server kit.
I'd say that regardless of brand, x86 CPUs don't need to be upgraded as often as they used to: no awesome new extensions like SSE, not much more raw performance, and power consumption isn't going down significantly. If you don't care about power consumption, the surplus server CPUs will be more interesting; there's no doubt about that.
My issue with surplus server kit at home is that it tends to idle at very high power usage compared to desktop kit. For home use that won't be pushing high CPU utilization, the savings in cost off eBay aren't worth much.
This is also why you're seeing AM5 on server motherboards. If you don't need tons of PCIe lanes (and especially with PCIe 5, you probably don't), the higher-core-count AM5 chips do really well for servers.
They're still useful, but they're not competitive in overall performance with recent CPUs in the same category. They do still compete with some of the budget and power-efficient CPUs, but they use more power and get hotter.
That said, those 4th gen Intel CPUs are indeed good enough for most everyday computing tasks. They won't run Windows 11 because MS locks them out, but they will feel adequately fast unless you're doing pretty demanding stuff.
I still have an i5-2400, an i7-4770K and an i7-6700 for occasional or server use, and my i7-8550U laptop runs great with Linux (though it overheated with Windows).
I buy AMD now though.
Very easily.
In college (early aughts), I worked as tech support for fellow students. Several times I had to take the case cover off, point a desktop fan into the case, and tell the kid he needed to get thermal paste and a better cooler (services we didn't offer).
Also, as others have said, AMD CPUs have not always been superior to Intel in performance or even value (though AMDs have almost always been cheaper). It's been a back-and-forth race for much of their history.
Yeah. I never said they were always better in performance. But I've never had an issue other than the heat problem, which all but one time was fully my fault. And I don't need a processor to perform 3% better on random tasks... which is the kind of benchmark result I typically found when comparing similar AMD/Intel processors (and in some categories AMD did win). I saved probably a couple grand by avoiding Intel. And as another user said, I prefer to support the underdog: the company making a great product for a lot less money. Again I say: fuck Intel.