this post was submitted on 08 Apr 2025
53 points (96.5% liked)


We look at how NVIDIA has downsized essentially all of its gaming GPUs in terms of relative configuration compared to each generation’s flagship

  • This article expands upon our "RTX 4080 problem" by looking at the entirety of the RTX 50 series, including how the RTX 5070 looks an awful lot like a prior 50-class or 60-class GPU.
  • NVIDIA is giving you fewer CUDA cores for a given class of GPU than ever before.
  • GPU prices have crept higher across the board, but NVIDIA's, in particular, have fallen out of step with what we came to expect from generations of GPU launches.
[–] grue@lemmy.world 3 points 2 weeks ago (last edited 2 weeks ago)

On one hand, I agree with GamersNexus' Steve that 'line go down = bad,' but on the other hand, you could see it as the flagship becoming more and more of an outlier. (In other words, if the graph were normalized such that the 80-series line were flat, then I think the lower-model lines would also be flat, but the 90-series line would be going up.)
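To make the two framings concrete, here is a minimal sketch using approximate published CUDA core counts (illustrative figures only, not the article's actual dataset): normalizing to the flagship gives the "line go down" view, while normalizing to the 80-class card makes the flagship the outlier.

```python
# Approximate published CUDA core counts, used purely for illustration.
cores = {
    "30-series": {"3070": 5888, "3080": 8704, "3090": 10496},
    "40-series": {"4070": 5888, "4080": 9728, "4090": 16384},
    "50-series": {"5070": 6144, "5080": 10752, "5090": 21760},
}

for gen, cards in cores.items():
    flagship = max(cards.values())
    # Pick the 80-class card as the alternative baseline.
    eighty = next(n for name, n in cards.items() if name.endswith("80"))
    print(gen)
    for name, n in sorted(cards.items()):
        # Share of the flagship (the "line going down" view) vs.
        # share of the 80-class card (the "flagship is the outlier" view).
        print(f"  {name}: {n / flagship:.0%} of flagship, {n / eighty:.0%} of the 80-class")
```

With these illustrative figures, the 70-class falls from about 56% to about 28% of the flagship across the three generations, while relative to the 80-class card the 90-class climbs from roughly 121% to roughly 202%.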

If Nvidia "fixed" it by just not offering the 5090 in its current form at all and instead having the fastest non-pro/compute/AI card be one with a lower price and fewer cores, would that make Steve and gamers happy?

[–] Alphane_Moon@lemmy.world 5 points 2 weeks ago

This is true. But it also ignores price dynamics.

One of the first GPUs that I "bought" (I convinced my father to pay for the upgrade) was the GeForce 6600, for roughly $250 (maybe $275 at most) in 2004. That is the true price, not an American-style list price: we bought it for that amount (in local currency) at a computer store. I believe US true prices were (much?) lower than $275 at the time, but I could be wrong.

$275 in 2004 is around $470 in 2025. You are not getting a 6600-class card for $470 (all in) from AMD or Nvidia. The closest would be the Intel B580, which goes for around $340 (true price) where I live, but I would argue the B580 is not comparable to what the 6600 was in 2004. And the 6600 was broadly available in 2004 (at relatively competitive prices), even though I did not live in the "western world".

And keep in mind that I don't remember the exact price of the 6600 we bought in 2004. My memory says it was around $250, which would be about $420 in current dollars (a meaningful difference from the $470 mentioned earlier).
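For reference, that inflation math can be sketched like this, assuming a cumulative US CPI multiplier of roughly 1.7 for 2004 to 2025 (the exact factor is an assumption; substitute official CPI data for precision):

```python
# Assumed cumulative US CPI multiplier for 2004 -> 2025 (approximately 1.7).
CPI_FACTOR = 1.7

for price_2004 in (250, 275):
    # Scale the 2004 price into 2025 dollars.
    print(f"${price_2004} in 2004 is roughly ${price_2004 * CPI_FACTOR:.0f} in 2025 dollars")
```

That yields about $425 and $468, in line with the ~$420 and ~$470 figures above.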