submitted 1 week ago* (last edited 1 week ago) by j4p@lemm.ee to c/technology@lemmy.world

"These price increases have multiple intertwining causes, some direct and some less so: inflation, pandemic-era supply crunches, the unpredictable trade policies of the Trump administration, and a gradual shift among console makers away from selling hardware at a loss or breaking even in the hopes that game sales will subsidize the hardware. And you never want to rule out good old shareholder-prioritizing corporate greed.

But one major factor, both in the price increases and in the reduction in drastic “slim”-style redesigns, is technical: the death of Moore’s Law and a noticeable slowdown in the rate at which processors and graphics chips can improve."

[-] FreedomAdvocate@lemmy.net.au 39 points 1 week ago* (last edited 1 week ago)

It’s not that they’re not improving like they used to, it’s that the die can’t shrink any more.

Price cuts and “slim” models used to be possible due to die shrinks. A console might have released on 100nm, and then a process improvement comes out that means it can be made on 50nm. Halving the feature size roughly quarters the die area, so you get about 4x as many chips per wafer along with much lower power usage and heat generation. This allowed smaller and cheaper revisions.
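
Rough back-of-the-envelope math for that kind of shrink (the node sizes and die area below are made-up illustrative numbers, not any real console chip):

```python
# Illustrative die-shrink arithmetic - made-up numbers, not any real console chip.
old_node_nm = 100
new_node_nm = 50
old_die_area_mm2 = 200  # assumed original die size

# A full linear shrink scales area by (new/old)^2.
area_scale = (new_node_nm / old_node_nm) ** 2
new_die_area_mm2 = old_die_area_mm2 * area_scale

print(f"area scale factor: {area_scale:.2f}")        # 0.25
print(f"new die area: {new_die_area_mm2:.0f} mm^2")  # 50 mm^2, so ~4x as many dies per wafer
```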

Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.

[-] toastmeister@lemmy.ca 5 points 1 week ago

Which is itself a gimmick: they've mostly just made the gates taller, since electron leakage would happen otherwise.

[-] dai@lemmy.world 4 points 1 week ago

"nm" has been a marketing gimmick since Intel launched their long-standing 14nm node; actual transistor density varies wildly depending on which fab you compare.

It's now just the name of a process, not a measure of how small the transistors actually are.
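
To put rough numbers on how far the name has drifted from reality: if "5nm" literally described transistor dimensions, density would be orders of magnitude higher than what leading-edge nodes actually deliver (the ~130 MTr/mm² figure below is an approximate published ballpark for "5nm"-class nodes, not an exact spec):

```python
# If "5nm" literally meant transistors on a 5nm x 5nm grid:
naive_density_per_mm2 = (1e6 / 5) ** 2   # 1 mm = 1e6 nm -> 4e10 transistors/mm^2

# Real "5nm"-class nodes land around ~130 million transistors/mm^2
# (ballpark only; varies by fab and cell library).
actual_density_per_mm2 = 130e6

print(f"naive density:  {naive_density_per_mm2:.1e} /mm^2")
print(f"actual density: {actual_density_per_mm2:.1e} /mm^2")
print(f"the name overstates density by ~{naive_density_per_mm2 / actual_density_per_mm2:.0f}x")
```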

I've not paid for a CPU upgrade since 2020, and before that I was using a 22nm CPU from 2014. The market isn't exciting to me anymore; I don't even want to talk about the GPUs.

Back in the late 90s and early 2000s, upgrades felt substantial and exciting; now it's all same-same with some minor power-efficiency gains.

[-] lka1988@lemmy.dbzer0.com 1 points 1 week ago* (last edited 1 week ago)

This is why I'm more than happy with my 5800X3D/7900XTX; I know they'll perform like a dream for years to come. The games I play run beautifully on this hardware under Linux (BeamNG.Drive runs faster than on Windows 10), and I have no interest in upgrading the hardware any time soon.

Hell, the 4790k/750Ti system I built back in 2015 was still a beast in 2021, and if my ex hadn't gotten it in the divorce (I built it specifically for her, so I didn't lose any sleep over it), a 1080Ti upgrade would have made it a solid machine for 2025. But here we are - my PC now was a post-divorce gift for myself. Worth every penny. PC and divorce.

[-] FreedomAdvocate@lemmy.net.au 0 points 1 week ago

There’s no world in which a 750Ti or even 1080Ti is a “solid machine” for gaming in 2025 lol.

[-] ZC3rr0r@lemmy.ca 0 points 1 week ago

Depends on your expectations. If you play mainly esports titles at 1080p, it would probably still have been quite sufficient.

But I agree it's a stretch as an all-rounder system in 2025. My 3090 is already showing signs of its age; a card that's two generations older would certainly be struggling today.

[-] lka1988@lemmy.dbzer0.com 0 points 1 week ago* (last edited 1 week ago)

For what I do? It would be perfectly fine. Maybe not for AAA games, but for regular shit at ~40fps and 1080p, it would be perfectly fine.

Gotta remember that some of us are reaching 40 years old, with kids, and don't really give a shit about maxing out the 1% lows.

[-] FreedomAdvocate@lemmy.net.au 1 points 1 week ago* (last edited 1 week ago)

> but for regular shit at ~40fps and 1080p, it would be perfectly fine.

That's not "perfectly fine" to most people, especially PC players.

> Gotta remember that some of us are reaching 40 years old, with kids, and don't really give a shit about maxing out the 1% lows.

Already there myself. I don't care about maxing out the 1% lows, but I do care about reaching 60fps average at the bare minimum, preferably closer to 100 - and definitely higher than 1080p. Us oldies need more p's than that with our bad eyesight haha

[-] Buddahriffic@lemmy.world 5 points 1 week ago

Not to mention that even when some components do shrink, the shrink isn't uniform across the chip, so they can't just do 1:1 layout shrinks like in the past; they pretty much need to start the physical design portion all over with a new layout and new timings (which then cascade out into many other required changes).
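
A toy illustration of that non-uniform shrink (the area breakdown and per-block scaling factors are invented for the example, not taken from any real chip or node):

```python
# Hypothetical die with per-block shrink factors - logic scales well, SRAM and
# analog/IO much less, so the layout can't just be scaled down uniformly.
blocks = {
    "logic":     {"area_mm2": 60, "shrink": 0.55},
    "sram":      {"area_mm2": 30, "shrink": 0.80},
    "analog/io": {"area_mm2": 10, "shrink": 0.95},
}

old_total = sum(b["area_mm2"] for b in blocks.values())
new_total = sum(b["area_mm2"] * b["shrink"] for b in blocks.values())

print(f"old die: {old_total} mm^2")
print(f"new die: {new_total:.1f} mm^2 ({new_total / old_total:.0%} of original)")
# Every block ends up in different relative proportions, so the floorplan,
# routing, and timing all have to be redone rather than just scaled.
```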

Porting to a new process node (even at the same foundry company) isn't quite as much work as a new project, but it's close.

Same thing applies to changing to a new foundry company, for all of those wondering why chip designers don't just switch some production from TSMC to Samsung or Intel since TSMC's production is sold out. It's almost as much work as just making a new chip, plus performance and efficiency would be very different depending on where the chip was made.
