So now we can finally go back to good old code optimization, right? Right? (Padme.jpg)
We'll ask AI to make it performant, and when it breaks, we'll just go back to the old version. No way in hell we are paying someone
Damn. I hate how it hurts to know that's what will happen
It’s not that they’re not improving like they used to, it’s that the die can’t shrink any more.
Price cuts and “slim” models used to be possible due to die shrinks. A console might have released on 100nm, and then a process improvement comes along that means it can be made on 50nm; halving the feature size means roughly 4x as many chips per wafer, plus a big cut in power usage and heat generation. That allowed smaller and cheaper revisions.
Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.
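A back-of-the-envelope sketch of that chips-per-wafer math (all die sizes below are made up for illustration, and it ignores yield and edge loss):

```python
# Rough illustration of the "more chips per wafer" effect of a die shrink.
# All numbers are hypothetical; this ignores yield, edge loss, and the fact
# that modern node names no longer track actual feature size.
import math

WAFER_DIAMETER_MM = 300  # standard wafer diameter

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate: usable wafer area divided by die area."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area // die_area_mm2)

old_die = 200.0          # hypothetical 200 mm^2 die on the old node
new_die = old_die / 4    # halving linear feature size quarters the die area

print(dies_per_wafer(old_die))  # ~353 candidate dies per wafer
print(dies_per_wafer(new_die))  # ~1413 -- roughly 4x as many chips per wafer
```

In practice a single historical node step was closer to a ~0.7x linear shrink (about 2x density) rather than a full halving, but the economics are the same: more chips per wafer and lower power per chip, which is what funded the cheap "slim" revisions.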
This is absolutely right. We are getting to the point where a circuit pathway is only hundreds or even dozens of atoms wide. The fact that we can even make circuits that small in quantity is fucking amazing. But we are rapidly approaching laws-of-physics-type limits on how much smaller we can go.
Plus let's not forget an awful lot of the super high-end production is being gobbled up by AI training farms and GPU clusters. Companies that will buy 10,000 chips at a time are absolutely the preferred customers.
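To put the "dozens of atoms" point in numbers (the feature widths below are rough ballparks for modern chips, not official figures for any particular node):

```python
# Back-of-the-envelope: how many silicon atoms span a feature of a given width.
# Note that a "4nm" node name is marketing; real minimum features (fin widths,
# gate lengths) are more like 5-20 nm, and metal pitches are tens of nm.
SI_ATOM_SPACING_NM = 0.235  # approximate Si-Si bond length

for feature_nm in (50, 20, 7, 5):
    atoms = feature_nm / SI_ATOM_SPACING_NM
    print(f"{feature_nm:>2} nm ≈ {atoms:.0f} atoms across")
# 50 nm ≈ 213, 20 nm ≈ 85, 7 nm ≈ 30, 5 nm ≈ 21 -- "hundreds or even dozens"
```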
Not to mention that even when some components do shrink, it's not uniform for all components on the chip, so they can't just do 1:1 layout shrinks like in the past, but pretty much need to start the physical design portion all over with a new layout and timings (which then cascade out into many other required changes).
Porting to a new process node (even at the same foundry company) isn't quite as much work as a new project, but it's close.
Same thing applies to changing to a new foundry company, for all of those wondering why chip designers don't just switch some production from TSMC to Samsung or Intel since TSMC's production is sold out. It's almost as much work as just making a new chip, plus performance and efficiency would be very different depending on where the chip was made.
Did you read the article? That's exactly what it said.
Which itself is a gimmick; they've just made the gates taller, since electron leakage would happen otherwise.
The nm number has been a marketing gimmick since Intel launched their long-standing 14nm node. Actual transistor density, depending on which fab you compare to, is a shambles.
It's now a title / name of a process and not representative of how small the transistors are.
I've not paid for a CPU upgrade since 2020, and before that I was using a 22nm CPU from 2014. The market isn't exciting (to me anymore), I don't even want to talk about the GPUs.
Back in the late 90s or early 2000s upgrades felt substantial and exciting, now it's all same-same with some minor power efficiency gains.
This is why I'm more than happy with my 5800X3D/7900XTX; I know they'll perform like a dream for years to come. The games I play run beautifully on this hardware under Linux (BeamNG.Drive runs faster than on Windows 10), and I have no interest in upgrading the hardware any time soon.
Hell, the 4790k/750Ti system I built back in 2015 was still a beast in 2021, and if my ex hadn't gotten it in the divorce (I built it specifically for her, so I didn't lose any sleep over it), a 1080Ti upgrade would have made it a solid machine for 2025. But here we are - my PC now was a post-divorce gift for myself. Worth every penny. PC and divorce.
Ironic that the image is of a Switch, like Nintendo has been on the cutting edge at all in the last 20+ years.
Man they are going to ride the pandemic as a cause for high prices until it's a skeleton just skidding on the ground. It's been four years since pandemic supply issues, pretty sure those are over now. Unless they mean the price gouging that happened then that hasn't gone down.
This article doesn't factor in the new demand that is gobbling up all the CPU and GPU production: AI server farms. For example, Nvidia, which was once only making graphics cards for gamers, has been trying to keep up with global demand for AI. The whole market is different; then toss in tariffs and the rest on top.
I wouldn't blame the death of Moore's law; technology is still advancing, but, per usual, based on demand.
technology is still advancing
Actually, not really: performance per watt of the high-end stuff has been stagnating since the Ampere generation. Nvidia hides it by changing which models it compares in its benchmarks or by advertising raw performance without power figures.
AI has nothing to do with it. Die shrinks were the reason for “slim” consoles and big price drops in the past. Die shrinks are basically a thing of the past now.
Game graphics and design peaked in 2008. The N64 was more optimized than anything that came after. I'm so over current gen, and last gen, and the gen before that too. Let it all burn. :)
Edit: Furthermore,
https://lemmy.blahaj.zone/pictrs/image/222c26df-19d9-4fce-9ce3-2f3dcffefc60.webp
Was about to say this too. Can't tell a difference between most games made in 2013 vs 2023.
Battlefield 1 still beats 99% of games releasing now
Also they’re not going to play Silksong any better than a ten year old console.
Is it just me, or is this title weird?
It's not just you. The title gets causation totally wrong. If people made bad assumptions about how technology would change in the future, it's their assumptions that are the problem, not reality.