this post was submitted on 15 Jun 2025
642 points (99.5% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments, within reason.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context, you may lightly edit the title.)

founded 2 years ago

Well, I am shocked, SHOCKED I say! Well, not that shocked.

top 50 comments
[–] ItsMrChristmas@lemmy.zip 3 points 1 day ago

GPU prices are what drove me back to consoles. It was time to overhaul my PC as it was getting painfully out of date. The video card alone was gonna be $700. Meanwhile, a whole-ass PS5 that plays the same games was $500.

It's been 2 years since, and I don't regret it. I miss mods, but not nearly as much as I thought. It's also SOOO nice to play multiplayer games without cheaters everywhere. I actually used to be one of those people who thought controllers gave an unfair advantage, but... you can use an M/KB on PS5, and guess what? I do just fine! Turns out the problem was never controllers; it was the cheaters.

But then there is that. The controller. Oh my lord it's so much more comfortable than even the best gaming mouse. I've done a complete 180 on this. So many game genres are just so terrible to play with M/KB that I now tell people whining about controller players this:

Use gaming equipment for gaming and leave office equipment in the office.

[–] gravitas_deficiency@sh.itjust.works 14 points 1 day ago (1 children)

Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer and enthusiast markets.

So my next card is probably gonna be an RX 9070XT.

[–] ameancow@lemmy.world 3 points 1 day ago* (last edited 1 day ago) (1 children)

Even the RX 9070 is running around $900 USD; I cannot fathom affording even state-of-the-art gaming from years ago at this point. I am still using a GTX 1660, playing games from years ago that I never got around to, and having a grand time. Most adults I know are in the same boat: either not considering a PC upgrade at all, or playing their kid's console games.

Every year we say "gonna look into upgrading," but every year prices go up and wages stay the same (or disappear entirely as private equity ravages the business world, digesting every company that isn't also a private-equity predator), and the prices of just living and eating are insane. So at this rate, a lot of us might start reading again.

[–] jacksilver@lemmy.world 2 points 1 day ago

It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than just a GPU, it'll be more tempting.

[–] excral@feddit.org 5 points 1 day ago

For me it's the GPU prices, the stagnation of the technology (most performance gains come at the cost of stupid power draw), and, importantly, being fed up with AAA games. Most games I played recently were a couple years old, indie titles, or a-couple-years-old indie titles. And I don't need a high-powered graphics card for that. I've been playing far more on my Steam Deck than on my desktop PC, despite the latter having significantly more powerful hardware. You can't force fun through sheer hardware performance.

[–] Throwaway4669332255@lemmy.world 7 points 1 day ago (2 children)

It literally costs $3000.

That's almost 4 times the cost of my 3090.

[–] Deflated0ne@lemmy.world 8 points 1 day ago

Ah capitalism...

Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.

[–] RadioFreeArabia@lemmy.world 9 points 1 day ago* (last edited 1 day ago)

I just looked up the price, and I was like "Yikes!". You can get a PS5 Pro (plus the optional Blu-ray drive), a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.

I don't buy every generation; I skip one, if not two. I have a 40xx-series card and will probably wait until the 70xx (I'm assuming the series naming continues) before upgrading.

[–] t_berium@lemmy.world 12 points 1 day ago

I remember when high-end GPUs were around €500.

[–] Demognomicon@lemmy.world 7 points 1 day ago (2 children)

I have a 4090. I don't see any reason to pay $4K+ for fake frames and a few percent better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.

[–] Critical_Thinker@lemm.ee 4 points 1 day ago* (last edited 1 day ago) (4 children)

I don't think the 5090's average sale price has been $4K in months; $4K was basically March. $3K is pretty common now as a listed scalper price, and completed sales on fleabay commonly seem to be $2600-2800 now.

The problem is that $2K was too much to begin with, though. It should be cheaper, but they are selling ML cards at such a markup, with truly endless demand at the moment, that there's zero reason to put any focus at all on the gaming segment beyond a token offering that raises their margins. So business-wise they are doing great, I guess?

As a 9070 XT and 6800 XT owner, I feel like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia's stranglehold via its artificial monopoly on code compatibility makes the hardware itself almost irrelevant.

[–] brucethemoose@lemmy.world 3 points 1 day ago* (last edited 1 day ago)

One issue is that everyone is supply-constrained by TSMC. Even Arc Battlemage is out of stock at MSRP.

I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided years ago, but holy heck, they'd be swimming in market share if they used their own fabs instead (and kept the bigger die).

I feel like another is... marketing?

Like, many buyers just impulse buy, or go with what some shill recommended in a feed. Doesn't matter how competitive anything is anymore.

[–] JackbyDev@programming.dev 16 points 1 day ago (3 children)

Uhhh, I went from a Radeon 1090 (or whatever they're called; it's an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It's normal to not buy a GPU every year.

[–] finitebanjo@lemmy.world 4 points 1 day ago (1 children)

Ngl, finances had no impact on my decision to stay at a 3080. Performance and support did. Everything I want to play runs at 60 to 180 fps with my current loadout. I'm also afraid that once Windows 10 LTSC dies, I won't be able to use a high-end GPU with Linux anyway.

[–] MisterCD@lemmy.world 3 points 1 day ago

You can always side-grade to AMD. I was using a 3070 and ditched Windows for Kubuntu and while it was very usable, I would get the slightest input lag and had to make sure the compositor (desktop effects) was turned off when playing a game.

After some research I decided to side-grade to the 6800, and it's a night-and-day difference. Buttery-smooth gaming. It performs better with the compositor on than Nvidia did with it off. I know the 6800 isn't high-end, but it's no slouch either. AMD is king on Linux.
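
If anyone wants to script that compositor toggle instead of remembering the Alt+Shift+F12 shortcut, here's a minimal sketch for X11 Plasma sessions (it assumes KWin's usual DBus interface; Wayland KWin can't run uncomposited, and the qdbus binary may be named qdbus-qt5 or qdbus6 on your distro):

```sh
#!/bin/sh
# Suspend KWin compositing, run the game, then restore desktop effects.
# Save as nocomp.sh and set the Steam launch options to:
#   /path/to/nocomp.sh %command%
qdbus org.kde.KWin /Compositor suspend   # same effect as Alt+Shift+F12
"$@"                                     # run the game command Steam passes in
qdbus org.kde.KWin /Compositor resume
```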

[–] Matriks404@lemmy.world 7 points 1 day ago* (last edited 1 day ago) (1 children)

I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol

[–] baatliwala@lemmy.world 3 points 1 day ago

Been hearing this for the past 3 years

[–] frenchfryenjoyer@lemmings.world 22 points 2 days ago (5 children)

I have a 3080 and am surviving lol. Never had an issue.

[–] candyman337@lemmy.world 27 points 2 days ago* (last edited 2 days ago) (1 children)

It's just that I'm not impressed; the raster performance bump at 1440p was just not worth the price jump at all. On top of that, they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers, etc. Fuuucccckk all that, man. I'm waiting until AMD gets a little better with ray tracing and then switching to team red.

[–] chunes@lemmy.world 30 points 2 days ago (5 children)

I stopped maintaining an AAA-capable rig in 2016. I've been playing indies since and haven't felt left out whatsoever.

[–] simple@piefed.social 47 points 2 days ago (2 children)

Unfortunately, gamers aren't the real target audience for new GPUs; it's AI bros. Even if nobody buys a 4090/5090 for gaming, they're always out of stock, because LLM enthusiasts and small companies use them for AI.

[–] brucethemoose@lemmy.world 3 points 1 day ago* (last edited 1 day ago)

The 5090 is kinda terrible for AI, actually. It's too expensive. It only just got support in PyTorch, and if you look at "normie" AI bros trying to use them online, shit doesn't work.
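
For the "only just got support" part: a quick way to check whether an installed PyTorch wheel was actually built for your card's architecture (assuming a working CUDA setup; a brand-new card missing from the arch list is exactly the "shit doesn't work" failure mode):

```sh
# Prints the GPU's compute capability, then the list of architectures
# the installed PyTorch wheel was compiled for.
python -c "import torch; print(torch.cuda.get_device_capability(0)); print(torch.cuda.get_arch_list())"
```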

The 4090 is... mediocre, because it's expensive for 24GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.

Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s or just use APIs.

The server cards DO eat up TSMC capacity, but insane 4090/5090 prices are mostly Nvidia's (and AMD's) fault for literally being anticompetitive.

[–] imetators@lemmy.dbzer0.com 8 points 1 day ago

Ex-fucking-actly!

Hahaha, gamers are skipping. Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know big tech went all-in on AI, disregarding whether the average Joe wants it or not. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.

[–] kevinsbacon@lemmy.today 9 points 1 day ago (1 children)

All I want is more VRAM, it can already play all the games I want.

[–] RvTV95XBeo@sh.itjust.works 8 points 1 day ago (3 children)

But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn't that awesome???

[–] Blackmist@feddit.uk 4 points 1 day ago

Bullshitted pixels per second seem to be the new currency.

It may look smooth in videos, but 30 fps interpolated up to 120 fps will still feel like a 30 fps game.

Modern TVs do the same shit, and it both looks and feels like ass. And not good ass.

[–] GaMEChld@lemmy.world 8 points 1 day ago

Don't think I'll be moving on from my 7900XTX for a long while. Quite pleased with it.

[–] coacoamelky@lemm.ee 143 points 3 days ago (5 children)

The good games don't need a high-end GPU.

[–] epitaque@lemmy.world 2 points 1 day ago

Clair Obscur runs like shit on my 3090 at 4K :(

[–] Mongostein@lemmy.ca 1 points 1 day ago

There’s so many games out there I’d like to play, but I’m an adult with responsibilities. I don’t need the newest game or gaming hardware because no matter how hard I try to catch up I never will, so I don’t bother to try and I always have something to play on my hardware.

[–] JordanZ@lemmy.world 104 points 2 days ago (17 children)

When did it become expected that everybody would upgrade GPUs every year, and that this is supposed to be normal? I don't understand people upgrading phones every year either. Both of those things are high cost for minimal gains year over year. You really need 3+ years for any meaningful gains, especially over the last few years.

[–] Mwa@thelemmy.club 2 points 1 day ago* (last edited 1 day ago)

I am just using my GTX 1650 (4 GB of GDDR6 VRAM) and it works fine for most of the things I do. On Linux I can use the FSR hack to squeeze framerates out of games that perform poorly, and it runs SCP: Secret Laboratory and TF2 fine. For SCP:SL I'm using the FSR hack to squeeze out more framerate until Nvidia fixes VKD3D. Maybe my next card will be an RX 6650 XT or another AMD card, but I might still stick with my GTX 1650.
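
If the "FSR hack" here means gamescope's built-in FSR upscaling (an assumption; the comment doesn't name the tool), the Steam launch option looks roughly like this:

```sh
# Render internally at 720p and let gamescope upscale to 1080p with FSR.
# Flag spelling varies by gamescope version (older builds used -U for
# FSR instead of -F fsr); check `gamescope --help` on your install.
gamescope -w 1280 -h 720 -W 1920 -H 1080 -F fsr -f -- %command%
```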

[–] charonn0@startrek.website 1 points 1 day ago

The GTX 1660 I bought in 2023 for $300 is still running fine.

[–] Evotech@lemmy.world 2 points 1 day ago

When a new GPU was $500-900, it was fine.

But yeah, the RTX 2070 keeps chugging along.

[–] arc99@lemmy.world 6 points 1 day ago (3 children)

Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power-hungry and overpriced.

My GPU, an RTX 2060, is getting a little long in the tooth. I'll hand it off to one of the kids for their PC, but I need to find something that is a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. I'll probably need to replace my PSU while I'm at it.

[–] SpaceCadet 3 points 1 day ago

bitcoin mining

That's a thing of the past; it's not profitable anymore unless you use ASIC miners. Some people still GPU-mine niche coins, but it's nowhere near the scale it was during the bitcoin and ethereum craze a few years ago.

AI is driving up prices. Or rather, it's reducing availability, which then translates into higher prices.

Another thing is that board manufacturers, distributors and retailers have figured out that they can jack up GPU prices above MSRP and enough suckers will still buy them. They'll sell less volume but they'll make more profit per unit.
