this post was submitted on 04 Jul 2025
367 points (95.3% liked)

Technology

[–] Quibblekrust@thelemmy.club 2 points 4 hours ago* (last edited 4 hours ago)

Those 4% can make an RTX 5070 Ti perform at the levels of an RTX 4070 Ti Super, completely eradicating the reason you’d get an RTX 5070 Ti in the first place.

You'd buy a 5070 Ti for a 4% increase in performance over the 4070 Ti Super you already had? Ok.

[–] MITM0@lemmy.world 3 points 8 hours ago

AMD & Intel Arc are king now. All that CUDA nonsense is just price-hiking justification.

[–] FireWire400@lemmy.world 12 points 15 hours ago (1 children)

Bought my first AMD card last year, never looked back

[–] Krompus@lemmy.world 2 points 4 hours ago

AMD’s Windows drivers are a little rough, but the open source drivers on Linux are spectacular.

[–] Mr_Dr_Oink@lemmy.world 32 points 22 hours ago (1 children)

Since when did gfx cards need to cost more than a used car?

We are being scammed by Nvidia. They're selling stuff whose equivalent, 20 years ago, would have been some massive research prototype. And there would be, like, 2 of them in an Nvidia bunker somewhere powering Deep Thought whilst it calculated the meaning of life, the universe, and everything.

3k for a gfx card. Man, my whole PC cost 500 quid and it runs all my games and PCVR just fine.

Could it run better? Sure

Does it need to? Not for 3 grand...

Fuck me!.....

[–] Krompus@lemmy.world 3 points 4 hours ago

I haven’t bought a GPU since my beloved Vega 64 for $400 on Black Friday 2018, and the current prices are just horrifying. I’ll probably settle with midrange next build.

[–] kepix@lemmy.world -1 points 8 hours ago

"and the drivers, for which NVIDIA has always been praised, are currently falling apart"

What? They've been shit since HL2.

[–] 3dcadmin@lemmy.relayeasy.com 2 points 14 hours ago

After being on AMD for years, I recently went back to Nvidia for one reason: NVENC works way better for encoding livestreams and videos than AMD's encoder.
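For context, the NVENC/AMF gap is partly just a question of which hardware encoder ffmpeg or OBS ends up using, since each vendor exposes its own. A minimal sketch (the ffmpeg encoder names are real; the helper function itself is just an illustration):

```python
def pick_h264_encoder(vendor: str) -> str:
    """Map a GPU vendor to its ffmpeg H.264 hardware encoder name."""
    encoders = {
        "nvidia": "h264_nvenc",  # NVENC
        "amd": "h264_amf",       # AMF (VCE/VCN)
        "intel": "h264_qsv",     # Quick Sync
    }
    # Fall back to software x264 when no hardware encoder is known.
    return encoders.get(vendor.lower(), "libx264")

# Example ffmpeg invocation for an Nvidia card (argument list only).
cmd = ["ffmpeg", "-i", "input.mkv", "-c:v", pick_h264_encoder("nvidia"), "out.mp4"]
```

Quality differences between these encoders at the same bitrate are then down to the silicon, which is where NVENC historically had its lead.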

[–] 3aqn5k6ryk@lemmy.world 6 points 19 hours ago

My last Nvidia card was the GTX 980; I bought two of them. Then I heard about the 970 scandal. It didn't directly affect me, but fuck Nvidia for pulling that shit. Haven't bought anything from them since. I stopped playing games on PC afterwards; just occasionally on console and laptop iGPU.

[–] nthavoc@lemmy.today 51 points 1 day ago (4 children)

Folks, ask yourselves, what game is out there that REALLY needs a 5090? If you have the money to piss away, by all means, it's your money. But let's face it, games have plateaued and VR isn't all that great.

Nvidia's market is not you anymore. It's the massive corporations and research firms with useless AI projects or number crunching. They have more money than all gamers combined. Maybe it's time to go outside, myself included.

[–] Blackmist@feddit.uk 25 points 1 day ago

Oh, VR is pretty neat. It sure as shit don't need no $3000 graphics card though.

[–] nuko147@lemmy.world 7 points 19 hours ago (1 children)

Only Cyberpunk 2077 with path tracing. It's the only game I haven't played yet, because I want to run it on ultra settings. But for that amount of money, I'd better wait until the real 2077 to see it happen.

[–] Shanmugha@lemmy.world 2 points 6 hours ago

The studio has done a great job. You've most certainly heard it already, but I'm willing to say it again: the game is worth playing at whatever quality you can afford, short of stutter-level low fps. The story is so touching it outplays the graphics completely (though I do share the desire to play it on ultra settings; I'll do that one day myself).

[–] infyrian@kbin.melroy.org 3 points 23 hours ago

I plan on getting at least a 4060, and I'll sit on that for years. I'm on a 2060 right now.

My 2060 alone can run at least 85% of all games in my entire library across platforms. But I want at least 95% or 100%.

[–] BT_7274@lemmy.world 8 points 1 day ago* (last edited 1 day ago) (1 children)

Cyberpunk 2077 with the VR mod is the only one I can think of. Because it's not natively built for VR, you have to render the world separately for each eye, halving the overall frame rate. And with 90 fps as the bare minimum for many people in VR, you really don't have a choice but to use the 5090.

Yeah it’s literally only one game/mod, but that would be my use case if I could afford it.

Also the Train Sim World series. Those games make my tower complain, and my laptop give up.

[–] just_another_person@lemmy.world 88 points 1 day ago (15 children)

My mind is still blown on why people are so interested in spending 2x the cost of the entire machine they are playing on AND a hefty power utility bill to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lessons a decade ago.

[–] eager_eagle@lemmy.world 55 points 1 day ago (4 children)

They pay because AMD (or any other vendor, for that matter) has no product to compete with a 5080 or 5090.

[–] Naz@sh.itjust.works 9 points 1 day ago (1 children)

I have overclocked my AMD 7900XTX as far as it will go on air alone.

Undervolted every step on the frequency curve, cranked up the power, 100% fan duty cycles.

At its absolute best, it's competitive or trades blows with the 4090D, and it's 6% slower than the RTX 4090 Founders Edition (the slowest of the stock 4090 lineup).

The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn't shown anything new.

AMD needs a 5090-killer. Dual socket or whatever monstrosity that pulls 800W, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, including raytraced ones. Then we'll see some serious price cuts and competition.

[–] bilb@lemmy.ml 1 points 1 day ago* (last edited 1 day ago)

And/or Intel. (I can dream, right?) Hell, perform a miracle, Moore Threads!

[–] Chronographs@lemmy.zip 44 points 1 day ago

That’s exactly it, they have no competition at the high end

[–] just_another_person@lemmy.world 34 points 1 day ago (10 children)

Because they choose not to go full idiot, though. They could make their top-line cards compete if they slammed enough into the pipeline and required a dedicated PSU, but that's not where their product line intends to go. That's why it's smart.

For reference: AMD has the most deployed GPUs on the planet right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be making a product that churns out results at the cost of everything else, but being cost-effective and efficient. Nvidia fails at that on every level.

[–] eager_eagle@lemmy.world 25 points 1 day ago (3 children)

This OpenAI partnership really stands out, because the server world is dominated by Nvidia even more thoroughly than consumer cards are.

[–] SheeEttin@lemmy.zip 19 points 1 day ago (2 children)

Yup. You want a server? Dell just plain doesn't offer anything but Nvidia cards. You want to build your own? The GPGPU translation stuff like ZLUDA is brand new and not really supported by anyone. You want to participate in the development community? You buy Nvidia and use CUDA.

[–] qupada@fedia.io 19 points 1 day ago (2 children)

Fortunately, even that tide is shifting.

I've been talking to Dell about it recently, they've just announced new servers (releasing later this year) which can have either Nvidia's B300 or AMD's MI355x GPUs. Available in a hilarious 19" 10RU air-cooled form factor (XE9685), or ORv3 3OU water-cooled (XE9685L).

It's the first time they've offered a system using both CPU and GPU from AMD - previously they had some Intel CPU / AMD GPU options, and AMD CPU / Nvidia GPU, but never before AMD / AMD.

With AMD promising release day support for PyTorch and other popular programming libraries, we're also part-way there on software. I'm not going to pretend like needing CUDA isn't still a massive hump in the road, but "everyone uses CUDA" <-> "everyone needs CUDA" is one hell of a chicken-and-egg problem which isn't getting solved overnight.

Realistically, facing that kind of uphill battle, AMD is just going to have to compete on price - they're quoting a 40% performance-per-dollar improvement over Nvidia for these upcoming GPUs, so perhaps they are - and try to win hearts and minds with rock-solid driver/software support, so that people who do have the option (i.e. in-house code, not 3rd-party software) look to write it with not-CUDA.

To note, this is the 3rd generation of the MI3xx series (MI300, MI325, now MI350/355). I think it might be the first one to make the market splash that AMD has been hoping for.

[–] RazgrizOne@piefed.zip 17 points 1 day ago (1 children)

Once the 9070 dropped all arguments for Nvidia stopped being worthy of consideration outside of very niche/fringe needs.

[–] CheeseNoodle@lemmy.world 5 points 1 day ago (9 children)

Got my 9070 XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC costs less than a 5090.

[–] MHLoppy@fedia.io 23 points 1 day ago (3 children)

It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few times where things are misrepresented in this post e.g.:

Newegg selling the ASUS ROG Astral GeForce RTX 5090 for $3,359 (MSRP: $1,999)

eBay Germany offering the same ASUS ROG Astral RTX 5090 for €3,349.95 (MSRP: €2,229)

The MSRP for a 5090 is $2k, but the MSRP for the 5090 Astral -- a top-end card used for overclocking world records -- is $2.8k. I couldn't quickly find the European MSRP, but my money's on it being more than €2.2k.

If you’re a creator, CUDA and NVENC are pretty much indispensable, or editing and exporting videos in Adobe Premiere or DaVinci Resolve will take you a lot longer[3]. Same for live streaming, as using NVENC in OBS offloads video rendering to the GPU for smooth frame rates while streaming high-quality video.

NVENC isn't much of a moat right now, as both Intel and AMD's encoders are roughly comparable in quality these days (including in Intel's iGPUs!). There are cases where NVENC might do something specific better (like 4:2:2 support for prosumer/professional use cases) or have better software support in a specific program, but for common use cases like streaming/recording gameplay the alternatives should be roughly equivalent for most users.

as recently as May 2025 and I wasn’t surprised to find even RTX 40 series are still very much overpriced

Production apparently stopped on these for several months leading up to the 50-series launch; it seems unreasonable to harshly judge the pricing of a product that hasn't had new stock for an extended period of time (of course, you can then judge either the decision to stop production or the still-elevated pricing of the 50 series).


DLSS is, and always was, snake oil

I personally find this take crazy, given that DLSS2+ / FSR4+, when quality-biased, average visual quality comparable to native for most users in most situations - and that was with DLSS2 in 2023, not even DLSS3, let alone DLSS4 (which is markedly better on average). I don't really care how a frame is generated if it looks good enough (and doesn't come with other notable downsides like latency). This almost feels like complaining about screen space reflections being "fake" reflections. Like yeah, it's fake, but if the average player experience is consistently better with it than without it, then what does it matter?

Increasingly complex manufacturing nodes are becoming increasingly expensive as all fuck. If it's more cost-efficient to use some of that die area for specialized cores that can do high-quality upscaling instead of natively rendering everything with all the die space then that's fine by me. I don't think blaming DLSS (and its equivalents like FSR and XeSS) as "snake oil" is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can't tell it's not native, then sign me the fuck up for option #2. For people less fussy about static image quality and more invested in smoothness, they can be perfectly happy with 100 FPS but marginally worse image quality. Not everyone is as sweaty about static image quality as some of us in the enthusiast crowd are.
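To put rough numbers on that trade-off: assuming a ~2/3-per-axis internal scale for a quality-biased upscaler preset (a commonly cited figure, used here purely for illustration), the GPU shades barely half the pixels it would at native 4K:

```python
def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    # Upscalers render at a reduced internal resolution per axis,
    # then reconstruct up to the display resolution.
    return round(out_w * scale), round(out_h * scale)

w, h = internal_resolution(3840, 2160, 2 / 3)  # quality-biased preset at 4K
pixel_savings = 1 - (w * h) / (3840 * 2160)    # fraction of pixels not shaded
```

That works out to rendering 2560x1440 internally, a ~56% reduction in shaded pixels; that's where the extra frames come from, and whether the reconstruction is "good enough" is the subjective part.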

There's some fair points here about RT (though I find exclusively using path tracing for RT performance testing a little disingenuous given the performance gap), but if RT performance is the main complaint then why is the sub-heading "DLSS is, and always was, snake oil"?


obligatory: disagreeing with some of the author's points is not the same as saying "Nvidia is great"

[–] poopkins@lemmy.world 6 points 1 day ago

Thanks for providing insights and inviting a more nuanced discussion. I find it extremely frustrating that in communities like Lemmy it's risky to write comments like this because people assume you're "taking sides."

The entire point of the community should be to have discourse about a topic and go into depth, yet most comments and indeed entire threads are just "Nvidia bad!" with more words.

Obligatory disclaimer that I, too, don't necessarily side with Nvidia.

[–] JuxtaposedJaguar@lemmy.ml 15 points 1 day ago (1 children)

I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it then what does it matter?

But it does come with increased latency. It also disrupts the artistic vision of games. With MFG you're seeing more fake frames than real frames. It's deceptive and like snake oil in that Nvidia isn't distinguishing between fake frames and real frames. I forget what the exact comparison is, but when they say "The RTX 5040 has the same performance as the RTX 4090" but that's with 3 fake frames for every real frame, that's incredibly deceptive.
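The "more fake frames than real frames" point is easy to quantify with a simplified model (it ignores the generator's own added delay, which makes things slightly worse in practice): input is still sampled at the rendered rate, not the displayed rate.

```python
def rendered_fps(displayed_fps: float, generated_per_real: int) -> float:
    # With frame generation, 1 of every (generated_per_real + 1) displayed
    # frames comes from the game simulation; the rest are interpolated.
    return displayed_fps / (generated_per_real + 1)

real_fps = rendered_fps(240, 3)          # 4x MFG showing "240 fps"
input_frame_time_ms = 1000 / real_fps    # latency floor tied to real frames
```

So a "240 fps" figure under 4x MFG feels, input-wise, like 60 fps at best.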

[–] FreedomAdvocate@lemmy.net.au 3 points 1 day ago* (last edited 23 hours ago)

He’s talking about DLSS upscaling - not DLSS Frame Generation - which doesn’t add latency.

[–] CheeseNoodle@lemmy.world 16 points 1 day ago (2 children)

I think DLSS (and FSR and so on) are great value propositions, but they become a problem when developers use them as a crutch. At the very least, your game should not need them at all to run on high-end hardware at max settings, with them then being options for people on lower-end hardware to either lower settings or combine higher settings with upscaling. When they become mandatory, they stop being a value proposition, since the benefit stops being a benefit and starts just being necessary for baseline performance.

[–] cyberpunk007@lemmy.ca 20 points 1 day ago (2 children)

Have a 2070s. Been thinking for a while now my next card will be AMD. I hope they get back into the high end cards again :/

[–] BlameTheAntifa@lemmy.world 21 points 1 day ago (1 children)

The 9070 XT is excellent and FSR 4 actually beats DLSS 4 in some important ways, like disocclusion.

[–] ArchmageAzor@lemmy.world 9 points 1 day ago (1 children)

I wish I had the money to change to AMD

[–] Kolanaki@pawb.social 10 points 1 day ago

This is a sentence I never thought I would read.

^(AMD used to be cheap)^

[–] ExLisper@lemmy.curiana.net 6 points 1 day ago

Is it because it's not how they make money now?
