Bought my first AMD card last year, never looked back
After being on AMD for years, I recently went back to Nvidia for one reason: NVENC works way better for encoding livestreams and videos than AMD's encoder.
Since when did gfx cards need to cost more than a used car?
We are being scammed by Nvidia. They're selling stuff whose equivalent 20 years ago would have been some massive research prototype. And there would be, like, 2 of them in an Nvidia bunker somewhere, powering Deep Thought whilst it calculated the meaning of life, the universe, and everything.
3k for a gfx card. Man, my whole PC cost 500 quid and it runs all my games and PCVR just fine.
Could it run better? Sure
Does it need to? Not for 3 grand...
Fuck me!.....
My last Nvidia card was the GTX 980; I bought two of them. Then I heard about the 970 scandal. It didn't directly affect me, but fuck Nvidia for pulling that shit. Haven't bought anything from them since. I stopped playing games on PC afterwards, just occasionally on console and my laptop's iGPU.
Folks, ask yourselves, what game is out there that REALLY needs a 5090? If you have the money to piss away, by all means, it's your money. But let's face it, games have plateaued and VR isn't all that great.
Nvidia's market is not you anymore. It's the massive corporations and research firms for useless AI projects or number crunching. They have more money than all gamers combined. Maybe time to go outside; me included.
Only Cyberpunk 2077 with path tracing. It's the one game I'm holding off on playing until I can run it on ultra settings. But for that amount of money, I'd better wait until the real 2077 to see it happen.
Oh, VR is pretty neat. It sure as shit don't need no $3000 graphics card though.
I plan on getting at least a 4060 and sitting on that for years. I'm on a 2060 right now.
My 2060 alone can run at least 85% of the games in my libraries across platforms, but I want at least 95%, if not 100%.
Cyberpunk 2077 with the VR mod is the only one I can think of. Because it's not natively built for VR, you have to render the world separately for each eye, roughly halving the overall frame rate. And with 90 fps as the bare minimum for many people in VR, you really don't have a choice but to use the 5090.
Yeah it’s literally only one game/mod, but that would be my use case if I could afford it.
Also the Train Sim World series. Those games make my tower complain and my laptop give up.
It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few places where things are misrepresented in this post, e.g.:
Newegg selling the ASUS ROG Astral GeForce RTX 5090 for $3,359 (MSRP: $1,999)
eBay Germany offering the same ASUS ROG Astral RTX 5090 for €3,349,95 (MSRP: €2,229)
The MSRP for a 5090 is $2k, but the MSRP for the 5090 Astral -- a top-end card being used for overclocking world records -- is $2.8k. I couldn't quickly find the European MSRP but my money's on it being more than 2.2k euro.
If you’re a creator, CUDA and NVENC are pretty much indispensable, or editing and exporting videos in Adobe Premiere or DaVinci Resolve will take you a lot longer[3]. Same for live streaming, as using NVENC in OBS offloads video rendering to the GPU for smooth frame rates while streaming high-quality video.
NVENC isn't much of a moat right now, as both Intel and AMD's encoders are roughly comparable in quality these days (including in Intel's iGPUs!). There are cases where NVENC might do something specific better (like 4:2:2 support for prosumer/professional use cases) or have better software support in a specific program, but for common use cases like streaming/recording gameplay the alternatives should be roughly equivalent for most users.
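If anyone wants to see what "roughly equivalent" means in practice outside OBS, here's a rough sketch using ffmpeg's hardware encoders (the encoder names are ffmpeg's own; the filenames and bitrate are just placeholder assumptions):

```python
import subprocess

# ffmpeg's hardware H.264 encoders per vendor (names as shipped in ffmpeg).
ENCODERS = {
    "nvidia": "h264_nvenc",  # NVENC
    "amd": "h264_amf",       # AMF/VCE
    "intel": "h264_qsv",     # Quick Sync
}

def encode(vendor: str, src: str = "gameplay.mkv", dst: str = "gameplay.mp4"):
    """Transcode a recording with the chosen vendor's hardware encoder."""
    cmd = [
        "ffmpeg", "-i", src,
        "-c:v", ENCODERS[vendor],
        "-b:v", "8M",        # placeholder bitrate; tune for your content
        "-c:a", "copy",      # leave audio untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

# encode("amd")  # swap in "nvidia" or "intel"; output quality is broadly similar these days
```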
as recently as May 2025 and I wasn’t surprised to find even RTX 40 series are still very much overpriced
Production apparently stopped on these for several months leading up to the 50-series launch; it seems unreasonable to harshly judge the pricing of a product that hasn't had new stock for an extended period of time (of course, you can then judge either the decision to stop production or the still-elevated pricing of the 50 series).
DLSS is, and always was, snake oil
I personally find this take crazy given that DLSS 2+ / FSR 4+, when quality-biased, average visual quality comparable to native for most users in most situations -- and that was already true of DLSS 2 in 2023, not even DLSS 3, let alone DLSS 4 (which is markedly better on average). I don't really care how a frame is generated if it looks good enough (and doesn't come with other notable downsides like latency). This almost feels like complaining about screen space reflections being "fake" reflections. Like yeah, it's fake, but if the average player experience is consistently better with it than without it, then what does it matter?
Ever more advanced manufacturing nodes are getting expensive as all fuck. If it's more cost-efficient to spend some of that die area on specialized cores that do high-quality upscaling, rather than using the whole die to render everything natively, then that's fine by me. I don't think branding DLSS (and its equivalents like FSR and XeSS) as "snake oil" is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can't tell it's not native, then sign me the fuck up for option #2. People less fussy about static image quality and more invested in smoothness can be perfectly happy with 100 FPS and marginally worse image quality. Not everyone is as sweaty about static image quality as some of us in the enthusiast crowd are.
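To put rough numbers on that trade-off, here's the back-of-the-envelope pixel math (the per-axis scale factors are the commonly cited ones for the Quality and Performance presets; the FPS figures above were only illustrative):

```python
# Back-of-the-envelope: how many pixels the GPU actually shades per frame
# when upscaling to 4K, vs. rendering 4K natively.
OUT_W, OUT_H = 3840, 2160
native = OUT_W * OUT_H

# Commonly cited per-axis render scales for the upscaler presets.
presets = {"Quality": 2 / 3, "Performance": 1 / 2}

print(f"native 4K: {native:,} px")
for name, scale in presets.items():
    shaded = int(OUT_W * scale) * int(OUT_H * scale)
    print(f"{name}: {shaded:,} px (~{shaded / native:.0%} of native)")
# Quality mode shades roughly 44% of the pixels, which is where most of the
# extra headroom (e.g. 60 -> 80+ FPS) comes from.
```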
There's some fair points here about RT (though I find exclusively using path tracing for RT performance testing a little disingenuous given the performance gap), but if RT performance is the main complaint then why is the sub-heading "DLSS is, and always was, snake oil"?
obligatory: disagreeing with some of the author's points is not the same as saying "Nvidia is great"
Thanks for providing insights and inviting a more nuanced discussion. I find it extremely frustrating that in communities like Lemmy it's risky to write comments like this because people assume you're "taking sides."
The entire point of the community should be to have discourse about a topic and go into depth, yet most comments and indeed entire threads are just "Nvidia bad!" with more words.
Obligatory disclaimer that I, too, don't necessarily side with Nvidia.
I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it then what does it matter?
But it does come with increased latency. It also disrupts the artistic vision of games. With MFG you're seeing more fake frames than real frames. It's deceptive, and like snake oil in that Nvidia isn't distinguishing between fake frames and real frames. I forget the exact comparison, but when they say something like "the RTX 5070 has the same performance as the RTX 4090", and that's with 3 fake frames for every real frame, that's incredibly deceptive.
He’s talking about DLSS upscaling - not DLSS Frame Generation - which doesn’t add latency.
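For what it's worth, the frame-generation arithmetic looks roughly like this (the numbers are assumed for illustration, not a benchmark):

```python
# Illustrative arithmetic for multi-frame generation (assumed numbers, not a benchmark).
rendered_fps = 60   # frames the game actually simulates and renders
mfg_factor = 4      # 1 rendered frame + 3 generated frames (the "4x" marketing number)

displayed_fps = rendered_fps * mfg_factor   # what the fps counter shows
real_share = 1 / mfg_factor                 # share of displayed frames that are rendered

print(f"displayed: ~{displayed_fps} fps, but only {real_share:.0%} of them are rendered frames")
# Input is still sampled once per rendered frame, so responsiveness tracks the
# 60 fps number, not the 240 - which is why comparing cards on the displayed
# number alone is misleading.
```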
I think DLSS (and FSR and so on) are great value propositions, but they become a problem when developers use them as a crutch. At the very least, your game should not need them at all to run on high-end hardware at max settings. They should then be options for people on lower-end hardware to either lower settings or combine higher settings with upscaling. When they become mandatory they stop being a value proposition, since the benefit stops being a benefit and starts just being necessary for baseline performance.
They’re never mandatory. What are you talking about? Which games can’t run on a 5090 or even 5070 without DLSS?
Correct me if I'm wrong, but maybe they meant when publishers/devs list hardware requirements for their games and include DLSS in the calculations. IIRC AssCreed Shadows and MH Wilds did that.
It still blows my mind that people are so interested in spending 2x the cost of the entire machine they're playing on, AND a hefty power bill, to run these awful products from Nvidia. Generational improvements are minor on the performance side and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lesson a decade ago.
They pay because AMD (or anyone else, for that matter) has no product to compete with a 5080 or 5090.
I have overclocked my AMD 7900XTX as far as it will go on air alone.
Undervolted every step on the frequency curve, cranked up the power, 100% fan duty cycles.
At its absolute best, it's competitive with or trades blows with the 4090D, and is 6% slower than the RTX 4090 Founders Edition (the slowest of the stock 4090 lineup).
The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn't shown anything new.
AMD needs a 5090-killer. Dual socket or whatever 800 W monstrosity it takes, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, ray tracing included. Then we'll see some serious price cuts and competition.
That’s exactly it, they have no competition at the high end
Because they choose not to go full idiot, though. They could make their top-line cards compete if they slammed enough into the pipeline and required a dedicated PSU, but that's not where their product line is aimed. That's why it's smart.
For reference: AMD has the most deployed GPUs on the planet right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be making a product that churns out results at the cost of everything else, but making one that's cost-effective and efficient. Nvidia fails at that on every level.
This OpenAI partnership really stands out, because the server world is dominated by Nvidia even more than consumer cards are.
Yup. You want a server? Dell just plain doesn't offer anything but Nvidia cards. You want to build your own? The GPGPU compatibility stuff like ZLUDA is brand new and not really supported by anyone. You want to participate in the development community? You buy Nvidia and use CUDA.
Fortunately, even that tide is shifting.
I've been talking to Dell about it recently; they've just announced new servers (releasing later this year) which can take either Nvidia's B300 or AMD's MI355X GPUs. They're available in a hilarious 19" 10RU air-cooled form factor (XE9685), or ORv3 3OU water-cooled (XE9685L).
It's the first time they've offered a system using both CPU and GPU from AMD - previously they had some Intel CPU / AMD GPU options, and AMD CPU / Nvidia GPU, but never before AMD / AMD.
With AMD promising release day support for PyTorch and other popular programming libraries, we're also part-way there on software. I'm not going to pretend like needing CUDA isn't still a massive hump in the road, but "everyone uses CUDA" <-> "everyone needs CUDA" is one hell of a chicken-and-egg problem which isn't getting solved overnight.
Realistically, facing that kind of uphill battle, AMD is just going to have to compete on price - they're quoting a 40% performance-per-dollar improvement over Nvidia for these upcoming GPUs, so perhaps they are - and win hearts and minds with rock-solid driver/software support, so that people who do have the option (i.e. in-house code, not 3rd-party software) look to write it with not-CUDA.
To note, this is the 3rd generation of the MI3xx series (MI300, MI325, now MI350/355). I think it might be the first one to make the market splash that AMD has been hoping for.
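As a concrete example of what "not needing CUDA-specific code" can look like: ROCm builds of PyTorch reuse the existing torch.cuda / "cuda" device API for AMD GPUs, so device-agnostic code like this rough sketch runs on either vendor (the toy model and sizes are made up for illustration):

```python
import torch

# On Nvidia this maps to CUDA; on AMD ROCm builds of PyTorch the same
# torch.cuda API is backed by HIP, so the code stays vendor-agnostic.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)   # toy model, for illustration
x = torch.randn(64, 1024, device=device)

with torch.no_grad():
    y = model(x)

print(f"ran on: {device}, output shape: {tuple(y.shape)}")
```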
Once the 9070 dropped, all arguments for Nvidia stopped being worth considering outside of very niche/fringe needs.
Got my 9070 XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC cost less than a 5090.
Yeah, I got a 9070 + 9800X3D for around $1100 all-in. Couldn't be happier with the performance. Expedition 33 runs max settings at 3440x1440 at 80-90 fps.
But your performance isn’t even close to that of a 5090…….
80-90 fps @ 1440 isn't great. That's like last-gen mid-tier Nvidia GPU performance.
Not 1440 like you're thinking. 3440x1440 is about 34% more pixels to render than standard 2560x1440. It's an ultrawide. And yes, at max settings 80-90 fps is pretty damn good. It regularly goes over 100 in less busy environments.
And yeah it’s not matching a 5090, a graphics card that costs more than 3x mine and sure as hell isn’t giving 3x the performance.
You’re moving the goalposts. My point is for 1/4th the cost you’re getting 60-80% of the performance of overpriced, massive, power hungry Nvidia cards (depending on what model you want to compare to). Bang for buck, AMD smokes Nvidia. It’s not even close.
Unless cost isn't a barrier to you or you have very specific needs, they make no sense to buy. If you've got disposable income for days, then fuck it, buy away.
I assume people mean 3440x1440 when they say 1440 as it’s way more common than 2560x1440.
Your card is comparable to a 5070, which is basically the same price as yours. There's no doubt the 5080 and 5090 are disappointing in their performance compared to these mid-high cards, but your card can't compete with them, and Nvidia offers a comparable card at the same price point as AMD's best card.
Also, the AMD card uses more power than the Nvidia equivalent (9070 XT vs 5070).
Better bang for your buck, but way less bang and not as impressive of a bang.
Not all of us can afford to spend $3000 for a noticeable but still not massive performance bump over a $700 option. I don't really understand how this is so difficult to understand lol. You also have to increase the rest of your machine's cost for things like your PSU, because the power draw on the 5xxx series is cracked out. Motherboard, CPU, all of that has to be cranked up unless you want bottlenecks. Don't forget your high-end 165 Hz monitor unless you want to waste frames/colors. And are we really going to pretend that after 100 fps the difference is that big of a deal?
Going Nvidia also means that, unless you want to be fighting your machine all the time, you need to keep a Windows partition on your computer. Have fun with that.
At the end of the day, buy what you want, dude, but I'm pulling down what I said above on a machine that cost about $1700. Do with that what you will.
@RazgrizOne @FreedomAdvocate The reason I decided on AMD after being team green nearly all my life (aka >20 years): I feel like AI frame generation and upscaling are anti-consumer, because they hide the real performance behind non-reproducible image generation. And if you look closely... this is how Nvidia holds its performance lead over AMD.
I'm not even against tricks like upscaling and such, to be honest. If it looks good I'll take it lol. But I do agree they don't feel like long-term, hardened solutions versus something more like "raw performance." And there's no doubt there's a certain elegance to AMD's cards.
I have a 2070 Super. Been thinking for a while now that my next card will be AMD. I hope they get back into high-end cards again :/
The 9070 XT is excellent and FSR 4 actually beats DLSS 4 in some important ways, like disocclusion.
Concur.
I went from a 2080 Super to the RX 9070 XT and it flies. Coupled with a 9950X3D, I still feel a little bit like the GPU might be the bottleneck, but it doesn't matter. It plays everything I want at way more frames than I need (240 Hz monitor).
E.g., Rocket League went from struggling to keep 240 fps at lowest settings, to 700+ at max settings. Pretty stark improvement.
I wish I had the money to change to AMD
This is a sentence I never thought I would read.
^(AMD used to be cheap)^
Is it because it's not how they make money now?
And it only took 15 years to figure that out?