this post was submitted on 21 Feb 2025
33 points (90.2% liked)

PC Gaming


[–] Tattorack@lemmy.world 9 points 2 days ago (1 children)

I think Nvidia is already getting a kick in the ass.

The first GPU I bought was a GTX 1060 with 6GB. A legendary card I kept using until last November.

What did I upgrade to?

Why, Intel of course. The A770 is cheaper than an AMD card in the same performance range, and has a weird quirk where it actually does better at 1440p than similar cards. Very likely thanks to its spacious VRAM, which is also nice to have for the 3D work I do.

I didn't upgrade past the 1060 earlier because the 20 series wasn't a big enough leap, and the 30 series is where a lot of Nvidia's bullshit started.

And for the industrial market, price per performance is all that matters, because in large deployments there's no issue with parallelizing as many GPUs as you want. Even if an Intel GPU at a tenth of the price has a fifth of the performance, you just slap together five of them and get the same processing power for half the price.
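
A back-of-the-envelope sketch of that scaling argument (all numbers here are hypothetical, just to make the ratios concrete):

```python
# Hypothetical normalized numbers: a budget GPU at 1/10 the price
# of a flagship card, delivering 1/5 of its performance.
flagship_price = 10_000.0   # assumed price, arbitrary units
flagship_perf = 1.0         # normalized throughput

budget_price = flagship_price / 10   # -> 1000.0
budget_perf = flagship_perf / 5      # -> 0.2

# Cards needed to match the flagship's throughput (assuming
# the workload parallelizes cleanly across GPUs):
n_cards = flagship_perf / budget_perf   # -> 5.0
cluster_price = n_cards * budget_price  # -> 5000.0

# Same processing power at half the price:
print(cluster_price / flagship_price)   # prints 0.5
```

This is exactly the comment's arithmetic: 5 cards at a tenth of the price each comes to 5/10 = half the cost for equal throughput, ignoring real-world overheads like power, interconnect, and imperfect scaling.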