Blaster_M@lemmy.world 0 points 1 month ago

A 4GB card can run smol models; bigger ones require an Nvidia card and lots of system RAM, and performance drops in proportion to how much of the model spills out of VRAM into system RAM.
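
For anyone wondering what that balance looks like in practice, here's a rough sketch using llama-cpp-python (one common way to do this, not the only one; the model path and layer count below are placeholders you'd tune for your own card). You offload as many layers as fit in VRAM and the rest run from system RAM, which is where the slowdown comes from:

```python
# Sketch: partial GPU offload with llama-cpp-python
# (pip install llama-cpp-python, built with CUDA/ROCm/Metal support).
# The model path and layer count are hypothetical placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/model-q4_k_m.gguf",  # hypothetical quantized GGUF file
    n_gpu_layers=20,  # roughly what fits in a 4GB card; the rest stays in system RAM
    n_ctx=2048,       # context window; bigger contexts eat more VRAM
)

out = llm("Explain VRAM offloading in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

The fewer layers you can offload, the more tokens have to be computed from DRAM, so generation speed degrades accordingly.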

theunknownmuncher@lemmy.world 3 points 1 month ago

> require an nvidia

Big models work great on MacBooks, AMD GPUs, or AMD APUs with unified memory.
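
With unified memory the CPU and GPU share one RAM pool, so there's no VRAM-to-DRAM spill to pay for and you can typically offload the whole model. A minimal sketch with the same library, assuming a Metal (Apple Silicon) or ROCm build and a hypothetical model file:

```python
# Sketch: full offload on a unified-memory machine (e.g. Apple Silicon with a
# Metal build of llama-cpp-python). n_gpu_layers=-1 offloads every layer;
# since CPU and GPU share the same memory, nothing has to spill.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/big-model-q4_k_m.gguf",  # hypothetical large GGUF
    n_gpu_layers=-1,  # -1 = offload all layers
)

print(llm("Hello", max_tokens=32)["choices"][0]["text"])
```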