this post was submitted on 28 Jun 2025
7 points (88.9% liked)
Ollama - Local LLMs for everyone!
197 readers
4 users here now
A place to discuss Ollama, from basic use, extensions and addons, integrations, and using it in custom code to create agents.
founded 2 weeks ago
you are viewing a single comment's thread
Yep, I keep reading that 32GB is considered the minimum. I also see that Ollama can split a model between GPU VRAM and system RAM, so the more memory you have the better, with VRAM being filled first.
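If you want to control that split yourself rather than rely on the automatic offload, Ollama exposes a `num_gpu` parameter that sets how many model layers go to the GPU; the rest stay in system RAM. A minimal sketch of a Modelfile (the model tag and layer count here are just examples, adjust for your hardware):

```
# Hypothetical base model tag; substitute whatever you actually pulled.
FROM llama3:8b

# Offload only 20 layers to VRAM; remaining layers run from system RAM on the CPU.
PARAMETER num_gpu 20
```

Build and run it with `ollama create mymodel -f Modelfile` then `ollama run mymodel`, and check the resulting split with `ollama ps`, whose PROCESSOR column shows something like `100% GPU` or `42%/58% CPU/GPU`.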