r/ollama
Difference between RAM and VRAM
So I have a system with 64GB RAM and an NVIDIA GeForce RTX 3080 Ti. I am confused about RAM and VRAM. I see that my GPU has 12GB of VRAM.
I wanted to give Gemma4:31b a try, but I see that it is 20GB in size. I am a noob, so forgive me, but can't that load into RAM instead of the GPU? Also, based on my config, are there any good agentic coding models you can suggest? I know I'm not going to get the same as what Claude offers.
u/ConclusionUnique3963 — 1 day ago