r/LocalLLM
I don't know what to add to my system, so I need help on this one.
My system is a 5060 Ti with 16 GB VRAM + 2x16 GB DDR4 RAM, and I'm thinking of upgrading. Should I go with more VRAM, for example another GPU like a 3050 8 GB, or should I add another pair of RAM sticks for 64 GB total?
If I add another GPU, does that mean I can run bigger models? And what about the context window?
Which one would be more helpful: another GPU or more RAM?
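For anyone weighing the same trade-off, a rough back-of-envelope memory estimate helps: the model weights plus the KV cache (which grows with context length) have to fit somewhere, and whatever spills out of VRAM into system RAM runs much slower. Here's a minimal sketch of that arithmetic; the model dimensions (14B params, 4.5 bits/weight effective, 40 layers, 8 KV heads, 128 head dim) are illustrative assumptions, not specs of any particular model:

```python
# Back-of-envelope estimate: model weights + KV cache memory.
# All model numbers below are illustrative assumptions, not measured values.

def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """Memory for the model weights, in GB."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                context_len: int, bytes_per_elem: int = 2) -> float:
    """KV cache: K and V per layer, grows linearly with context length."""
    return 2 * layers * kv_heads * head_dim * context_len * bytes_per_elem / 1e9

# Hypothetical 14B model at ~4-bit quantization with grouped-query attention:
w = weights_gb(14, 4.5)                        # ~7.9 GB of weights
kv = kv_cache_gb(layers=40, kv_heads=8, head_dim=128,
                 context_len=32768)            # ~5.4 GB at 32k context
print(f"weights ≈ {w:.1f} GB, KV cache ≈ {kv:.1f} GB, total ≈ {w + kv:.1f} GB")
```

Under these assumptions the total (~13 GB) just fits in 16 GB of VRAM, but doubling the context would not, which is why extra VRAM usually helps more than extra RAM for speed, while extra RAM only lets you offload layers to the CPU at a big throughput cost.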
u/Eversivam — 15 days ago