Is it worth upgrading from 2x RTX6kPro to 4x?
Hi All:
Earlier this year I built a new machine specifically for inference work. I went with 2x RTX6k Pro Max-Q to start with. I've mostly just been using Qwen3.6-35b-a3b, which is great, but I'm not really taking advantage of the two cards. There are plenty of much larger models like Kimi, DeepSeek, and the like, but I can't run those on 2 cards.
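For context on why those bigger models don't fit, here's a rough back-of-envelope check. This is a weights-only estimate at an assumed 4-bit quantization: it assumes 96 GB per RTX 6000 Pro card and the commonly cited parameter counts for DeepSeek-V3/R1 (671B total) and Kimi K2 (~1T total), and it ignores KV cache, activations, and framework overhead, so real headroom is tighter than this suggests.

```python
# Weights-only VRAM estimate; ignores KV cache and runtime overhead,
# so treat the results as optimistic lower bounds.
def weights_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of model weights in GB at a given quantization."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

MODELS_B = {
    "DeepSeek-V3/R1 (671B total)": 671,
    "Kimi K2 (~1000B total)": 1000,
}

for cards in (2, 4):
    capacity = cards * 96  # GB, assuming 96 GB per RTX 6000 Pro
    for name, params in MODELS_B.items():
        need = weights_gb(params, 4)  # assume 4-bit quant
        # Leave ~10% of VRAM for KV cache / overhead
        fits = "fits" if need < capacity * 0.9 else "does not fit"
        print(f"{cards}x96GB ({capacity} GB): {name} ~{need:.0f} GB -> {fits}")
```

By this estimate, a 4-bit 671B model (~336 GB of weights) squeezes into 4x96 GB but not 2x, while a ~1T model doesn't fit either configuration without heavier quantization or CPU offload.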
I think my workflow would benefit from some of these bigger models, but my question is, does upgrading from 2 to 4 cards make sense? It feels like many people jump straight up to 8 cards.
Do people who use 4x RTX6kPro cards feel like the models that run on that hardware are worthwhile? Are you comfortable at that level of VRAM?
Thanks for your thoughts!
u/MenuNo294 — 3 days ago