
Just got a new baby for my local AI journey - need some suggestions
I just got a new baby for my AI journey. I'm coming from a 4060 8GB (capable of running Qwen3 30B A3B properly), but I need more VRAM and compute, so I went looking for the GPU with the best price/performance on the market.
So I got this 3090 with 24GB of memory (3 times the memory of the 4060).
I still don't know whether to keep the 4060 for small models and use the 3090 for dense models with MTP.
Any suggestions?
P.S. A power supply upgrade is on the way.
P.P.S.
My current setup:
- CPU: AMD Ryzen 9 7900X (12 cores / 24 threads)
- RAM: 64GB DDR5-5600
- MoBo: Gigabyte B650 GAMING X AX V2