u/connexionwithal

Crap computer with DDR2 + external AMD R9 GPU? Slower, but can one make it work?

Hey all, I know what I'm about to say may be laughable and unideal, but is there a way to make this work? I like running models locally but can't afford a big-budget local AI setup. Can I just put an AMD R9 in an external GPU enclosure (with its own PSU), plug it into an old computer, and run a slow ollama server? The machine doesn't have much RAM, maybe 8 or 16 GB, and it's slow DDR2 at that, but can I make it use swap space or something for big code ingestions? I don't mind waiting hours for results; I just don't want to deal with model quotas when coding. I tried searching this sub for the use case but can't find a clear answer.
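For what it's worth, a rough back-of-envelope for whether a quantized model even fits in RAM + VRAM + swap. The numbers here (4-bit weights, a ~1.2x overhead factor for KV cache and runtime buffers) are illustrative assumptions, not measurements from any specific setup:

```python
def model_size_gb(params_billion, bits_per_weight=4, overhead=1.2):
    """Rough loaded size of a quantized model in GB.

    overhead is a guessed multiplier for KV cache and runtime
    buffers; real usage depends on context length and runtime.
    """
    return params_billion * bits_per_weight / 8 * overhead

def fits(params_billion, ram_gb, vram_gb=0, swap_gb=0, bits=4):
    # Swap technically works, but generation is memory-bandwidth
    # bound, so spilling weights to disk means minutes per token.
    return model_size_gb(params_billion, bits) <= ram_gb + vram_gb + swap_gb

# A 7B model at 4-bit quantization:
print(round(model_size_gb(7), 1))        # ~4.2 GB, fine in 8 GB RAM
print(fits(7, ram_gb=8))                 # True
print(fits(70, ram_gb=16, swap_gb=32))   # "fits" via swap, unusably slow
```

So small quantized models would run in 8-16 GB without swap at all; swap only becomes relevant for models that were never going to be usable on this hardware anyway.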
