Best budget workstation for local AI / self-hosted LLMs in 2026?
I’m currently looking for the cheapest possible machine, or at least the best price-to-performance one, for local AI / self-hosted LLMs.
I’m not looking for a perfect high-end system, but rather for a smart base that can realistically run local models, agents, Docker containers, and a knowledge base.
Right now I’m also looking at used workstations, for example an HP Z840 (the listing I found: 2× Xeon E5-4669 v4, 160 GB DDR4 ECC RAM, 1250 W PSU). I’m still undecided on the GPU and would appreciate recommendations there.
What matters to me:
- the best possible price/performance ratio
- used hardware is totally fine
- a solid base for a future GPU upgrade
- enough headroom for RAM, PCIe, and PSU
- suitable for local LLMs, agents, and Docker
I’d be especially interested in real-world experience:
What do you currently see as the best affordable base?
Older dual-Xeon workstations like the Z840?
Or would you rather go with a newer platform with fewer cores but a more modern foundation? If so, which alternative?
And at what point does hardware this old stop being worth it, i.e. when do GPU compatibility, PSU, BIOS, or PCIe limitations end up eating the price advantage?
I’d appreciate concrete models, builds, or real-world experience from 2025/2026.