Hi all,
I recently received a small grant of around $6800 to buy a workstation for my lab at the university. I work in computational engineering / numerical methods, mainly CPU-based simulations and algorithms.
I know this is not a huge budget for a high-performance workstation, but I see it as a starting point for building up the lab over time. I’m based in a small island state, so I also need to account for shipping/import costs, meaning the actual budget for the machine itself will probably be somewhat less.
At the moment, my work is much more CPU/RAM-heavy than GPU-heavy, so my main requirement is memory capacity. I would like to start with at least 128 GB of RAM, but if there is a realistic way to get to 256 GB within this budget, that would be ideal.
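To give a sense of the memory scale involved (the grid size and field count below are purely illustrative, not my actual problem sizes), a dense double-precision field costs 8 bytes per unknown, so footprints grow quickly:

```python
# Back-of-envelope memory estimate for a dense double-precision simulation.
# All numbers here are hypothetical placeholders, not my real setup.
nx = ny = nz = 1024        # grid points per dimension (illustrative)
bytes_per_double = 8       # float64
fields = 10                # e.g. solution, residual, and work arrays

total_gib = nx * ny * nz * bytes_per_double * fields / 2**30
print(f"{total_gib:.1f} GiB")   # -> 80.0 GiB for this example
```

Even this modest example already needs 80 GiB of working memory, which is why 128 GB feels like the floor rather than the ceiling.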
For the CPU, I was thinking along the lines of an AMD Ryzen Threadripper, but I’m open to suggestions. I’m not sure whether it is better to go for a newer lower-end Threadripper, older higher-core-count workstation parts, or something else entirely.
For the GPU, I don’t need anything very powerful right now. A basic GPU would probably be enough, as long as the system can be upgraded later. In the future, I may have students working on parallelized versions of the codes, GPU acceleration, or machine learning, but that is not the immediate priority.
A few questions:
- What kind of workstation configuration would you recommend for this budget?
- Should I prioritize CPU cores, RAM capacity, memory bandwidth, or platform expandability?
- Is Threadripper the right direction, or should I consider EPYC / Xeon / used workstation hardware?
- What would be the best way to make the system expandable in the future?
- If I get additional small grants later, would it make more sense to upgrade this machine with more RAM/GPU, or start adding small compute nodes?
Initially, the workstation will probably be used by two people. Later, after upgrades, it may support more students in the lab.
Any advice on practical configurations, pitfalls, or good upgrade paths would be appreciated.