
Anyone else feeling like the AI space is turning into a “compute economy”?
A year ago everyone wanted to buy RTX 4090s. Now I see more devs just renting GPU instances for training and inference whenever they need them.
For AI workflows, it honestly makes sense:
• no massive upfront hardware cost
• instant scaling for bigger models
• access to H100/A100 hardware
• easier experimentation
But at the same time, cloud GPU pricing can get brutal fast.
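For anyone weighing it, here's a rough back-of-the-envelope break-even sketch (the prices are made-up placeholders, not real quotes; swap in your own rig cost, electricity rate, and the on-demand rate you're actually seeing):

```python
# Rough break-even sketch: all numbers are hypothetical placeholders.
LOCAL_RIG_COST = 2500.0      # assumed one-time cost of a 4090-class rig (USD)
LOCAL_POWER_PER_HOUR = 0.15  # assumed electricity cost per GPU-hour (USD)
CLOUD_RATE_PER_HOUR = 2.50   # assumed on-demand rate for a comparable instance (USD)

def break_even_hours(rig_cost, power_rate, cloud_rate):
    """Hours of GPU time at which owning becomes cheaper than renting."""
    return rig_cost / (cloud_rate - power_rate)

hours = break_even_hours(LOCAL_RIG_COST, LOCAL_POWER_PER_HOUR, CLOUD_RATE_PER_HOUR)
print(f"Owning pays off after roughly {hours:.0f} GPU-hours "
      f"(~{hours / (40 * 52):.1f} years at 40 h/week of utilization)")
```

With those placeholder numbers the rig pays for itself in about 1,000 GPU-hours, so the answer really hinges on how many hours a month your GPUs actually stay busy.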
Do you think the future for AI builders is owning local rigs, or simply learning to rent GPU power on demand efficiently?