Lately I’ve been stuck in a thought loop about AI pricing.
Top-tier AI products, especially Claude, clearly aren’t cheap to run. At some point, prices may go up, token limits may go down, or both.
That makes me think a capable machine for running local LLMs could be a smart buy now, before more people reach the same conclusion and hardware demand pushes prices up.
On the other hand, competition between AI providers is still fierce. I don't think any of them can cut token limits or raise prices too aggressively without users switching fast.
We already saw a small version of this with Claude: limits felt tighter, Claude Code disappeared from the $20 Pro subscription tier, people got angry, and Anthropic quickly walked it back and apologized.
I even know people who switched to Codex during that time.
So I'm torn: maybe buying strong local hardware now is the smart move, or maybe the big AI providers will keep subsidizing everything for longer than expected.