r/ProxyUseCases
Trooper – A Go proxy that falls back to local Ollama when any LLM quota runs out
I built Trooper after hitting Claude's free tier limit mid-conversation one too many times.
It's a lightweight Go proxy that sits between your app and any LLM API (Claude, GPT, Groq, Mistral). When the primary provider hits a quota or rate limit, it automatically reroutes to a local Ollama model with the full conversation context intact. Zero code changes in your app: just point your base URL at localhost:3000.
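From the app side, it's something like this minimal sketch (assuming an OpenAI-style `/v1/chat/completions` endpoint on the proxy; the path, payload shape, and model name here are illustrative, not taken from the repo):

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Instead of https://api.anthropic.com (or OpenAI, Groq, ...), the app
	// talks to the local proxy and lets Trooper decide where the request
	// actually lands.
	base := "http://localhost:3000"

	body := []byte(`{
	  "model": "claude-3-5-sonnet",
	  "messages": [{"role": "user", "content": "Hello"}]
	}`)

	resp, err := http.Post(base+"/v1/chat/completions", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("proxy unreachable:", err)
		return
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(out))
}
```

Same idea with curl or any SDK that lets you override the base URL.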
What it does:
- Works with any LLM provider, not just Claude
- Preserves full conversation history across the switch
- Streaming support
- Configurable fallback trigger codes (429, 402, 529, 400 by default; see the sketch after this list)
- 401 errors surface properly — bad keys are never masked
- Single Go binary, Docker support
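The core fallback decision is conceptually simple. Here's a rough sketch of the idea, not Trooper's actual code (function names and the payload-translation step between provider schemas are simplified away):

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

// Default trigger codes from the list above: quota/rate-limit style errors.
// 401 is deliberately absent so bad API keys surface instead of being masked.
var fallbackCodes = map[int]bool{429: true, 402: true, 529: true, 400: true}

// forward tries the primary provider; on a trigger code (or a transport
// error) it replays the identical request body, i.e. the full conversation
// history, against local Ollama. Real code would also translate the payload
// between provider schemas; that's elided here.
func forward(primaryURL, ollamaURL string, body []byte) (*http.Response, error) {
	resp, err := http.Post(primaryURL, "application/json", bytes.NewReader(body))
	if err == nil && !fallbackCodes[resp.StatusCode] {
		return resp, nil // primary answered (including 401s); pass it through untouched
	}
	if resp != nil {
		io.Copy(io.Discard, resp.Body)
		resp.Body.Close()
	}
	// /api/chat is Ollama's chat endpoint; same messages, different backend,
	// which is why the conversation survives the switch.
	return http.Post(ollamaURL+"/api/chat", "application/json", bytes.NewReader(body))
}

func main() {
	body := []byte(`{"model":"llama3","messages":[{"role":"user","content":"hi"}]}`)
	resp, err := forward("https://api.example-provider.com/v1/chat/completions",
		"http://localhost:11434", body)
	if err != nil {
		fmt.Println("both backends failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("answered with status:", resp.Status)
}
```

Keeping 401 out of the trigger set is what makes bad keys fail loudly instead of silently landing on Ollama.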
Tested this morning — mid-conversation context preserved perfectly across the switch. Built in an evening.
GitHub: https://github.com/shouvik12/trooper
Happy to answer questions or take feedback.
What's next: MCP (Model Context Protocol) integration with Claude
u/Substantial_Load_690 — 2 days ago