Anyone else getting massive lag on OpenClaw + ChatGPT OAuth?
On 2026.4.29, every request is taking 20+ seconds before the model even responds. Trace logs show:
- model-resolution: 11s
- auth: 5s
- attempt-dispatch: 4s
Gateway logs show `codex model discovery failed; using fallback catalog` and `sendChatAction failed` on Telegram.
Curled the endpoints directly: api.openai.com and api.telegram.org both respond fast and clean, but chatgpt.com/backend-api/codex returns a Cloudflare challenge page demanding JS and cookies. Pretty sure that challenge is what's eating the time on every request.
Re-authed fresh in an incognito window; the refresh token is valid, and there's no proxy, VPN, or firewall app in the way. Tried adding `skipCatalog` and `useFallbackCatalog` under `plugins.entries.openai-codex.config`, but the schema rejected both keys. Grepped the catalog source: no env var or config flag is exposed to skip discovery.
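For reference, this is the shape of what I tried (the two key names are exactly what I attempted; the surrounding config layout is my best guess at the plugin schema and may differ by version). The validator rejected both keys:

```json
{
  "plugins": {
    "entries": {
      "openai-codex": {
        "config": {
          "skipCatalog": true,
          "useFallbackCatalog": true
        }
      }
    }
  }
}
```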
Is anyone on a newer version where this got fixed? Or is there a version between the 5.5 launch and now where the catalog still works? I don't want to roll back to 2026.3.13 because I'd lose 5.5.
Mac, Node 22, OAuth via ChatGPT Plus subscription.