u/AccomplishedOne9144

LiteLLM Responses API streaming

Hello all,

I am using OpenWebUI with LiteLLM to proxy my models from Azure.

But when I change the connection type to the Responses API, streaming doesn't work anymore.
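For context, this is roughly how my LiteLLM proxy is configured to expose the Azure deployment (a minimal sketch with hypothetical names; the deployment name, resource URL, and API version are placeholders for my actual values):

```yaml
# litellm proxy config (config.yaml) — illustrative values only
model_list:
  - model_name: azure-gpt-4o            # name OpenWebUI sees
    litellm_params:
      model: azure/my-deployment        # hypothetical Azure deployment name
      api_base: https://my-resource.openai.azure.com
      api_key: os.environ/AZURE_API_KEY # read key from environment
      api_version: "2024-10-21"
```

With the connection type set to the Chat Completions API in OpenWebUI, streaming works fine against this setup; it only breaks after switching to the Responses API.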

Has anyone done this successfully?
