u/ArchPowerUser

I got tired of clients only supporting Ollama/OpenAI, so I made a proxy that lets them use custom providers
r/ollama

Conduit is a proxy that exposes Ollama- and OpenAI-compatible APIs, so clients that only speak those APIs can talk to any custom LLM provider through one unified endpoint.

Main goal:
Use whatever provider you want in apps that normally hardcode provider support.
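One common way a proxy like this handles hardcoded model names is an alias table: the client asks for the model name it knows, and the proxy translates it to the real backend. A minimal sketch of the idea (the names and the mapping format here are hypothetical illustrations, not Conduit's actual config):

```python
# Hypothetical alias table: the model name a client requests,
# mapped to the (provider, model) pair the proxy actually calls.
MODEL_MAP = {
    "llama3": ("my-provider", "custom-llama-3-70b"),
    "gpt-4o": ("my-provider", "custom-gpt4-equivalent"),
}

def resolve(requested: str) -> tuple[str, str]:
    """Translate a client's hardcoded model name to the real backend target."""
    if requested not in MODEL_MAP:
        raise KeyError(f"no mapping for model {requested!r}")
    return MODEL_MAP[requested]

print(resolve("llama3"))  # → ('my-provider', 'custom-llama-3-70b')
```

The client never learns the mapping exists; it just sees the model name it expects.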

Features:

  • Ollama-compatible API
  • OpenAI-compatible endpoints
  • Streaming support
  • Model mapping
  • Lightweight/self-hostable

Built mainly because I wanted to use my own provider setup with tools that refused to support it directly.

u/ArchPowerUser — 5 days ago