
VIKI: A Sovereign, CLI-First AI Agent for Local LLMs (Ollama-Native, Privacy-First)
Hey everyone, I’m excited to share VIKI (v8.1.0), a sovereign digital intelligence system designed for users who want the power of autonomous AI agents without sacrificing privacy or performance.
🛡️ Why VIKI?
Most agent frameworks are cloud-dependent or heavy on web-UI overhead. VIKI is built from the ground up to be local-first and CLI-driven. It runs natively with Ollama (Phi-3, DeepSeek, Llama 3) and follows a strict Clean/Hexagonal Architecture for maximum resilience.
✨ Key Features
- Neural Forge: A specialized pipeline that bakes your personal lessons and "wisdom" back into custom Ollama images (LoRA/Prompt-baking).
- Sovereign Cognitive Architecture: Features tiered reasoning (Reflex, Chatter, Planning) powered by the Orythix kernel.
- Agentic Orchestration: Integrated MCP (Model Context Protocol) support for seamless tool and service integration.
- High-Performance CLI: No 3D hologram lag or dashboard bloat—just a high-speed terminal interface for air-gapped mission control.
- RAG Memory: Persistent SQLite-backed semantic memory that survives restarts.
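To make the "prompt-baking" idea concrete: Ollama's standard Modelfile format lets you bake a persistent system prompt into a custom image. The model name and prompt below are placeholders for illustration, not actual Neural Forge output:

```
# Hypothetical Modelfile: bake a "lesson" into a custom image
FROM phi3
SYSTEM """You are VIKI. Lesson learned: prefer concise CLI output."""
PARAMETER temperature 0.6
```

Building and running it uses the stock Ollama CLI: `ollama create viki-custom -f Modelfile`, then `ollama run viki-custom`. (A LoRA adapter can similarly be attached with the Modelfile's `ADAPTER` instruction.)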
🚀 Local-First Sovereignty
VIKI is a privacy-first alternative to cloud agents like AutoGPT or ChatGPT. Everything, from mission planning to tool execution, happens on your machine. No telemetry, no data leakage.
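As a sketch of what local-only, restart-surviving state can look like, here is a minimal SQLite-backed semantic memory in the spirit of the RAG feature above. This is an illustration, not VIKI's actual implementation; in particular, `embed()` is a toy bag-of-characters placeholder where a real system would call a local embedding model (e.g. via Ollama's embeddings API):

```python
import math
import sqlite3

def embed(text: str, dims: int = 64) -> list[float]:
    """Toy character-frequency embedding (placeholder for a real model)."""
    vec = [0.0] * dims
    for ch in text.lower():
        vec[ord(ch) % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are normalized in embed(), so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

class Memory:
    """Persistent semantic memory: pass a file path so it survives restarts."""

    def __init__(self, path: str = "memory.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memories "
            "(id INTEGER PRIMARY KEY, text TEXT, vec TEXT)"
        )

    def remember(self, text: str) -> None:
        vec = ",".join(str(v) for v in embed(text))
        self.db.execute("INSERT INTO memories (text, vec) VALUES (?, ?)", (text, vec))
        self.db.commit()

    def recall(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        rows = self.db.execute("SELECT text, vec FROM memories").fetchall()
        scored = [
            (cosine(q, [float(x) for x in vec.split(",")]), text)
            for text, vec in rows
        ]
        scored.sort(reverse=True)
        return [text for _, text in scored[:k]]
```

Swapping the placeholder `embed()` for a real embedding model is the only change needed to turn this into a usable retrieval layer, and the SQLite file keeps everything on disk and on your machine.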
🔗 Get Started
- GitHub: https://github.com/Orythix/viki
- Ollama Hub: https://ollama.com/orythix/viki-neural-forge
- Pinokio: one-click install via pinokio.js

I’d love to get your feedback on the architecture and the Neural Forge pipeline. If you’re looking for an agent that respects your data and lives in your terminal, give VIKI a spin!

VIKI: Virtual Intelligence, Real Evolution.