LLM Wiki
Hey everyone,
I've been thinking a lot about Andrej Karpathy's idea of moving away from classic RAG (and from our outdated internal Confluence wiki) toward what he calls an "LLM Wiki": a curated, company-wide knowledge base for all employees that combines the personal and business layers in one semi-personal LLM Wiki.
Has anyone implemented something similar? A curated knowledge layer that sits between your raw documents and the LLM, rather than using OWUI's built-in Knowledge/RAG directly? Has maybe anyone built a custom MCP server or function/tool that does graph-based retrieval (following wikilinks to find related context) instead of pure vector similarity?
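To make the graph-based retrieval idea concrete, here's a minimal sketch of what such a tool might do: start from a handful of vector-retrieved seed notes, then follow `[[wikilinks]]` breadth-first to pull in related context. Everything here (the `notes` dict as a stand-in for the note store, the function name, the `max_hops` parameter) is a hypothetical illustration, not an actual OWUI or MCP API.

```python
import re
from collections import deque

# Matches [[Note Title]] and [[Note Title|alias]] (captures the title part)
WIKILINK = re.compile(r"\[\[([^\]|]+)")

def expand_via_wikilinks(seed_titles, notes, max_hops=2):
    """Breadth-first expansion: begin with vector-retrieved seed notes,
    then include anything they link to, up to max_hops away.
    `notes` maps title -> raw markdown body (a hypothetical note store)."""
    seen = set(seed_titles)
    queue = deque((title, 0) for title in seed_titles)
    while queue:
        title, hops = queue.popleft()
        if hops >= max_hops or title not in notes:
            continue
        for linked in WIKILINK.findall(notes[title]):
            linked = linked.strip()
            if linked not in seen:
                seen.add(linked)
                queue.append((linked, hops + 1))
    return seen
```

The retrieved set would then be stuffed into the context window in place of (or alongside) the top-k vector chunks, so the model sees the neighborhood of a note rather than isolated fragments.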
For those running this at company scale: How do you handle content freshness? RAG doesn't know when a chunk is outdated. The LLM Wiki approach solves this with review_cycle_days and a gardener agent, but I'm curious about other solutions.
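For illustration, the freshness check a gardener agent would run could be as simple as comparing each note's last review date against its cycle. The `review_cycle_days` field name comes from the LLM Wiki idea; the `last_reviewed` field and the dict-based note store are my own hypothetical stand-ins.

```python
from datetime import date, timedelta

def stale_notes(notes, today=None):
    """Return titles of notes whose last review is older than their
    review cycle. Each note's metadata carries `last_reviewed` (date)
    and `review_cycle_days` (int); structure is a hypothetical store."""
    today = today or date.today()
    flagged = []
    for title, meta in notes.items():
        due = meta["last_reviewed"] + timedelta(days=meta["review_cycle_days"])
        if due < today:
            flagged.append(title)
    return flagged
```

A gardener agent would run something like this on a schedule, then queue the flagged notes for an LLM (or human) review pass instead of letting stale chunks sit in the index.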
I think the biggest gap in current OWUI RAG is quality control: you upload a PDF and hope for the best. An LLM Wiki gives you auditable, versioned, interlinked knowledge with explicit confidence scores, but it's obviously more work to set up.
Would love to hear if anyone's gone down this path or has thoughts on whether this is overengineering vs. a real improvement over vanilla RAG.