u/Distinct-Shoulder592

▲ 2 r/AISEOInsider+1 crossposts

There's a meaningful difference between a knowledge base your LLM searches and one it can navigate. Has anyone shipped something in the second category?

RAG gives you search over a corpus. Useful. But I keep thinking about a different thing: a wiki your model can actually move through. Structured pages, linked concepts, compiled from raw sources, updated incrementally.
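To make the search/navigate distinction concrete, here's a minimal illustrative sketch (all names hypothetical, not from any particular tool): a RAG store hands back isolated chunks, while a navigable wiki exposes typed links an agent can follow from page to page.

```python
from dataclasses import dataclass, field

@dataclass
class WikiPage:
    title: str
    body: str
    links: list[str] = field(default_factory=list)  # titles of linked pages

def neighbors(wiki: dict[str, WikiPage], title: str) -> list[WikiPage]:
    """Follow outgoing links — the 'navigate' operation flat retrieval lacks."""
    return [wiki[t] for t in wiki[title].links if t in wiki]

# Toy two-page wiki: the agent can hop RAG -> Embeddings and back.
wiki = {
    "RAG": WikiPage("RAG", "Retrieval over a corpus.", links=["Embeddings"]),
    "Embeddings": WikiPage("Embeddings", "Vector representations.", links=["RAG"]),
}
```

The point of the sketch: navigation is a graph traversal over explicit links, not a similarity query, so the model can move along relationships the corpus never states in one chunk.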

Built something that does this. But wondering what else exists in this space before I go further.

Karpathy pointed at it. Gbrain is circling it. Feels like the problem is understood but the tooling isn't there yet.

What are people actually using?

u/Distinct-Shoulder592 — 1 hour ago

Core insight holds: instead of dumping files and hoping retrieval works, the system builds a structured wiki with markdown pages, links, citations, and search, where each answer becomes a new page and knowledge compounds fast.

Once content was organized into entities, concepts, syntheses, sources, and reports, the graph in Obsidian became clear, and by day two it was already linking ideas across sources that were previously missed. Search-before-write proved essential to avoid duplication, and citations on every paragraph made outputs reliable.

Building from scratch was a mistake, since llm-wiki-compiler already handles the system. Prompt discipline turned out to be the real mechanism: without strict rules, agents default to messy notes. A single shared vault with attribution works, while separate vaults break the graph. Contradictions need to be preserved, not overwritten. Using Hermes Agent to control ingestion and updates made the system feel automatic.

The pattern works, but it depends on structure and enforced behavior rather than the idea itself.

u/Distinct-Shoulder592 — 12 days ago