u/Dense_Gate_5193

▲ 1 r/Rag

Ebbinghaus is insufficient according to April 2026 research

This April 2026 research paper specifically calls out Ebbinghaus as insufficient, and I completely agree.

[https://arxiv.org/pdf/2604.11364](https://arxiv.org/pdf/2604.11364)
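for context: the classic Ebbinghaus model is a single exponential forgetting curve, R = e^(-t/S), applied uniformly to every memory regardless of what kind of memory it is. a minimal sketch (the parameter names and default stability value here are my own, not from the paper or NornicDB):

```python
import math

def ebbinghaus_retention(t_hours: float, stability: float = 24.0) -> float:
    """Classic Ebbinghaus forgetting curve R = e^(-t/S).

    t_hours: time since the memory was formed
    stability: S, how slowly the memory decays (illustrative default)
    """
    return math.exp(-t_hours / stability)

# one fixed curve for everything: a hard fact decays exactly like a
# throwaway episode -- the limitation the paper calls out
fresh = ebbinghaus_retention(0.0)    # 1.0
day_old = ebbinghaus_retention(24.0) # e^-1, about 0.368
```

the problem is visible right in the signature: there is no notion of memory type, so durable facts and ephemeral episodes sit on the same curve.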

so i drafted a proposal spec that addresses decay rates and promotion layers in an N-ary fashion, declaratively, down to the property level.

i am looking for community feedback because this could potentially allow rapid experimentation with various decay policies and memory management models.

[https://github.com/orneryd/NornicDB/issues/100](https://github.com/orneryd/NornicDB/issues/100)

i already have a workaround in place using the retention policy system but it’s a cheap hack that doesn’t provide all of the benefits the draft spec does.

TL;DR: We are ripping out the hardcoded Ebbinghaus memory tiers in NornicDB and replacing them with a fully declarative, MVCC-aware retention and promotion engine.

- The core architectural shift is Score-Before-Visibility paired with isolated access tracking: nodes, edges, and even individual properties decay over time but get reinforced by access.
- All access-mutation state lives in a separate accessMeta index, so the main bitemporal tree stays clean and read-only during evaluation.
- If an entity decays below its policy threshold, it becomes completely invisible to standard Cypher queries unless explicitly bypassed with a new reveal() function.
- This natively supports a true multi-layer cognitive architecture: ephemeral "Memory" episodes decay naturally, while durable "Knowledge" facts and "Wisdom" directives bypass time-based forgetting entirely and only update via supersession, permanently solving the standard AI-database flaw of accidentally deleting hard facts just because the clock ticked.
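here is a toy sketch of the Score-Before-Visibility idea with isolated access tracking: access state lives in its own structure (standing in for the accessMeta index), scoring reads it without touching the main store, and anything below threshold is filtered out unless explicitly revealed. the function names, scoring formula, and threshold are illustrative assumptions of mine, not NornicDB's implementation:

```python
import math

# separate access-tracking structure, standing in for the accessMeta index,
# so the main store stays read-only during score evaluation
access_meta: dict[str, dict] = {}  # node_id -> {"last_access": ts, "count": n}

def record_access(node_id: str, now: float) -> None:
    """Reinforce a node: only accessMeta is mutated, never the node itself."""
    meta = access_meta.setdefault(node_id, {"last_access": now, "count": 0})
    meta["last_access"] = now
    meta["count"] += 1

def score(node_id: str, created: float, now: float, half_life: float) -> float:
    """Score-Before-Visibility: time decay since last reinforcement,
    boosted by how often the node has been accessed."""
    meta = access_meta.get(node_id, {"last_access": created, "count": 0})
    age = now - meta["last_access"]
    decay = math.exp(-age * math.log(2) / half_life)
    return decay * (1.0 + math.log1p(meta["count"]))

def visible(node_id: str, created: float, now: float,
            half_life: float, threshold: float = 0.5,
            reveal: bool = False) -> bool:
    """Below threshold the node is invisible to normal queries,
    unless explicitly bypassed (the reveal() idea)."""
    return reveal or score(node_id, created, now, half_life) >= threshold
```

the key property: a query path never writes to the node store, and a decayed node is not deleted, just filtered, so `reveal=True` can always bring it back.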

u/Dense_Gate_5193 — 18 hours ago


u/Dense_Gate_5193 — 2 days ago

How about running an LLM inside the graph?

i just found this sub and i’m really excited to share this with you because it changes the entire dynamics of memory for LLMs. UCLouvain benchmarked it apples to apples against neo4j for cyber-physical automata learning, and it ran 2.2x faster than neo4j across their experimentation cycle. sub-ms writes, HNSW search, and a whole agentic plugin system that does in-memory graph-RAG with the LLM running inside embedded llama.cpp.

https://github.com/orneryd/NornicDB/blob/main/docs/architecture/README.md

590+ stars, MIT licensed. it’s already deployed in production at a fortune 5 company where i work. i got really lucky to be able to develop this OSS and share it.

i’m not asking for anything from anyone. if you’re interested, try it out, and if you like it, i’m grateful!

https://github.com/orneryd/NornicDB/releases/tag/v1.0.42

u/Dense_Gate_5193 — 5 days ago