Ebbinghaus is insufficient according to April 2026 research
This April 2026 research paper specifically calls out the Ebbinghaus forgetting curve as insufficient, and I completely agree.
[https://arxiv.org/pdf/2604.11364](https://arxiv.org/pdf/2604.11364)
So I drafted a proposal specification that addresses decay rates and promotion layers in an N-arity fashion, declaratively, down to the property level.
I'm looking for community feedback, because this could enable rapid experimentation with various decay policies and memory-management models.
[https://github.com/orneryd/NornicDB/issues/100](https://github.com/orneryd/NornicDB/issues/100)
I already have a workaround in place using the retention policy system, but it's a cheap hack that doesn't provide all of the benefits the draft spec does.
TL;DR: We're ripping out the hardcoded Ebbinghaus memory tiers in NornicDB and replacing them with a fully declarative, MVCC-aware retention and promotion engine.

- The core architectural shift is Score-Before-Visibility paired with isolated access tracking: nodes, edges, and even individual properties decay over time but are reinforced by access, with all access-mutation state held in a separate accessMeta index so the main bitemporal tree stays clean and read-only during evaluation.
- If an entity decays below its policy threshold, it becomes completely invisible to standard Cypher queries unless explicitly bypassed with a new reveal() function.
- This natively supports a true multi-layer cognitive architecture: ephemeral "Memory" episodes decay naturally, while durable "Knowledge" facts and "Wisdom" directives bypass time-based forgetting entirely and update only via supersession, permanently solving the common AI-database flaw of deleting hard facts just because the clock ticked.
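To make the model concrete, here's a minimal sketch of the score-before-visibility idea in Python. This is purely illustrative: the class names, the exponential half-life decay formula, the `reinforcement` boost, and the policy parameters are all my assumptions for explanation, not the actual draft spec or NornicDB internals.

```python
class DecayPolicy:
    """Illustrative per-layer policy (hypothetical parameters, not the spec)."""
    def __init__(self, half_life_s, threshold, time_decays=True):
        self.half_life_s = half_life_s
        self.threshold = threshold
        self.time_decays = time_decays  # False for durable Knowledge/Wisdom layers

# Hypothetical layer policies: "Memory" decays, "Knowledge" never does.
MEMORY = DecayPolicy(half_life_s=86_400, threshold=0.2)
KNOWLEDGE = DecayPolicy(half_life_s=0, threshold=0.0, time_decays=False)

class Entity:
    def __init__(self, policy, now):
        self.policy = policy
        # In the draft spec this access state would live in a separate
        # accessMeta index, keeping the bitemporal tree read-only.
        self.last_access = now
        self.reinforcement = 1.0

    def touch(self, now, boost=0.5):
        """Access reinforces the score instead of mutating the main tree."""
        self.reinforcement += boost
        self.last_access = now

    def score(self, now):
        if not self.policy.time_decays:
            return 1.0  # durable layers bypass time-based forgetting
        age = now - self.last_access
        # Assumed exponential decay with access reinforcement.
        return self.reinforcement * 0.5 ** (age / self.policy.half_life_s)

    def visible(self, now, reveal=False):
        """Score-Before-Visibility: evaluated before any query sees the entity."""
        return reveal or self.score(now) >= self.policy.threshold
```

A decayed "Memory" entity simply disappears from normal queries (the `reveal=True` path stands in for the proposed `reveal()` Cypher function), while a "Knowledge" entity stays visible no matter how much time passes.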