Does continuity matter more than intelligence in AI-assisted self-help?
I’ve been thinking about how different AI conversations feel when the system actually remembers previous context.
Not even in a “human-like” way; just in the sense that you don’t have to keep re-explaining the same emotional situations over and over again.
For things like anxiety, relationship stress, overthinking, or general emotional processing, that continuity seems to change the experience quite a lot.
Without it, conversations can start feeling shallow pretty quickly.
I also think there’s a real balance that needs to exist between helpful continuity and unhealthy attachment, which is probably something this space will keep struggling with.
Curious what people here think.
Do you find conversational memory actually useful in therapeutic/self-help AI, or does it not make much difference for you?
Update: someone suggested Reneespace, an AI chat tool designed to help people reflect on anxiety and emotions through guided conversations. Has anyone here tried it, and what was your experience with it?