u/Bharath720

Building a companion-like app that uses a vector DB to suggest possible undiagnosed mental-health conditions from patterns across conversations

I've been working on an AI-powered chatbot (personal project) that acts more like a long-term companion than a normal assistant. One of the goals is noticing patterns across conversations over time: recurring anxiety spirals, depressive language patterns, social withdrawal signals, mood shifts, etc. Not diagnosing people, obviously, but surfacing behavioral patterns carefully and safely.

What I underestimated was how confusing the retrieval layer gets once conversations become emotional and deeply contextual.

A few problems I’ve run into:

  1. semantic similarity breaks down badly when users talk indirectly. Someone saying “I’m tired” could mean physically tired, emotionally exhausted, depressed, burnt out, or just sleep deprived. Embeddings cluster these together in weird ways.

  2. storing every conversation chunk destroys retrieval quality over time. The DB slowly fills with emotionally similar but contextually useless memories and retrieval starts surfacing the wrong emotional moments.

  3. recency vs importance is difficult. Some memories from 4 months ago matter more than something said yesterday. Simple vector similarity doesn’t really capture emotional significance.

  4. summarization causes personality drift. If you repeatedly compress memories into summaries, the bot slowly starts remembering an interpreted version of the user instead of the actual user.

  5. pattern detection gets dangerous fast. There’s a huge difference between “this resembles an anxiety pattern” and accidentally over-pathologizing normal behavior.
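For problem 4 specifically, the direction I'm leaning is to never let summaries replace raw moments: summaries get their own records that point back at raw chunk IDs, so retrieval can always re-ground in what the user actually said instead of an interpretation of them. A rough sketch (all names here are hypothetical, not a real schema):

```python
from dataclasses import dataclass, field

# Two-tier memory sketch: raw conversational moments are immutable, and
# summaries only reference them by id instead of overwriting them.
# Re-summarizing always re-reads the raw sources, so interpretation
# drift can't compound across summary generations.

@dataclass(frozen=True)
class RawMoment:
    moment_id: str
    text: str          # the user's actual words, stored verbatim
    timestamp: float

@dataclass
class Summary:
    summary_id: str
    text: str                                       # interpreted, compressible
    source_ids: list = field(default_factory=list)  # raw moments it covers

def reground(summary: Summary, store: dict) -> list:
    """Fetch the verbatim moments behind a summary before trusting it."""
    return [store[mid].text for mid in summary.source_ids if mid in store]
```

The point of `reground` is that any downstream step (including pattern detection) can always drop back to verbatim text rather than stacking inference on inference.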

Currently experimenting with hybrid retrieval:

- vector search
- metadata filtering
- emotional weighting
- decayed long-term memory
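Roughly, the blended score I'm experimenting with looks like this (everything here, the names, weights, and half-life, is a placeholder I'm still tuning, not production code):

```python
# Hypothetical combined retrieval score: cosine similarity from the
# vector search, an exponential recency decay, and an emotional-
# significance weight stored as metadata on each memory chunk.

HALF_LIFE_DAYS = 30.0  # decay half-life; purely a tuning knob

def memory_score(cosine_sim: float,
                 age_days: float,
                 emotional_weight: float,
                 w_sim: float = 0.6,
                 w_recency: float = 0.2,
                 w_emotion: float = 0.2) -> float:
    """Blend similarity, recency, and emotional significance into one score."""
    recency = 0.5 ** (age_days / HALF_LIFE_DAYS)  # 1.0 today, 0.5 after a month
    return w_sim * cosine_sim + w_recency * recency + w_emotion * emotional_weight

# An emotionally heavy memory from 4 months ago can outrank a bland
# one from yesterday, which plain vector similarity would never do.
old_important = memory_score(cosine_sim=0.70, age_days=120, emotional_weight=0.95)
new_trivial = memory_score(cosine_sim=0.60, age_days=1, emotional_weight=0.10)
```

In practice I run this as a re-rank over the vector DB's top-k results rather than inside the DB itself.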

It still doesn't feel that strong. Is anyone else building companion-style or therapy-adjacent AI systems? If so, have you faced these issues, and how did you solve them?

A few things I’d love feedback on:

  1. how are you deciding what deserves long-term memory vs temporary context?

  2. are you re-embedding summaries or storing raw conversational moments separately?

  3. how are you preventing retrieval loops where the model keeps reinforcing the same interpretation of the user?

  4. what vector DBs are people actually happy with at scale for conversational memory?
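On question 3, the best I've come up with so far (sketch only, and `retrieval_count` is an assumed per-chunk metadata field, not something a vector DB gives you for free) is damping the score of memories that keep getting surfaced, so one interpretation can't snowball:

```python
# Sketch: shave score off memories that have already been retrieved many
# times, so the model stops reinforcing a single reading of the user.
# retrieval_count would need to be tracked as metadata on each chunk.

def looped_penalty(base_score: float, retrieval_count: int,
                   penalty: float = 0.15) -> float:
    """Each prior retrieval reduces the effective score."""
    return base_score / (1.0 + penalty * retrieval_count)
```

It's crude, and I'd love to hear if anyone has a more principled approach (MMR-style diversity re-ranking is the obvious next thing to try).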

Also, my app does not "diagnose" people directly. It only offers a suggestion, and user discretion is always recommended, since it's an AI doing the judging.

reddit.com
u/Bharath720 — 4 days ago

Account politics being disguised as "product feedback"

It's insane how much politics is disguised as product feedback in B2B SaaS.

Had a customer call this week where three stakeholders completely contradicted each other for 45 minutes straight.

The ops lead wanted more automation because the current workflow was too manual. Their compliance person wanted MORE approval steps because automation made them nervous. Their manager wanted neither, just better reporting so leadership could apparently see what's going on. All of this happened while the actual users barely spoke the whole meeting.

This keeps happening as we move upmarket. The bigger the customer, the less “build what users ask for” works as a strategy. Half the job becomes figuring out who actually has the power to say yes, who feels threatened by change, and whose KPI your feature breaks.

If you listen to all of them equally, you end up building bloated enterprise software nobody actually likes using.

I arrived at the conclusion that B2B SaaS product management is less about feature prioritization and more about organizational psychology.

reddit.com
u/Bharath720 — 5 days ago

Sony, Nintendo grapple with memory price surge as AI boom constrains supply, leading to higher console prices and projected lower sales

Sony and Nintendo are both openly saying memory prices are exploding because AI datacenters are soaking up supply, and now we're getting hit with higher console prices as a result. Switch 2 and PS5 prices have been increased for exactly this reason.

AI infrastructure is competing with regular consumer electronics for the same components, and since memory production takes a long time to scale, there is no quick fix. AI demand is quietly raising prices across the entire tech ecosystem, not just GPUs: consoles, phones, laptops, and we don't know what's next.

reuters.com
u/Bharath720 — 5 days ago

I’ve been wondering about this with the rise of remote and flexible study options. Academically, many programs are treated the same as traditional campus-based ones, but I’m more curious about how things play out in real hiring situations.

Do employers tend to view applicants differently depending on how they completed their studies? Even if it’s not officially stated, is there any noticeable preference? Or has that distinction mostly faded over time?

Would really appreciate hearing perspectives from people who’ve taken different paths, or anyone involved in hiring decisions. How much does the format of education actually matter when it comes to getting a job?

reddit.com
u/Bharath720 — 18 days ago