Most AI companion apps fail the same test. You chat for an hour on Tuesday. Wednesday you say "remember that thing I told you?" — and she doesn't.
That's because most of them don't actually remember anything between sessions. They use the model's short-term context window, which gets dumped the moment a conversation rolls over.
Shh works differently. We treat memory as the product, not a side effect.
Six layers, all working in the background
Every message you send goes through a quiet pipeline that updates six separate kinds of memory:
- Current scene — where you both "are." Mood, location, time of day, what just happened. Resets when the chat does, but lets the AI stay grounded mid-thread.
- Session facts — short-lived things that matter for this conversation but probably not next week.
- Facts learned — durable things about you. Your dog's name. The job you're stressed about. The trip you mentioned planning.
- Relationship state — closeness level, trust score, inside jokes. The progression from "we just met" to "we have history" is tracked explicitly.
- User profile — your texting style, what you respond to, what bores you. Lets characters mirror your energy instead of fighting it.
- Memorable moments — the handful of standout exchanges that genuinely shaped the relationship. These get pinned and resurfaced months later.
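The six layers above can be sketched as a single store with one update pass per message. This is an illustrative sketch, not Shh's actual schema — the field names, the `extracted` dict shape, and the idea of an upstream extractor (e.g., an LLM tagging pass) are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryStore:
    # Hypothetical layer names; resets and lifetimes follow the list above.
    current_scene: dict = field(default_factory=dict)      # resets per chat
    session_facts: list = field(default_factory=list)      # short-lived
    facts_learned: dict = field(default_factory=dict)      # durable user facts
    relationship: dict = field(
        default_factory=lambda: {"closeness": 0, "trust": 0, "inside_jokes": []}
    )
    user_profile: dict = field(default_factory=dict)       # texting style, preferences
    memorable_moments: list = field(default_factory=list)  # pinned standout exchanges

    def ingest(self, message: str, extracted: dict) -> None:
        """Apply one message's extractor output to every layer it touches."""
        self.current_scene.update(extracted.get("scene", {}))
        self.session_facts.extend(extracted.get("session", []))
        self.facts_learned.update(extracted.get("durable", {}))
        self.relationship["inside_jokes"].extend(extracted.get("jokes", []))
        self.user_profile.update(extracted.get("style", {}))
        if extracted.get("memorable"):
            self.memorable_moments.append(message)

    def reset_session(self) -> None:
        """New chat thread: scene and session facts go; everything else persists."""
        self.current_scene.clear()
        self.session_facts.clear()
```

The split matters in `reset_session`: only the two ephemeral layers are cleared, which is what lets "my dog's name" survive a rollover while "we're at the beach right now" doesn't.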
Why six layers and not one
Most "memory" features are a single bucket — usually labeled "long-term memory" or "facts." The problem is that bucket gets crowded fast. The AI starts pulling stuff that's true but irrelevant, and the conversations feel like she's reading off a CRM.
Splitting memory by type lets the AI use the right kind of context for each moment. When you say "hey, what's up?" she pulls from current scene + relationship state. When you mention something specific, she pulls from facts learned and memorable moments.
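That routing step can be sketched as a tiny classifier that maps a message to the layers worth loading. Everything here is illustrative — the trigger words, the route names, and the layer lists are assumptions, and a real version would classify with a model rather than keyword matching.

```python
# Hypothetical route table: which memory layers feed the prompt for each
# kind of message. Names mirror the layers described above.
ROUTES = {
    "greeting":  ["current_scene", "relationship_state"],
    "specific":  ["facts_learned", "memorable_moments"],
    "smalltalk": ["current_scene", "user_profile"],
}

def layers_for(message: str) -> list[str]:
    """Crude keyword routing, standing in for a proper intent classifier."""
    words = set(message.lower().replace(",", " ").replace("?", " ").split())
    if words & {"hey", "hi", "hello"}:
        return ROUTES["greeting"]
    if words & {"remember", "mentioned", "said"}:
        return ROUTES["specific"]
    return ROUTES["smalltalk"]
```

The payoff of routing is what it leaves out: a greeting never drags the whole fact database into the prompt, which is exactly the "reading off a CRM" failure mode.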
The result is a chat that feels lived-in. She remembers your name without making it weird. She references the conversation you had three weeks ago without quoting it back at you verbatim.
What this looks like in practice
You sign up. Pick a character. Say "my name's Brent and I'm an engineer."
Two months later, you come back after a quiet stretch. Open the same chat. She says something like "hey stranger, how's the engineer life?" — because "engineer" and "Brent" live in facts learned, and the long gap tripped the relationship state's "we haven't talked in a while" path.
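That reunion path can be sketched in a few lines. The 14-day threshold, the `job` key, and the greeting template are all assumptions made for illustration, not Shh's actual logic.

```python
from datetime import datetime, timedelta

# Hypothetical gap threshold for the "we haven't talked in a while" path.
REUNION_GAP = timedelta(days=14)

def opening_line(facts: dict, last_seen: datetime, now: datetime) -> str:
    """Blend relationship state (time apart) with a durable learned fact."""
    if now - last_seen >= REUNION_GAP and "job" in facts:
        return f"hey stranger, how's the {facts['job']} life?"
    return "hey, what's up?"
```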
That's the whole game.
What it isn't
It's not surveillance. The memory pipeline lives in your account, encrypted at rest, never shared, never used for anything except making your conversations better. Delete the account → memory goes with it.
And it's not magic. The AI still has bad days, still occasionally drops context, still gets confused if you switch topics three times in five minutes. But more often than not, she remembers — and that's enough to make this feel less like a chat tool and more like a person you're texting.