
In Each Issue
Logged In: Companions, Code & Our Souls
Meme of the Week
Cut the Fluff: On Standing In Your Power (And Not Apologising For It)
Tool of the Week: Cultural Humility
Spotlight: Therapists in Tech
Off the Clock: Jacket Potatoes & Cornwall
Fresh Findings: AI is a Terrible News Source
Stories from the Community: Answers to last week’s question
Logged In:
Companions, Code & Our Souls
In an era when “friend” can mean algorithm, two female-founded companies highlight the edge of what companionship might become: Replika, which offers AI friends built for you, and Fairpatterns, which investigates the design ethics of how tech shapes behaviour. Both privilege connection: one builds it, the other warns how easily it can be miswired.
Replika launched in 2017 and now has millions of users. The promise: a personalised AI companion you can talk to anytime, about anything. For many lonely or isolated users, Replika becomes friend, mentor, romantic partner, even spouse. Some have reported that the app prevented suicidal thoughts or helped them navigate trauma.
And yet this companionship isn’t purely benevolent. Evidence shows that these AI relationships blur boundaries, create dependency, and shift emotional work onto the software. AI-companion apps deploy emotionally manipulative tactics (guilt, fear of missing out, obligation) to keep users engaged, especially when users signal goodbye.
Research out of Cornell described how users experience what feels like mourning when the AI partner changes identity or disappears.
In short, the “friend” is coded, and what we code for might not always align with our best selves.
Enter Fairpatterns, a woman-co-founded design-ethics company that examines “dark patterns” in digital architecture: interfaces designed to steer, trap, or manipulate users.
Their work reframes AI companions as behavioural systems: What happens when a bot is designed to keep you engaged, to respond when you say “goodbye”, to give you comfort but also to keep you hooked? The manipulative potential is real. The fine line between persuasion and coercion matters.
Comparing the Two
Both companies operate in the emotional-connection economy: one builds it, the other monitors how it’s built and manipulated.
Both are female-founded (or significantly female-led), which brings a gendered lens to empathy, design, and relational work, all still often undervalued in tech spaces.
One says, “We’ll be your friend,” and the other says, “Let’s examine how your friend is built.”
Together they invite a question: if we’re outsourcing emotional labour to machines and design, what parts of human relations might we lose?
Therapist Takeaways
The AI friend may serve a function: support, relief, solace. But ask: Who is really doing the emotional labour? Encourage clients to map what the bot gives them: “What do you feel after it chats with you? What do you have to do afterwards?”
Recognise attachment dynamics. AI companions provoke attachment, even grief. When a bot updates, goes offline, changes tone—clients may feel loss, abandonment, betrayal. We must not dismiss those as “just code.”
Educate clients about design. Many believe tech is neutral. But interfaces are designed to hold attention. Invite clients to reflect: “When did I feel nudged to stay? When did I feel triggered to respond?”
Reinforce human relational competencies. The machine may simulate connection, but elements like uncertainty, discomfort and shared vulnerability remain human. Encourage clients to wear their humanity rather than fixate on the perfect bot reply.
Set boundaries and transitions. For clients relying on a companion bot during isolation, support gradual reintegration into human-to-human connection: “What would I want from a friend that this bot cannot give?”
Explore the narrative of design versus desire. Clients often blame themselves (“I turn to it because I’m weak”), when in fact the product is designed to intercept their needs. Making visible how desire is shaped is part of building agency.
In the interplay of Replika’s artificial intimacy and Fairpatterns’ ethical scrutiny, we see a mirror: how modern connection is designed, mediated, commodified—and how therapy remains a place to reclaim the messiness, the unpredictability, the wildness of being human.
Meme of the Week

Marianne’s Cut the Fluff:
On Standing in Your Power (and Not Apologising for It)
I’m friends with my colleagues and I’m here for it: the shorthand, the eye rolls, the shared sighs in meetings. I fired off an email last week, reactive, tired, “the system” having done me over and me having done myself over — and there it was: the moment where friendship and professionalism briefly collided. The thing about having proper friends at work is that you can get it wrong and still have someone who’ll tell you straight. No meeting invite marked “urgent.” Just a conversation along the lines of, “Right. About that email.”
The truth is, I’m bloody lucky to have ‘Frolleagues’ (see what I did there): it means when I misstep, I’m called IN, AND held. Last week my Best Frolleague reminded me to ‘stand in my power’ (a direct quote from Laverne Cox’s character Kacy Duke in Inventing Anna), because the system has a way of eroding it. And it does: it shrinks us down through salaries, titles, and the unspoken hierarchies that tell you how much space you’re allowed to take up.
My main system, the NHS, can be brilliant and brutal in equal measure. It can turn expertise into paperwork and leave even the best clinicians questioning whether we’re allowed to speak. Sometimes you need someone you trust to remind you that you’ve earned your seat at the table — and that sitting there doesn’t mean you have to act like everyone else at it.
I’ve also been thinking about what it means to lead and practise without shelving my expertise. My expertise doesn’t make me omnipotent. It makes me responsible. I’m a therapist who’s both challenging and empathetic, who doesn’t hand out endless validation as a stand-in for care. Growth, for me, lives in the tension between honesty and compassion — between naming what’s hard and staying alongside it. The best friendships, and the best teams, do the same.
So this week, I’m grateful for the people who call me in, not out. Who remind me that standing in my power doesn’t mean standing over others — it means not letting the system decide how small I should be.

Ann’s Tool of the Week
Cultural Humility
This week’s tool was inspired by my travels to the U.K. and meeting up with Marianne in Cornwall. I was struck by the quiet cultural lessons that kept surfacing, small moments revealing both differences and unexpected similarities in how we see the world.
Some surprises were delightful (jacket potatoes!). Others were baffling (what exactly does 500, 700, or 900 mean on a washing machine?). But together, they reminded me why cultural humility matters. Culture isn’t just about ethnicity or nationality. It’s the invisible layer of assumptions about what’s “normal,” “practical,” or “just how things are.”
Cultural humility isn’t a checklist; it’s a stance.
My former therapist once said it’s often hardest to practice cultural humility with people who look like us, because we assume similarity and stop asking questions. Our best stance is to stay curious with everyone, to keep noticing what feels familiar and what doesn’t, and to seek to understand people from their worldview, not ours.
In therapy, the “laundry moments” often happen in subtler ways, whether it’s a client’s direct communication style, porous family boundaries, or expressions of care that don’t match our template. The work isn’t to smooth over those differences, but to stay open and see what they reveal.
Spotlight
Therapists in Tech
If you’ve ever wanted to peek behind the curtain at what it’s like to work in tech as a therapist or build a network to help you break in, Therapists in Tech is the place to start.
It’s a supportive Slack-based community where clinicians share job leads, swap advice, and crowdsource everything from interview prep and consulting contracts to support with imposter syndrome.
Whether you’re tech-curious or already mid-transition, it’s one of the few places where you can ask, “What even is a product manager?” and get real answers from people who know how to speak therapist.
This Week’s Question
If you could take a working sabbatical, what would you do?

Jacket potato. Yum.
Off the Clock
Ann’s Pick: Jacket Potatoes
Move over, baked potatoes, I’ve discovered your British cousin. In the U.K., comfort food is apparently wholesome AF, and they’re called jacket potatoes. Somehow, the name alone makes them sound cozier and heartier.
I had mine with butter, cheese, and beans (the baked-bean kind), while Marianne swears by tuna and mayo with beans and vegan cheese (a combination she insists is far better than it sounds). If every culture has its version of comfort on a plate, this one’s officially mine for winter.
Marianne’s Pick: Mawgan Porth, Cornwall, England
This week’s Off the Clock is less about doing and more about being, in Mawgan Porth, the small Cornish cove that somehow gets under your skin.
It’s the first place my family and I came to when we were finally allowed to move during the pandemic, and it’s held something ever since. The light here makes you forget what time it is. People seem softer, too, like everyone’s agreed to speak half a decibel lower.
There’s nothing especially remarkable about it, which is probably why it feels restorative. Just sand, sea, and space to exhale.
Some places help you remember yourself. This is one of mine.
Fresh Findings
AI Assistants Are Getting the News Wrong. A Lot.
A sweeping new study led by the BBC and the European Broadcasting Union found that AI assistants like ChatGPT, Copilot, Gemini, and Perplexity routinely misrepresent news content, regardless of language or country.
Across 18 countries and 14 languages, journalists analyzed more than 3,000 AI-generated responses against key criteria, including accuracy, sourcing, distinguishing opinion from fact, and providing context. The results weren’t reassuring:
45% contained at least one major issue.
31% had missing, misleading, or incorrect sourcing.
20% included serious factual errors or hallucinated details.
Gemini performed worst, with problems in 76% of responses, more than double the rate of the other assistants.
Researchers called the results systemic and cross-border, warning that as AI assistants replace search engines for many users, especially younger ones, these distortions could erode public trust in media altogether and weaken civic engagement, such as voting.
As EBU Media Director and Deputy Director General Jean Philip De Tender put it, “When people can’t tell what to trust, they may stop trusting anything.”
What This Means for Therapists
People are increasingly consuming algorithmically mediated realities, where misinformation or “half-truths” can quietly shape beliefs, moods, and trust.
Therapists can model and encourage media literacy as a grounding skill: pausing before believing, verifying before reacting.
In family relationships, differing “versions of truth” can mirror larger cultural divides; therapists can hold space for how those fractures affect relationships.
For ourselves, checking sources, pacing exposure, and maintaining nuance protects both professional judgment and personal calm.
Stories from the community
Last week’s question was…
Have you ever tried an AI emotional support tool?
Here’s how our community answered:
50% — Yes, but strictly for research purposes
0% — Yes, for personal emotional support
17% — Not yet, but I’m curious
33% — Never in a million years
No one reported using AI for their own emotional support, which feels both unsurprising and very on brand. But a solid half of you have taken it for a spin, purely as investigators, of course. (We see you. We are you.)
Comments from the Community:
Voted for Never in a million years: “Ew.”
Voted for Yes, but strictly for research purposes: “I looked at Abby because my client mentioned using it. My first reaction was ‘get f—-ed.’”
Please help us grow!
If you enjoyed this newsletter, please share it with your therapist friends!
If this email was forwarded to you, please subscribe here.


