Therapy, Technology, and Trust: What Are We Afraid Of?
- Ever Human Therapy

- Jun 27
- 4 min read
Before the 1940s, recording a therapy session was unthinkable. The therapy room was sacred: private, protected, and deeply personal. Without recordings, though, we had little visibility into what actually happened in those spaces. Research relied on theory and scribbled notes, and our understanding of what truly facilitated change was largely speculative.
Then came Carl Rogers.
He wasn’t perfect, but he did something quietly radical: he recorded real therapy sessions. With consent. With care. And yes, with controversy. Many clinicians at the time were horrified, believing it would rupture the safety of the therapeutic relationship. But Rogers believed that if we truly wanted to understand how people grow, we had to be willing to look closely, ethically, and with humility.
Recording gave us more than research. It brought us accountability. It helped therapists reflect on their own work, sharpen their skills, and listen more deeply. It offered a new window into the complexity of the human experience—one that continues to inform how we train, supervise, and support both clients and clinicians today.
Eighty years later, we’re standing at another threshold—one that brings AI into the room and challenges us again to reimagine the relationship between therapy, technology, and trust.
AI can now transcribe, summarize, even analyze therapy sessions. And just like Rogers’ tape recorder once did, its presence stirs up a mix of curiosity, skepticism, and fear. That fear isn’t irrational. Questions around consent, privacy, and power aren’t afterthoughts; they’re central. They should be asked out loud and answered clearly. But fear alone doesn’t get the final say.
I used to assume that recording a session would change everything, that it would make clients self-conscious or guarded. Then I had a medical appointment where my doctor asked to record our conversation. I agreed, and forgot about it almost immediately. I was so focused on what I needed to say that the recording faded into the background. That moment stayed with me. It reminded me that our discomfort with being observed isn't always as fixed or predictable as we think.
But let’s not pretend therapy is the same as a medical consult. It’s not. The stakes are different. The vulnerability is different. The intimacy is different. And we’re practicing in a world that’s layered with algorithmic surveillance, fractured trust, and collective burnout. So no—we don’t yet fully understand how people are coping with this moment. And that makes sense.
Still, I can’t help but wonder: could AI—used carefully, transparently, with full and pressure-free consent—help us see something we’ve never seen before? Not as a substitute for therapy, but as a new lens through which to understand the work? A way to study what we miss in real time?
I don’t know. But I do know this: the question shouldn’t be whether we use AI. The question is how we stay honest while we even consider it.
To be clear: if a therapist can’t answer basic questions (Who owns the data? How is it stored? Who has access? What are the real risks?), then it’s not time. Period.
But even if those questions are answered, there's another that matters just as much:
Do you trust your therapist?
Because sitting across from someone who wants to record your grief, your shame, your unspoken pieces, takes a hell of a lot of trust. And if that trust isn’t there? That’s the rupture. Not the AI. And that’s worth saying out loud.
Let’s not gloss over it with tech ethics language or paperwork or “best practices.” Let’s name what it is. Let’s ask what else clients may have agreed to that didn’t feel right in their gut. Let’s be real about how much power therapists hold in the room, especially when introducing something as loaded as surveillance—even if well-intentioned.
If you don’t want to be recorded, don’t be recorded. You don’t owe anyone a reason. You don’t have to justify your no. And if your therapist can’t hold that boundary with respect and care? That’s not a tech issue. That’s a trust issue. Find someone who makes you feel safe.
And if you do trust your therapist—if they bring a tool into the room with humility, transparency, and a true willingness to dialogue, not dictate—then maybe there’s space to explore. Not because it’s trendy. Not because it’s efficient. But because therapy has always evolved when we’ve been brave enough to ask harder questions and stay in the tension together.
So what now?
Don’t panic.
Ask questions.
Stay engaged.
I keep coming back to this: Technology isn’t the scary thing. Or... maybe it is. But it’s still the hands that control it that matter most.
Whether you’re a therapist or a client, the future isn’t waiting around the corner. It’s already here.
So if AI enters the room—what would you need to feel safe? And if you're the one bringing it in—what would you need to feel clear?
Let’s not pretend this is simple. But let’s not pretend it’s brand new either.
We’ve been here before. And what we choose now—like then—will shape everything that comes next.
Post Written by Stephanie Pelland