The Algorithm Has Better Boundaries
Therapy promises a safe space to heal, but many clients spend years talking to someone who feels more like a professional wall than a person. The therapist nods, asks careful questions, reveals nothing. The client leaves each session wondering if they’re being heard or just processed.
This wasn’t an accident. Early psychoanalysis taught therapists to be blank slates. The theory was that revealing anything personal might contaminate the client’s healing. Keep your distance. Stay neutral. Let them project onto you.
It worked when experts held unquestioned authority. Patients didn’t expect transparency. They expected to be fixed.
But something shifted. Research started showing that the trust between therapist and client predicts successful therapy better than technique or credentials do. Yet the professional mask remained. Therapists still hide behind training that equates emotion with unprofessionalism and boundaries with distance.
Clients from marginalized communities report this gap most acutely. When your therapist doesn’t share your cultural experience and won’t acknowledge their own perspective, you’re left translating your life into language they might understand. That’s not healing. That’s performing.
Now AI chatbots are entering mental healthcare. They offer structured responses, evidence-based techniques, twenty-four-hour availability. Some clients prefer them. At least the algorithm doesn’t pretend to care.