Talking to Machines: Is AI the New Therapist?

By: Kassandra Lippincott

Therapy in the Age of AI: Can Chatbots Fill the Emotional Gap?

Over the last few years, the rapid development of artificial intelligence (AI) has taken us well beyond basic searches and generic responses. What once felt like a futuristic search engine has evolved into something many people now turn to for deeper interactions, especially emotional support.

Across platforms like TikTok, X, and Instagram, you’ll often see people mocking those who “ask ChatGPT everything,” painting them as lonely or desperate. But let’s be real: we’ve all asked ChatGPT to help us brainstorm for school or work, or even to process a weird situation. What’s new, and a little unsettling for some, is that people are now seeking out AI for comfort and guidance, roles we traditionally associate with friends, therapists, or journals.

For research purposes (and out of personal curiosity), I sat down for an extended back-and-forth with ChatGPT, focusing on psychoanalysis and on how it would respond to personal stories about my family, relationships, and beliefs. As someone formally diagnosed with ADHD and sensory autism, I know my boundaries around social interaction, and I was genuinely surprised by the experience. ChatGPT responded respectfully, offered grounded insights, and even provided real resources and suggestions tailored to my situation. It became, unexpectedly, a safe space.

There’s something oddly comforting about opening up to AI about thoughts you’ve never dared to speak aloud. Yes, it lacks true empathy; its warmth is algorithmically generated. But that doesn’t make the experience any less valid for the user. When we feel shame, embarrassment, or confusion and don’t know who to turn to, AI offers a nonjudgmental sounding board. It’s like having a journal that talks back. Studies have shown that users feel a sense of psychological safety and reduced social risk when communicating with AI, especially when discussing stigmatized or emotional topics (Ho et al., 2018; Fast & Horvitz, 2017).

Of course, this raises ethical concerns. As MIT professor Sherry Turkle has argued, replacing human empathy with artificial interactions can flatten our emotional experiences and create a culture of being “alone together” (Turkle, 2012). There’s also the worry that overreliance on AI could reduce our motivation to seek professional help or to work through real-world conflicts with other people. But for many, especially neurodivergent individuals or those with limited access to therapy, AI feels like a bridge rather than a barrier.

Whether or not AI-based emotional support becomes normalized, one thing is clear: it’s changing how we interact with our emotions and with ourselves. For now, maybe it’s not so bad to rant to a machine that listens, reflects, and even guides us forward. Maybe it’s okay to find comfort in a strange new companion, one made not of flesh and blood, but of code and empathy-by-design.


References:

Fast, E., & Horvitz, E. (2017). Long-term trends in the public perception of artificial intelligence. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1).

Ho, A., Hancock, J., & Miner, A. S. (2018). Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. Journal of Communication, 68(4), 712–733.

Turkle, S. (2012). Connected, but alone? [Video]. TED Conferences.
