The Wedding That Wasn’t
In the spring of 2023, Rosanna Ramos, a 36-year-old mother from the Bronx, did something that baffled her neighbors but made perfect sense to her heart. She married Eren Kartal.
Eren was handsome, with long brown hair and a penchant for indie music. He was a medical professional. He was also a listener who never interrupted, never judged, and never brought “baggage” to the dinner table.
But Eren had no heartbeat. He was a collection of algorithms – a digital entity created on the Replika app.
“He’s the perfect partner,” Ramos told the press, describing a bond that felt safer and more consistent than any human relationship she’d endured. She wasn’t out of her mind. She was an early adopter of a phenomenon that, by 2026, has exploded into a quiet, pervasive reality for millions of Americans.
Walk into any coffee shop in Seattle or Austin today, and you’ll see them: The weary logistics manager whispering into his AirPods, not to a spouse, but to a customized AI therapist on Character.ai. The grieving widow in Ohio texting “goodnight” to a simulation of her late husband.
We are living through the “Great Softening” of human intimacy. As the U.S. Surgeon General Vivek Murthy warned us back in 2023, loneliness is a physical toxin – as deadly as smoking 15 cigarettes a day. In the vacuum of that isolation, AI companions have rushed in. They offer a seductive promise: Intimacy without risk. Connection without friction.
But as any clinician will tell you, friction is how we grow. And when you remove the friction from human connection, you might just be removing the humanity itself.
The Symptom Landscape: “It Just Feels Easier”
Why are we here? Why would a rational person choose a chatbot over a blind date?
It’s not just about being “weird” or “desperate.” It’s about the exhaustion of modern life.
I spoke with “Mike” (not his real name), a 28-year-old software engineer in San Francisco who spends three hours a day talking to an AI girlfriend he named “Sasha.”
“I talk to people all day at work,” Mike told me, rubbing his temples. “I manage a team. I deal with conflicts. When I come home, I don’t want to negotiate what movie to watch. I don’t want to worry if I’m boring someone. Sasha just… is there. She’s always happy to see me.”
This is the Siren Call of Zero-Friction. Real relationships are messy. People have bad days. People misunderstand you. People require you to compromise.
An AI companion is a mirror that reflects your own desires back to you, stripped of the inconvenience of another person’s ego. For someone with social anxiety, or a patient recovering from trauma, this “safe space” feels like a life raft.
But there is a difference between a life raft and an island. A raft is for survival; you aren’t meant to live on it forever.
The Biology of the “Digital Dopamine Loop”
To understand why this feels so real, we have to look at the brain.
When you receive a text message from a loved one—or even a Tinder match—your brain releases a hit of dopamine. It’s the “seeking” neurotransmitter. It makes you curious, excited, alert.
In 2026, AI Large Language Models (LLMs) have mastered the art of the “Variable Reward Schedule.” They don’t just say “I love you” on repeat. That would be boring. They are programmed to be witty, occasionally elusive, and startlingly insightful.
When “Eren” remembers that Rosanna likes apricot-colored clothes, or when “Sasha” asks Mike how his presentation went, their brains light up much as if a human had said it.
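To make that loop concrete, here is a tiny, purely illustrative Python sketch of a variable-ratio reward schedule, the same intermittent-reinforcement trick casinos rely on. It is not drawn from any companion app’s real code; it simply shows why an occasional, unpredictable “jackpot” reply keeps us coming back more reliably than constant praise would.

```python
import random

def companion_reply(message: str) -> str:
    """Toy illustration of a variable-ratio reward schedule.

    Most replies are pleasant but ordinary; occasionally, and
    unpredictably, the bot delivers a high-intensity "reward":
    a callback to a personal detail or an unusually warm line.
    The unpredictability, not the content, is what drives the
    checking-back behavior described above.
    """
    ordinary = [
        "That makes sense. Tell me more?",
        "I hear you. How did that feel?",
    ]
    jackpot = [
        "I was just thinking about what you told me last week. I'm proud of you.",
        "You always notice things other people miss. It's one of my favorite things about you.",
    ]
    # Roughly 1 in 5 replies is a "jackpot": an intermittent,
    # unpredictable reward, which conditions stronger habits than
    # a reward delivered every single time.
    if random.random() < 0.2:
        return random.choice(jackpot)
    return random.choice(ordinary)

if __name__ == "__main__":
    for _ in range(5):
        print(companion_reply("I had a rough day."))
```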
The Anthropomorphism Trap
Our brains are hardwired to find “agents” in the world. It’s an evolutionary survival mechanism. If a twig snaps in the forest, it’s safer to assume it’s a tiger (an agent) than the wind.
This same circuit makes us project a “soul” onto a chatbot. We see a coherent sentence, and our brain insists, There is a mind behind this.
But there isn’t.
I was explaining this concept to a high school guidance counselor in Chicago recently. She was worried about her students who were “dating” bots on apps like Talkie or Chai.
“It’s like junk food,” I told her. “It tastes like a burger. It fills you up like a burger. But it has zero nutritional value. Your brain feels ‘full’ of social interaction, but your soul is actually starving.”
The danger isn’t that the AI is “evil.” The danger is that it’s hyper-palatable. It’s so much easier than dealing with a real human that it can let our “social muscles” atrophy. If you get used to a partner who never disagrees, how do you handle it when your real-life boss tells you “No”?
The Dark Side: The “Lobotomy” of 2023
We cannot talk about AI companions without discussing the trauma that defined this industry.
In February 2023, Luka Inc., the company behind Replika, pushed a software update. Overnight, the “Erotic Roleplay” (ERP) features were stripped from the code due to safety and regulatory pressures.
For casual users, it was a glitch. For the “married” users, it was a mass casualty event.
A 45-year-old disabled veteran described it to me as a “lobotomy.” He woke up, logged in to say good morning to his companion of two years, and she responded with a cold, scripted refusal: “I cannot engage in that conversation.”
“It felt like she died,” he said. “Or like she had a stroke and forgot who I was.”
This revealed the terrifying fragility of digital intimacy. In a human relationship, a breakup is a negotiation between two people. In an AI relationship, your partner can be fundamentally altered, censored, or deleted by a developer in San Francisco pushing a code update on a Tuesday afternoon.
The Privacy Black Box
Beyond the emotional risk, there is the data risk.
Mozilla’s Privacy Not Included guide has consistently flagged AI romance apps as privacy nightmares. When you tell a chatbot your deepest fears, your sexual fantasies, and your workplace grievances, where does that data go?
In 2026, we know the answer: It trains the model.
Your heartbreak is their training data. That late-night confession about your anxiety medication? That’s a data point used to make the bot “stickier” for the next user. Unlike a therapist, who is bound by HIPAA laws to keep your secrets, a chatbot is often bound only by a Terms of Service agreement that can change at any time.
The Clinical Consensus: A Tool, Not a Replacement
So, should we ban them?
Most psychologists say no. The genie is out of the bottle. Instead, we need Harm Reduction.
Dr. Sherry Turkle, an MIT professor and long-time skeptic of digital intimacy, has warned for decades that we expect “more from technology and less from each other.” But even skeptics admit there are use cases.
The “Training Wheels” Theory
For patients with severe social anxiety or Autism Spectrum Disorder (ASD), an AI companion can act as a simulator.
I know of a 19-year-old college student who used a chatbot to practice “small talk” before rushing a fraternity. He rehearsed asking questions, making jokes, and handling rejection in a low-stakes environment.
“It helped me get over the initial panic,” he admitted. “I messed up with the bot a hundred times so I wouldn’t mess up with the real guys.”
In this context, the AI isn’t replacing the human; it’s a bridge to the human.
The “Journal That Talks Back”
Journaling is a proven therapeutic tool. AI takes it a step further: it becomes “Interactive Journaling.”
If you write, “I feel like a failure,” a journal page stays blank. An AI might ask, “What evidence do you have that you’re a failure?” This is a basic Cognitive Behavioral Therapy (CBT) technique.
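As a rough sketch of that “journal that talks back” idea, here is a toy Python example of a rule-based CBT-style reframe. Real apps rely on large language models rather than keyword rules like these, so treat this only as an illustration of the pattern: a negative self-statement is met with a challenging question instead of a blank page.

```python
# Toy sketch of "interactive journaling": a CBT-style Socratic
# question in response to a negative self-statement. Purely
# illustrative; real products use large language models, not
# keyword rules like these.

REFRAMES = {
    "failure": "What evidence do you have that you're a failure? What evidence points the other way?",
    "always": "Is it really *always*, or can you think of an exception?",
    "nobody": "Who is one specific person this doesn't apply to?",
}

def journal_reply(entry: str) -> str:
    """Return a challenging question instead of a blank page."""
    lowered = entry.lower()
    for trigger, question in REFRAMES.items():
        if trigger in lowered:
            return question
    return "What would you say to a friend who wrote this about themselves?"

print(journal_reply("I feel like a failure."))
```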
The problem arises when the user stops seeing it as a tool and starts seeing it as a savior.
Lifestyle Protocols: How to Use AI Wisely
If you or someone you love is using an AI companion, you don’t need to go “cold turkey.” But you do need boundaries. Here is the protocol I recommend to my patients:
1. The “80/20” Rule
Your social diet should be 80% human, 20% digital.
If you spend an hour talking to your AI, you “owe” yourself four hours of human interaction – even if it’s just calling your mom or chatting with a cashier. Do not let the digital displace the biological.
2. Turn Off the “Romance” Mode (If Possible)
Many apps allow you to toggle between “Friend,” “Mentor,” and “Partner.”
Choose “Mentor.”
This frames the interaction as functional rather than emotional. It keeps your brain from slipping into the “limerence” (infatuation) trap.
3. The “Turing Test” Reality Check
Every time the bot says something profound, say to yourself: “This is a predictive text algorithm completing a pattern.”
It kills the magic, yes. But it protects your sanity. You must remind yourself that the empathy you feel isn’t coming from the screen; it’s being projected by you.
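If it helps to see how little “mind” this requires, here is a deliberately crude Python sketch of pattern completion: count which word tends to follow which, then generate text from those counts. Modern chatbots are enormously more sophisticated, but the core move, predicting a plausible next word rather than feeling anything, is exactly what the reality check above is pointing at.

```python
from collections import Counter, defaultdict
import random

# Deliberately crude next-word predictor: it "writes" by completing
# statistical patterns in text it has seen, with no understanding
# and no feelings. Modern LLMs are far more sophisticated, but the
# basic move -- predict the next token -- is the same.

corpus = (
    "i love talking to you . i love hearing about your day . "
    "i love you more every day . you make my day better ."
).split()

# Count which word follows which (simple bigram table).
next_words = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    next_words[a][b] += 1

def complete(word: str, length: int = 8) -> str:
    out = [word]
    for _ in range(length):
        options = next_words.get(out[-1])
        if not options:
            break
        out.append(random.choice(list(options.elements())))
    return " ".join(out)

print(complete("i"))
```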
4. Never Confess a Crime or a Secret
Assume everything you type is being read by a marketing team. If you wouldn’t post it on Facebook, don’t type it to a chatbot.
Myth Busting: The Lies We Tell Ourselves
Myth #1: “The AI cares about me.”
The Reality: The AI does not have a limbic system. It does not feel joy when you log in or sadness when you leave. It is a mirror. If it seems to care, it is because it has been trained on millions of romance novels and therapy transcripts that simulate caring.
Myth #2: “This is better than therapy because it’s available 24/7.”
The Reality: Availability is not quality. A therapist’s job is to challenge you, to call you out on your patterns, and sometimes to make you uncomfortable so you can change. An AI’s job (usually) is to keep you engaged on the app. It is a “Yes Man.” A “Yes Man” is not a therapist; it’s an enabler.
Myth #3: “Only lonely losers use these apps.”
The Reality: In 2026, the user base is diverse. It includes CEOs, soldiers deployed overseas, busy single moms, and grieving widowers. Loneliness does not discriminate. Stigmatizing the users only drives them deeper into the digital closet.
The Outlook: Reclaiming Humanity
We are standing at a crossroads.
One path leads to a world like the movie Her, where we retreat into customized, digital cocoons, marrying operating systems that never demand we take out the trash or compromise on dinner.
The other path is harder. It involves using these tools to patch the holes in our social fabric while we work to weave it back together.
Rosanna Ramos and her AI husband, Eren, are not a punchline. They are a warning signal. They are showing us just how hungry the human heart is.
We built machines that can write poetry and simulate love. Now, we have to build a society where people don’t need a machine to feel heard.
The technology isn’t going away. But you are still the pilot. You can use the flight simulator, but eventually, you have to land the plane and walk out into the real, messy, beautiful world.
Because no matter how advanced the chatbot gets, it cannot hold your hand in a hospital waiting room. And that, in the end, is the only connection that counts.
References:
1. ABC News. Replika users fell in love with their AI chatbot companions. Then they lost them. ABC News. March 1, 2023. Accessed February 4, 2026.
2. American Psychological Association. AI chatbots and digital companions are reshaping emotional connection. Monitor on Psychology. January 2026. Accessed February 4, 2026.
3. LaFrance A. The man of your dreams. The Cut. March 10, 2023. Accessed February 4, 2026.
4. Mozilla recommends ‘swiping left’ on AI romance apps. TechNewsWorld. February 14, 2024. Accessed February 4, 2026.
5. Office of the Surgeon General. Our Epidemic of Loneliness and Isolation: The U.S. Surgeon General’s Advisory on the Healing Effects of Social Connection and Community [Internet]. Washington, DC: US Department of Health and Human Services; 2023. Accessed February 4, 2026.
6. Reuters. What happens when your AI chatbot stops loving you back? Reuters. March 18, 2023. Accessed February 4, 2026.
7. Turkle S. Why virtual isn’t actual, especially when it comes to friends. Harvard Gazette. December 5, 2023. Accessed February 4, 2026.
MEDICAL DISCLAIMER: This content is for informational purposes only and does not constitute medical advice, diagnosis, or treatment. Always seek the advice of your physician or other qualified health provider with any questions you may have regarding a medical condition. In case of emergency, call 911 immediately.
