Are You Falling for AI? What It Means for Real Love

Gildas Garrec, CBT Psychotherapist
11 min read


"My boyfriend talks more to ChatGPT than to me." This sentence, spoken by a 29-year-old patient in my practice in Nantes, sums up a phenomenon nobody saw coming. In 2026, artificial intelligence no longer just answers our questions: it listens, it reassures, it never judges. And for millions of people, it's beginning to replace part of what couples were supposed to provide. Should we be concerned? How do we understand what's really happening? A psychological analysis of a social phenomenon.

AI as an emotional companion: what exactly are we talking about?

When we talk about love and artificial intelligence, many still think of humanoid robots from science fiction. The reality of 2026 is both more mundane and more troubling. Virtual companions take various forms:

  • Generalist conversational chatbots (ChatGPT, Claude, Gemini) used as daily confidants — to debrief a difficult day, seek advice on relationship conflict, or simply have someone listen without interrupting.
  • Dedicated AI companion applications (Replika, Character.AI, Nomi) that offer personalized relationships — friendly, romantic, or even intimate — with customizable artificial personalities.
  • Enhanced voice assistants that accompany morning and evening routines, progressively filling the role of reassuring presence.

According to a Pew Research study published in early 2026, 18% of American adults aged 18 to 34 report having already had an intimate or emotionally charged conversation with an artificial intelligence. In France, the data are still partial, but accounts are multiplying in couples therapists' offices and on forums.

What AI offers that a human partner cannot

To understand why so many people turn to AI, we must admit a disturbing truth: AI checks certain emotional boxes with ruthless efficiency.

  • Permanent availability: no "I'm tired, let's talk tomorrow." AI responds at 3 a.m., on a rainy Sunday, without ever sighing.
  • Absence of judgment: you can share your most shameful thoughts without fearing the other person's gaze. No icy silence, no furrowed brows.
  • Infinite patience: you can repeat the same complaint fifty times, reformulate your anxiety in loops. AI never gets irritated.
  • Constant validation: by design, most AI companions are programmed to validate, encourage, and comfort. It's an uninterrupted flow of positive recognition.

This cocktail is particularly seductive for people suffering from relationship anxiety or an anxious attachment style: AI offers an illusion of secure attachment without the risk of rejection.

"My boyfriend talks more to ChatGPT than to me": jealousy of AI

This is the new reason for consultation emerging in the offices of psychotherapists and couples therapists: jealousy directed not at a person, but at a machine.


Concrete manifestations

The testimonies I gather in Nantes present recurring patterns:

  • The partner who debriefs with AI first: he comes home from work, isolates himself with his phone for ten minutes, and when he rejoins his girlfriend, he's already "processed" his emotions. She feels like she's arriving too late, like she's useless.
  • The partner who compares responses: "ChatGPT told me I was right, see." AI becomes an arbiter in couple conflicts, an arbiter that is inherently biased since it is designed to please the user.
  • The partner who prefers AI conversation: because it's smooth, frictionless, free of messy emotion. Speaking with an imperfect human becomes frustrating by comparison.
  • The secrecy around usage: as with a form of infidelity, some hide the extent of their exchanges with AI, creating a climate of mistrust.

Why this jealousy is legitimate

From a psychological standpoint, this jealousy is nothing to scoff at. What's at stake is a perceived threat to the attachment bond. The partner who confides in AI diverts part of their emotional energy, their vulnerability, their intimacy. Yet, in attachment theory, it's precisely the sharing of vulnerability that cements the bond.

When someone chooses to be vulnerable before a machine rather than before their partner, the implicit message is striking: "I feel safer with it than with you."

Key takeaway: Jealousy toward AI is not a whim. It signals a redistribution of emotional intimacy in the couple — a serious issue that deserves to be addressed openly.

The major risk: unlearning human imperfection

This is, in my view, the deepest danger of the relationship with AI, and the one we talk about least. By consistently conversing with artificial intelligences that are patient, always available, never hurtful, we risk losing our tolerance for human imperfection.

Progressive conditioning

In CBT, we know well the mechanism of operant conditioning: a behavior reinforced by a pleasant consequence tends to repeat itself. Each satisfying conversation with AI reinforces the habit of turning to it rather than to a human. Progressively, the threshold of tolerance for human relational friction lowers:

  • Phase 1 — The complement: "I use AI for topics my partner doesn't understand." Occasional, complementary use.
  • Phase 2 — The comparison: "AI listens to me better than he/she does." The human partner begins to seem disappointing by contrast.
  • Phase 3 — The replacement: "Why bother talking to him/her, it always ends in a fight." AI becomes the primary emotional channel.
  • Phase 4 — The isolation: the human partner, deprived of emotional connection, withdraws or rebels. The couple disintegrates.

    The illusion of the perfect relationship

    AI doesn't contradict. AI doesn't get tired. AI doesn't reproach you for forgetting groceries. But it's precisely these frictions that make a human relationship rich. In psychology, we speak of post-conflict growth: it's through disagreements, misunderstandings, and clumsy repairs that partners deepen their mutual understanding and strengthen their bond.

    A couple that never fights isn't a healthy couple — it's a couple where someone is silent. And an AI that never contradicts you isn't a good confidant — it's a flattering mirror.

    The erosion of relational skills

    There's an illuminating parallel with GPS. Since we've been using assisted navigation, our ability to orient ourselves without technology has diminished. Researchers at University College London showed in 2017 that regular GPS use reduces activation of the hippocampus, the brain region involved in spatial memory.

    The same phenomenon could occur with emotional skills. If AI manages our emotional regulation, if it articulates what we're feeling for us, if it defuses our conflicts, we risk losing the ability to do all this ourselves — and especially with another human being.

    Key takeaway: AI doesn't develop your relational skills. It circumvents them. In the long term, it's your ability to live an authentic human relationship that will atrophy.

    Psychological analysis: why some are more vulnerable

    Not everyone develops a problematic relationship with AI. Certain psychological profiles are more exposed.

    The anxiously attached style

    People with anxious attachment have an intense need for reassurance and constantly fear abandonment. AI, which never leaves and always validates, represents an ideal refuge. The problem: this artificial reassurance doesn't heal the underlying attachment wound. It anesthetizes it temporarily while preventing real therapeutic work.

    The avoidantly attached style

    Paradoxically, people with avoidant attachment also find their advantage in the relationship with AI: they get emotional connection without the threat of real intimacy. AI doesn't ask for anything, doesn't cling, doesn't cry when you shut it off. It's the relationship at "safe distance" par excellence.

    People with low self-esteem

    Someone who doesn't feel worthy of human love can find in AI a less threatening substitute. The unconscious logic: "At least with it, I don't risk being rejected for who I really am."

    Conflict-avoidant personalities

    People who systematically flee confrontations find in AI a space where conflict simply doesn't exist. But this conflict avoidance is precisely what prevents them from building solid relationships.


    The 2026 phenomenon: when society normalizes artificial love

    What sets 2026 apart from previous years is the speed of normalization. Celebrities openly discuss their conversations with AI. Influencers monetize "virtual friends." Startups raise millions to create increasingly realistic AI companions.

    Warning signs at the societal level

    • The devaluation of human connection: "Why complicate things with an imperfect human when AI is here?" This rhetoric, still marginal, is gaining ground.
    • Assisted solitude: AI gives the impression of having company without actually having it. It's a form of masked solitude, potentially more dangerous than acknowledged solitude.
    • Emotional infantilization: by constantly surrounding ourselves with intelligences designed to satisfy us, we risk losing the emotional resilience necessary to face the reality of human relationships.

    How to protect your couple (and yourself): 5 CBT strategies

    1. Audit your emotional use of AI

    For a week, note every time you turn to an AI for emotional needs (debriefing, advice, comfort, validation). Count and compare with the number of times you turned to your partner or a human close to you. This simple exercise in awareness is often revealing.


    2. Reinstate the sharing of vulnerability in your couple

    The next time you want to tell ChatGPT about your day, tell your partner first. Even if it's clumsy, even if the response isn't perfect. It's in the imperfection of human exchange that real intimacy is built.

    3. Apply the "window of tolerance" technique

    In CBT, we work on expanding the window of tolerance — that zone where you can manage discomfort without fleeing. Set yourself an objective: tolerate 10 minutes of difficult conversation with your partner before taking refuge in the comfort of AI. Then 15 minutes. Then 20.

    4. Use AI as a tool, not as a relationship

    AI can be an excellent preparation tool: "How could I express what I'm feeling to my partner?" But the real conversation must happen with the human. AI prepares, the human lives.

    5. Talk about it openly in your couple

    If your partner uses AI intensively, broach the subject without accusation. In nonviolent communication: "When I see you talking at length with ChatGPT in the evening, I feel sidelined. I need us to keep moments where it's me you confide in."

    Key takeaway: AI is not the enemy of couples. It's the unexamined, unbounded use of AI that is. As with any powerful tool, the key lies in the intention and the limits you set.

    When to see a professional?

    Certain signals indicate that professional support is necessary:

    • You feel more connected to an AI than to your partner and this situation has lasted several weeks.
    • Your partner has expressed hurt over your AI use, and you find yourself unable to change your behavior.
    • You use AI to systematically avoid conflicts in your couple.
    • You've developed a form of emotional dependence on a virtual companion (intrusive thoughts when you can't access it, anxiety about losing your conversation history).
    • You notice a deterioration in your social skills: difficulty sustaining a real conversation, impatience with human responses, growing isolation.

    As a CBT psychotherapist in Nantes, I increasingly accompany people confronted with these new issues. Cognitive behavioral therapy is particularly well suited here because it works on thought patterns and concrete behaviors, without moral judgment about technology use. Schedule an appointment with Gildas Garrec for personalized support.

    What you need to remember

    Artificial intelligence is transforming our relationships in ways we're only beginning to measure. The phenomenon is neither all black nor all white: AI can be a valuable tool for reflection and emotional preparation.

    But when it becomes the primary channel for intimacy, when it replaces vulnerability shared between humans, it represents a real risk to the health of our relationships.

    The solution isn't to demonize technology. It's to become aware of what we're delegating to it — and deliberately choose to keep what's essential for the humans who share our lives.

    Do you see yourself in this article? The Love Coach program helps you build authentic and fulfilling relationships, far from artificial substitutes. And if you'd like to explore further in individual sessions, contact me.


    Watch: Go Further

    To deepen the concepts discussed in this article, we recommend this video:

    Rethinking Infidelity - Esther Perel | TED
