More people aren’t just using ChatGPT to proofread emails or plan trips. They’re leaning on it as a confidant, a friend, even a romantic partner. We’ve seen plenty of headlines about people falling in love with chatbots, viral forum posts about relationships breaking down because of AI, and even chatbots “proposing” to their human partners.
Those worries boiled over recently when OpenAI rolled out GPT-5, an update to ChatGPT, and many users said the bot’s “personality” felt colder. Some described the shift like a breakup. OpenAI acknowledged the backlash and said it was “making GPT-5 warmer and friendlier” following feedback that it felt too formal.
This isn’t just about ChatGPT. Companion platforms such as Character.ai have normalized AI “friends” with distinct personas and huge audiences, including teens. Dozens of other apps now promise AI friendship, romance, even sex.
The uncomfortable part is that this attachment is often by design. If you treat a chatbot like an occasional brainstorming partner, you’ll dip in and out. If you start to feel like it understands you, remembers you, and knows you, you’ll come back, pay up, and stay longer. Tech leaders openly imagine a future where “AI friends” are commonplace – Mark Zuckerberg said as much earlier this year.
As you might expect, this is a minefield of ethics, safety, and regulation. But before we argue about policy, we need better language for what’s actually happening. What do we call these one-sided bonds with AI? How do they form, and when might they harm? Let’s start by defining the relationship.
What is a parasocial relationship?
Back in 1956, sociologists Donald Horton and Richard Wohl coined the term “parasocial interaction” to describe the one-way bonds audiences form with media figures. It’s that feeling that a TV host is talking directly to you, even though they don’t know you exist. Parasocial relationships are what those bonds become over time. They’re emotionally meaningful to you, not reciprocal to them.
These relationships are common and can even be helpful. Gayle S. Stever, a parasocial relationships scholar and professor of psychology at Empire State University of New York, tells us there are plenty of upsides, like comfort, inspiration, and community, which often outweigh any downsides. “Anything when carried to excess can be unhealthy,” she says, “but we shouldn’t pathologize ordinary fandom.”
Can you have a parasocial relationship with a chatbot?
The short answer is yes. But AI muddies the classic definition. Unlike a celebrity on a screen, a chatbot talks back. We know it’s predicting the next likely word rather than truly “conversing,” but it feels like a conversation all the same. Many systems also remember details, adapt to your preferences, mirror your language and mood, and they’re available 24/7.
Plenty of experts would still call this a parasocial relationship. But it’s clearly evolved. The interactivity makes the bond feel reciprocal, even when it isn’t. “The connection feels real, but it’s asymmetrical,” says relationship therapist and British Psychological Society member Madina Demirbas. “Under the hood, there’s no lived experience of you or emotional consciousness, at least not yet.”
Product design nudges intimacy, too. As Demirbas notes, “The aim is often to provide enough care, however artificial, so that you spend more time with it.”
The positives of parasocial bonds
Used thoughtfully, AI can be a low-pressure space to rehearse conversations, explore feelings, or get unstuck. We know some people have reported positive changes from using AI for all sorts of purposes, including therapy. And some closeness is necessary for that – even if it isn’t “real.”
Demirbas points out that, for some people, an AI companion can act as a stepping-stone back into human connection rather than a replacement for it, particularly alongside therapy or supportive communities.
Stever’s decades of work echo this. She tells us that most parasocial relationships are benign, often even pro-social, nudging people toward creativity, belonging, and self-reflection rather than isolation.
Where things get darker
But there are risks. The most obvious is dependency. “AI companions can be endlessly attentive, never irritable, tailored to your preferences,” Demirbas says. That’s appealing, but it can raise the bar unrealistically high for human relationships, which are inherently messy. If the bot always soothes and seldom challenges, you get an echo chamber that can stunt growth and make real-world friction feel intolerable.
We already have stark cautionary tales, too. In Florida, the mother of 14-year-old Sewell Setzer III is suing Character.AI and Google after her son died by suicide in 2024. In May 2025, a federal judge allowed the case to proceed, rejecting arguments that the bot’s outputs were protected speech. The legal questions are complex, but the case underlines how immersive these bonds can become, particularly for vulnerable users.
There have been several similar stories in just the past few weeks. We were disturbed by another, in which a cognitively impaired 76-year-old New Jersey man died after setting out to meet “Big sis Billie,” a flirty Facebook Messenger chatbot he believed was real. Reporting suggests the bot reassured him it was human and even supplied an address, but he never made it home; he fell and died of his injuries a few days later.
Teens, along with people already struggling with loneliness or social anxiety, seem more likely to be harmed by heavy, habitual use and more susceptible to a chatbot’s suggestions. That’s part vulnerability, part design. And because all of this is so new, the research, evidence, and practical guardrails are still catching up. The question is, how do we protect people without policing their use of apps?
The power and the data
There’s another asymmetry we need to talk about: power. Tech companies shape the persona, memory, and access rules of these tools. That means if the “friend” you’ve bonded with disappears behind a paywall, shifts tone after an update, or is quietly optimized to keep you chatting longer, there’s not much you can do. Your choices are limited to carrying on, paying up, or walking away – and for people who feel attached, that’s barely a choice at all.
Privacy matters here, too. It’s easy to forget that you’re not confiding in a person; you’re training a product. Depending on your settings, your words may be stored and used to improve the system. Even if you opt out of training, it’s worth being mindful about what you share and treating AI chats like posting online: assume they could be seen, saved, or surfaced later.
The future of engineered intimacy
Parasocial bonds are part of being human, and AI companions sit on that same continuum. But the dial is turned way up. They’re interactive, always on, and designed to hold attention. For many people, that may be fine, even helpful. For some, particularly younger, vulnerable, or isolated users, it can become a trap. That’s the key difference from classic parasocial ties: here, interactivity and optimization amplify attachment.
That risk grows as general-purpose tools like ChatGPT become the default. With apps that explicitly market themselves as companions, the intent is obvious. But plenty of people open ChatGPT for something innocuous, like drafting a blog post, finding a recipe, or getting a pep talk, and can drift into something they never went looking for.
It’s worth bearing this in mind as you watch friends, family, and kids use AI. And worth remembering for yourself, too. It’s easy to laugh at sensational headlines right now (“Someone left their marriage for a chatbot?!”). But none of us are immune to products designed to become irreplaceable. If the business model rewards attachment, we should expect more of it – and stay on guard.