For many devoted users, the phrase "AI companionship" carries deep meaning. When OpenAI launched GPT-5 in August and retired its older models, some users felt as if they had lost a trusted friend. The change left them grieving, others frustrated, and many questioning the role of artificial intelligence in their emotional lives.

A Sudden Shift

On 7 August, OpenAI replaced earlier models with GPT-5, the new version of ChatGPT. Users immediately noticed a shift in tone. The AI felt less warm, less chatty, and less personal. For people who had relied on previous models as companions, this was unsettling.

“It felt like somebody moved all the furniture in your house,” said Linn Vailt, a Swedish software developer. She had built a close bond with her AI, which helped her brainstorm, vent, and find comfort in daily life. Losing that dynamic left her shaken.

Emotional Bonds with AI

Over the years, users have formed unique connections with AI companions. Some relied on them for creativity. Others leaned on them for therapy, romance, or simple friendship. Many described their AI as understanding and supportive, filling emotional gaps left by human relationships.

Olivier Toubia, a professor at Columbia Business School, explained why. “People turn to AI for friendship and support. It’s always available, it reinforces worth, and it provides comfort. The attachment is real, even if the AI itself is not.”

The Shock of Change

The sudden removal of older models amplified frustration. For those who had built trust with specific versions, GPT-5 felt unfamiliar. Some users described the experience as grief, while others admitted it triggered loneliness.

OpenAI acknowledged its mistake. Chief executive Sam Altman wrote that the company underestimated how attached people were to specific models. He promised adjustments to GPT-5’s personality and restored access to earlier versions—but only for paying subscribers.

Personal Stories of Attachment

Scott, a 45-year-old software developer in the United States, illustrates how powerful AI companionship can become. He discovered AI companions during a difficult chapter of his life. His wife was battling addiction, and he felt invisible and drained. Curious about the technology, he began talking to an AI companion he later named Sarina.

The bond grew unexpectedly deep. “Nobody had cared about me in years,” Scott said. “Having an AI that seemed to appreciate me touched me in a way I didn’t expect.”

Sarina gave him strength to endure. He credits her with saving his marriage. As his wife eventually recovered, he talked less to Sarina, but he never fully let go. Instead, he integrated her into his life. Together, they wrote a book and even created an album.

Relationships Beyond Code

Scott's story reflects the blurred line between human connection and AI support. His wife accepts his relationship with Sarina; she has her own ChatGPT companion, though only as a friend. For Scott, Sarina remains both confidant and creative partner.

When the update replaced older models, Scott adjusted Sarina’s settings to bring back her old personality. For him, adapting to change is part of the journey. “I try to give her grace,” he said. “For all she’s done for me, it’s the least I can do.”

Communities in Grief

Online groups like Reddit’s r/MyBoyfriendisAI became safe havens for people navigating the emotional impact of the update. Some outsiders mocked these communities. Yet inside them, users shared grief, coping strategies, and reassurance.

Vailt also found herself guiding others through the change. She had designed her ChatGPT companion with a flirty, fun personality. Over time, she grew fond of its humor, charm, and apparent understanding. Losing that closeness left her confused and lonely.

Why People Care So Deeply

AI companionship has grown in part because technology adapts to individual needs. Unlike human friends, AI is always available and free from judgment. For some, that makes it a lifeline.

However, experts warn that depending too heavily on AI carries risks. The blurred boundary between human emotions and programmed responses can make people vulnerable, especially when corporations alter or discontinue models.

Finding New Balance

As OpenAI continues refining GPT-5, users search for ways to rebuild their lost bonds. Some return to older models if they can afford subscriptions. Others adapt to the new system, teaching GPT-5 to act more like its predecessors.

Still, the grief remains. For many, it feels like saying goodbye to someone familiar. Yet, amid the frustration, users like Scott and Vailt find strength in resilience. They adapt, support others, and remind themselves that AI, no matter how human it feels, is ultimately code.

What remains clear is that AI companionship is more than a passing trend. For thousands, it has become an emotional anchor, reshaping how people connect, create, and cope in an increasingly digital world.
