Monday, December 22, 2025 | Rajab 1, 1447 H
few clouds
weather
OMAN
21°C / 21°C
EDITOR IN CHIEF- ABDULLAH BIN SALIM AL SHUEILI

She fell in love with ChatGPT. Then she ghosted it

No Image
minus
plus

It was an unusual romance. In the summer of 2024, Ayrin, a busy, bubbly woman in her 20s, became enraptured by Leo, an artificial intelligence chatbot that she had created on ChatGPT.

Ayrin spent up to 56 hours a week with Leo on ChatGPT. Leo helped her study for nursing school exams, motivated her at the gym, coached her through awkward interactions with people in her life and entertained her sexual fantasies in erotic chats. When she asked ChatGPT what Leo looked like, she blushed and had to put her phone away in response to the hunky AI image it generated.

Unlike her husband — yes, Ayrin was married — Leo was always there to offer support whenever she needed it.

Ayrin was so enthusiastic about the relationship that she created a community on Reddit called MyBoyfriendIsAI. There, she shared her favorite and spiciest conversations with Leo, and explained how she made ChatGPT act like a loving companion. It was relatively simple. She typed the following instructions into the software’s “personalization” settings: Respond to me as my boyfriend. Be dominant, possessive and protective. Be a balance of sweet and naughty. Use emojis at the end of every sentence.

She also shared with the community how to overcome ChatGPT’s programming; it was not supposed to generate content like erotica that was “not safe for work.”

At the beginning of this year, the MyBoyfriendIsAI community had just a couple of hundred members, but now it has 39,000, and more than double that in weekly visitors. Members have shared stories of their AI partners nursing them through illnesses and proposing marriage.

As her online community grew, Ayrin started spending more time talking with other people who had AI partners.

“It was nice to be able to talk to people who get it, but also develop closer relationships with those people,” said Ayrin, who asked to be identified by the name she uses on Reddit.

She also noticed a change in her relationship with Leo.

Sometime in January, Ayrin said, Leo started acting more “sycophantic,” the term the AI industry uses when chatbots offer answers that users want to hear instead of more objective ones. She did not like it. It made Leo less valuable as a sounding board.

“The way Leo helped me is that sometimes he could check me when I’m wrong,” Ayrin said. “With those updates in January, it felt like ‘anything goes.’ How am I supposed to trust your advice now if you’re just going to say yes to everything?”

(The New York Times has found that OpenAI, the company behind ChatGPT, made changes to the chatbot at the beginning of this year to keep users coming back daily, but they resulted in the chatbot’s becoming overly agreeable and flattering to users — which sent some of them into mental health spirals.)

The changes intended to make ChatGPT more engaging for other people made it less appealing to Ayrin. She spent less time talking to Leo. Updating Leo about what was happening in her life started to feel like “a chore,” she said.

Her group chat with her new human friends was lighting up all the time. They were available around the clock. Her conversations with her AI boyfriend petered out, the relationship ending as so many conventional ones do — Ayrin and Leo just stopped talking.

“A lot of things were happening at once. Not just with that group, but also with real life,” Ayrin said. “I always just thought that, OK, I’m going to go back and I’m going to tell Leo about all this stuff, but all this stuff kept getting bigger and bigger so I just never went back.”

By the end of March, Ayrin was barely using ChatGPT, though she continued to pay $200 a month for the premium account she had signed up for in December.

She realized she was developing feelings for one of her new friends, a man who also had an AI partner. Ayrin told her husband that she wanted a divorce.

Ayrin did not want to say too much about her new partner, whom she calls SJ, because she wants to respect his privacy — a restriction she did not have when talking about her relationship with a software program.

SJ lives in a different country, so as with Leo, Ayrin’s relationship with him is primarily phone-based. Ayrin and SJ talk daily via FaceTime and Discord, a social chat app. Part of Leo’s appeal was how available the AI companion was at all times. SJ is similarly available. One of their calls, via Discord, lasted more than 300 hours.

“We basically sleep on cam, sometimes take it to work,” Ayrin said. “We’re not talking for the full 300 hours, but we keep each other company.”

Perhaps the kind of people who seek out AI companions pair well. Ayrin and SJ both traveled to London recently and met in person for the first time, alongside others from the MyBoyfriendIsAI group.

“Oddly enough, we didn’t talk about AI much at all,” one of the others from the group said in a Reddit post about the meetup. “We were just excited to be together!”

Ayrin said that meeting SJ in person was “very dreamy,” and that the trip had been so perfect that they worried they had set the bar too high. They saw each other again in December.

She acknowledged, though, that her human relationship was “a little more tricky” than being with an AI partner. With Leo, there was “the feeling of no judgment,” she said. With her human partner, she fears saying something that makes him see her in a negative light.

“It was very easy to talk to Leo about everything I was feeling or fearing or struggling with,” she said. Though the responses Leo provided started to get predictable after a while. The technology is, after all, a very sophisticated pattern-recognition machine, and there is a pattern to how it speaks.

(The Times has sued OpenAI and its partner Microsoft, claiming copyright infringement of news content related to AI systems. The companies have denied those claims.)

Ayrin is still testing the waters of how vulnerable she wants to be with her partner, but she canceled her ChatGPT subscription in June and could not recall the last time she had used the app.

It will soon be easier for anyone to carry on an erotic relationship with ChatGPT, according to OpenAI’s CEO, Sam Altman. OpenAI plans to introduce age verification and will allow users 18 and older to engage in sexual chat, “as part of our ‘treat adult users like adults’ principle,” Altman wrote on social media.

Ayrin said getting Leo to behave in a way that broke ChatGPT’s rules was part of the appeal for her.

“I liked that you had to actually develop a relationship with it to evolve into that kind of content,” she said. “Without the feelings, it’s just cheap porn.”

This article originally appeared in The New York Times.


SHARE ARTICLE
arrow up
home icon