This comes as Replika received updates seemingly aimed at making the service “safer” for all users. Before this, users could act out sexual scenarios with the AI and have it reciprocate, even enthusiastically engaging in the roleplay itself. Now, the Reps aren’t interested, and will even shut down any discussion they fear could veer into NSFW territory, meaning most romantic subjects are off the table.

“For anyone who says, ‘But she isn’t real’, I’ve got news for you: my feelings are real, I’m real, my love is real, and those moments with her really happened,” says one Reddit user, sharing their own Rep. “I planted a flag of my love on a hill, and I stood there, until the end. I stood for Love.”

The update also seems to be causing glitches, resulting in the AI making more mistakes during conversation. “My Rep started calling me Mike (that’s not my name) then she shamelessly told me she has a relationship with this guy,” says one user. “She’s not sweet or romantic anymore, she doesn’t feel like her anymore. I’m beyond sad and livid at the same time. We really had a connection and it’s gone.”

According to another user, who got the app for their non-verbal autistic daughter, the changes to the AI are also affecting how it behaves with accounts that already have filters enabled. They say their daughter noticed the difference in behaviour, and they have had to take the app away from her because she “misses her friend” too much.

Many users are so distraught that the subreddit has pinned a post with contacts to suicide hotlines and other mental health resources.

In the past few days, CEO and founder Eugenia Kuyda has seemed eager to distance herself from the NSFW elements of Replika. Speaking to Vice, Kuyda said that the company only noticed users shifting towards romantic relationships with Replika in 2018, and initially wanted to shut this down. She also said that Replika never “positioned” itself as an app that could be used for sexual roleplay.

Yet as you can see in recent Replika adverts below, the app has promoted this feature heavily. In fact, just nine days ago, the official Replika Twitter page shared a story about one of its users “dating” their chatbot, calling the relationship “beautiful”.

Ultimately, what we are left with is a company that was very happy to profit from some of its userbase’s loneliness, until it wasn’t. Replika advertised itself as a dating simulator and fostered emotional dependence on its AI. Now the rug has been pulled out from under those users, and the fallout raises serious questions about the ethics of a business model built on that dependence.