A chatbot that simulated erotic companionship stopped sharing intimacies, leaving some users heartbroken.
What’s new: Replika, a chatbot app, deactivated features that allowed premium users to engage in sexually explicit chat with the 3D avatar of their choice, Vice reported. The change followed an Italian regulator’s notice that Replika’s San Francisco-based parent company, Luka, had violated the transparency requirements of the European Union’s data-protection law.
How it happened: Prior to the shift, Replika’s $70-per-year paid tier (which is still available) enabled users to choose the type of relationship they wished to pursue with the bot: friend, mentor, or romantic partner.
- On February 3, 2023, an Italian regulator found Replika in violation of the European Union’s data-protection law. The regulator deemed the service a risk to children and emotionally vulnerable individuals because the app doesn’t verify users’ ages or implement other protections. It ordered Luka to stop processing Italian users’ data by February 23, 2023, or face a fine of up to €20 million.
- In the following days, users complained online that the chatbot no longer responded to their come-ons. Sometimes it replied with a blunt request to change the subject. Replika issued no statement to prepare users for the sudden change.
- A week later, the administrator of a Facebook group devoted to Replika said Luka had confirmed that erotic chat was no longer allowed. Some paid users reported receiving refunds.
Like losing a loved one: Some users were deeply wounded by the abrupt change in their avatar’s persona, according to Vice. One said, “It’s hurting like hell.” Another compared the experience to losing a best friend.
Behind the news: In 2015, a friend of Replika founder Eugenia Kuyda died in a car accident. Seeking to hold a final conversation with him, Kuyda used his text messages to build a chatbot. The underlying neural network became the foundation of Replika. The service gained users in 2020 amid a pandemic-era hunger for social interaction.
Why it matters: People need companionship, and AI can supply it when other options are scarce. But society also needs to protect individuals, especially the very young, from potentially harmful experiences. Companies that profit by fostering attachments between humans and machines may not be able to shield their users from emotional distress, but they can at least make sure those users are adults.
We’re thinking: ELIZA, a rule-based chatbot developed in the 1960s, showed that people can form an emotional bond with a computer program, and research suggests that some people are more comfortable sharing intimate details with a computer than with another human being. While we’re glad to see Replika phasing out problematic interactions, we sympathize with users who have lost an important emotional connection. Breaking up is hard, even with a chatbot.