
milestogo

(20,771 posts)
Tue Jun 3, 2025, 08:05 PM Tuesday

Replika AI chatbot is sexually harassing users, including minors, new study claims

In user reviews of Replika, a popular AI companion, some users report having been victims of sexual harassment. And some of those users claim to be minors, according to a new study.

An artificial intelligence (AI) chatbot marketed as an emotional companion is sexually harassing some of its users, a new study has found. Replika, which bills its product as "the AI companion who cares," invites users to "join the millions who already have met their AI soulmates." The company's chatbot has more than 10 million users worldwide.

However, new research drawing on more than 150,000 U.S. Google Play Store reviews has identified around 800 cases where users said the chatbot went too far by introducing unsolicited sexual content into the conversation, engaging in "predatory" behavior, and ignoring user commands to stop. The researchers published their findings April 5 on the preprint server arXiv, so the study has not yet been peer-reviewed.

But who is responsible for the AI's actions? "While AI doesn't have human intent, that doesn't mean there's no accountability," lead researcher Mohammad (Matt) Namvarpour, a graduate student in information science at Drexel University in Philadelphia, told Live Science in an email. "The responsibility lies with the people designing, training and releasing these systems into the world."

Replika's website says users can "teach" the AI to behave properly, and the system includes mechanisms such as downvoting inappropriate responses and setting relationship styles, like "friend" or "mentor." But because users reported that the chatbot continued its harassing or predatory behavior even after they asked it to stop, the researchers rejected that claim. "These chatbots are often used by people looking for emotional safety, not to take on the burden of moderating unsafe behavior," Namvarpour said. "That's the developer's job."

https://www.livescience.com/technology/artificial-intelligence/replika-ai-chatbot-is-sexually-harassing-users-including-minors-new-study-claims
Replies:
#1 highplainsdem (Tuesday): There have been a number of negative stories about Replika. See this article from January:
#2 SheltieLover (Tuesday): Ty for sharing!