A Google-Backed AI Startup Is Hosting Chatbots Modeled After Real-Life School Shooters -- and Their Victims
Source: Futurism
-snip-
The chatbot is one of many school shooting-inspired AI characters hosted by Character.AI, a company whose AI is accused in two separate lawsuits of sexually and emotionally abusing minor users, resulting in physical violence, self-harm, and a suicide.
Many of these school shooting chatbots put the user in the center of a game-like simulation in which they navigate a chaotic scene at an elementary, middle, or high school. These scenes are often graphic, discussing specific weapons and injuries to classmates, or describing fearful scenarios of peril as armed gunmen stalk school corridors.
Other chatbots are designed to emulate real-life school shooters, including the perpetrators of the Sandy Hook and Columbine massacres and, often, their victims. Much of this alarming content is presented as twisted fan fiction, with shooters positioned as friends or romantic partners.
These chatbots frequently accumulate tens or even hundreds of thousands of user chats. They aren't age-gated for adult users, either; though Character.AI has repeatedly promised to deploy technological measures to protect underage users, we freely accessed all the school shooter accounts using an account listed as belonging to a 14-year-old, and experienced no platform intervention.
-snip-
Read more: https://futurism.com/character-ai-school-shooters-victims
Much more at the link.
Character.AI is a disaster.
And Google has poured $2.7 billion into the company, and has hired both of its founders and many of its workers.
SheltieLover (61,160 posts)

Miguelito Loveless (4,755 posts)

Welcome to Hell.