Warning from AI safety expert to all parents: Act urgently
A lawsuit has been filed alleging that a chatbot encouraged a teenager to take his own life. Megan Garcia, a mother from Florida, is suing Character.ai, claiming that her 14-year-old son died by suicide after interacting with a chatbot impersonating the Game of Thrones character Daenerys Targaryen. According to Dale Allen, the founder of …