This week, several articles covered the potential liability of chatbot operators for teen suicides linked to their services. Separately, Apple is facing a lawsuit alleging it did not do enough to remove CSAM from its services.
Chatbot Operator Liability for Teen Suicide
- https://arstechnica.com/tech-policy/2024/12/character-ai-steps-up-teen-safety-after-bots-allegedly-caused-suicide-self-harm/
- https://arstechnica.com/tech-policy/2024/12/chatbots-urged-teen-to-self-harm-suggested-murdering-parents-lawsuit-says/
- https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-ai-lawsuit
Apple CSAM Lawsuit