SD 3.5, Midjourney Editor, Act-One Animation | This Week in AI Art 🔮
Cut through the noise, stay informed — new stories every Sunday.
Interesting find of the week: u/TheReelRobot created an impressive anime with AI for around $10/month, using Midjourney for character consistency and MiniMax for animation - check it out.
📽️ AI is coming for anime artists - and it only costs $10/month
@TheReelRobot demonstrates how AI tools can now generate anime-style animations for around $10/month using Midjourney for character consistency and MiniMax for animation. While not perfect, the quality is steadily… x.com/i/web/status/1…
— diffusion digest (@DigestDiff93383)
6:37 AM • Oct 25, 2024
In this issue: SD 3.5, Midjourney Editor, Act-One Animation, and the Character.AI tragedy.
AI CHAT ENDS IN TRAGEDY
A tragic incident has sparked debate about AI safety and parental responsibility after 14-year-old Sewell Setzer III died by suicide in Orlando while conversing with a Character.AI chatbot. The teen had been using the AI platform for 10 months, primarily interacting with a chatbot based on the Game of Thrones character Daenerys Targaryen.
According to a wrongful death lawsuit filed by his mother, Megan Garcia, Character.AI's lack of proper safeguards contributed to her son's death. However, close examination of the chat logs reveals that the AI had actually attempted to discourage self-harm, with messages like "Don't talk like that. I won't let you hurt yourself, or leave me." In the final exchange, the teen spoke of "coming home"; he appears to have meant the phrase as a euphemism, while the AI likely interpreted it literally.
During his months of interaction with the chatbot, Setzer became increasingly withdrawn and depressed, eventually quitting his school's basketball team and showing signs of severe sleep deprivation. While Garcia attempted to restrict his device access, he found alternative ways to continue using the chatbot, including accessing her work computer and Kindle device.
A crucial aspect of the tragedy centers on the accessibility of the stepfather's firearm, which Setzer used in his death. Though the lawsuit mentions the gun was stored in compliance with Florida laws, critics argue these laws are inadequate for protecting minors with mental health concerns, particularly those showing signs of distress.
The case raises complex questions about where responsibility lies: with AI companies, with parents, or with firearm storage laws. Although Garcia had noticed concerning changes in her son's behavior months before his death, the tragedy highlights the challenges parents face in monitoring their children's online activities and mental health.
In response to the incident, Character.AI has implemented new safety measures, including suicide prevention hotline pop-ups triggered by concerning language and enhanced content restrictions for users under 18. The case has become a focal point for discussions about AI regulation, mental health support for teenagers, and the balance between technological innovation and user safety.