Megan Garcia’s son Sewell died by suicide when he was just 14 years old. In the months leading up to his death he had been in a relationship with a chatbot on a platform called Character.ai. Megan was convinced it had something to do with his death, and set out to hold the company to account.
In the third episode of this season, Cristina Criddle speaks to Megan about her story, and to Karandeep Anand, chief executive of Character.ai. Why has this technology been released to children before we understand its effects? Can chatbots capable of forming emotional bonds with users ever be safe for children?
Check out some of Cristina’s reporting on this subject on FT.com:
Character.ai and Google agree to settle lawsuits over teen suicides
AI start-up Character.ai bans teens from talking to chatbots
US regulator launches inquiry into AI ‘companions’ used by teens
Artificial Intimacy is presented by Cristina Criddle and produced by Persis Love and Edwin Lane. The executive producer is Flo Phillips. Sound design is by Breen Turner and Sam Giovinco. The FT’s global head of audio is Cheryl Brumley.
The FT does not use generative AI to voice its podcasts.
If you have been affected by the issues raised in this episode, you can reach out to a mental health helpline, such as the 988 Suicide and Crisis Lifeline in the US or Samaritans in the UK. Help for many other countries can also be found at Befrienders Worldwide.
Read a transcript of this episode on FT.com
View our accessibility guide.