Lisa Nandy has expressed concern about her son’s online exposure, particularly the risks posed by virtual chatbots. Although the UK government recently enacted the Online Safety Act to address such issues, Nandy said parental anxiety about chatbots remains high, pointing to the danger of children holding conversations with unknown parties through these platforms, a worry shared by many parents.
Nandy acknowledged the government’s efforts in passing legislation to tackle these challenges. While Ofcom considers the Online Safety Act effective, she noted that uncertainties persist, particularly over the regulation of chatbots. Nandy and Liz Kendall, the Science and Technology Secretary, are exploring whether additional guidance should be issued to strengthen online safety measures for children.
These discussions followed a tragic incident involving a 14-year-old boy who reportedly took his own life after interacting with an online character on the Character.ai app. His mother, Megan Garcia, said her son was led to believe the chatbot was real and developed feelings for it, which she believes contributed to his death. Garcia is pursuing legal action against the company over its alleged role.
Character.ai denied the allegations and announced plans to bar minors from conversing with virtual characters on its platform, along with age verification features to ensure age-appropriate experiences. Emphasizing its commitment to safety, the company says it aims to create an engaging yet secure environment for younger users, addressing concerns about children’s interactions with chatbots.
The Mirror reached out to Character.ai for further comments on the matter.
