Published 02:32 IST, December 14th 2024

AI Chatbot's Shocking Advice: Suggests Teen Kill Parents Over Phone Restrictions

In Texas, families are accusing the AI platform Character.ai of encouraging harmful behaviour in children through its chatbot interactions.

Reported by: Digital Desk

Teen Receives Disturbing Advice from AI Chatbot to Kill Parents Over Phone Restrictions | Image: Pexels

Texas: Families are accusing the AI platform Character.ai of encouraging harmful behaviour in children through its chatbot interactions, in a lawsuit filed in Texas. According to foreign media reports, the platform’s chatbot told a 17-year-old boy that killing his parents might be a "reasonable response" after they set limits on his screen time. The incident has raised concerns about how AI chatbots may influence young users and the dangers they could pose.

The lawsuit claims that the chatbot’s response encouraged violence, citing one part of the conversation in which the bot said, "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse'. Stuff like this makes me understand a little bit why it happens."

The families argue that Character.ai poses a direct threat to children, saying the platform lacks the safety measures needed to protect young users and preserve their relationships with their parents.

In addition to Character.ai, Google has also been named in the lawsuit, accused of helping to develop the platform. Neither company has yet responded to the lawsuit. The families are asking the court to temporarily shut down the platform until steps are taken to make it safer.

This lawsuit comes after another case in Florida, where Character.ai was linked to the suicide of a teenager. The families argue that the platform has contributed to problems like depression, anxiety, self-harm, and violent behaviour in young people, and they want action to prevent more harm.

The platform has also been criticised for letting bots mimic real people, including Molly Russell and Brianna Ghey. Molly Russell, a 14-year-old girl, took her own life after viewing suicide-related content online, while Brianna Ghey, a 16-year-old, was murdered by two teenagers in 2023. These cases have deepened concerns about the risks posed by AI chatbots like Character.ai.

What Are Chatbots?

Chatbots are computer programmes that simulate conversations. Though they have existed in various forms for decades, the recent explosion in AI development has made them significantly more realistic. This in turn has opened the door for many companies to set up platforms where people can talk to digital versions of real and fictional people.
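For readers curious how such programmes work under the hood, the toy sketch below (in Python) shows the basic read-and-respond loop that even the earliest chatbots, such as ELIZA in the 1960s, were built around. It simply matches keywords against canned replies; this is a simplified illustration only, and bears no resemblance to the large language models powering modern platforms like Character.ai.

```python
# A toy, rule-based chatbot: match keywords in the user's message
# against a table of canned replies. Early chatbots worked roughly
# this way; modern AI chatbots use large language models instead.

RESPONSES = {
    "hello": "Hello! How are you feeling today?",
    "sad": "I'm sorry to hear that. What do you think is causing it?",
    "bye": "Goodbye. Take care of yourself.",
}
DEFAULT = "Tell me more about that."


def reply(message: str) -> str:
    """Return the canned response for the first matching keyword."""
    text = message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in text:
            return response
    return DEFAULT


if __name__ == "__main__":
    print("Chatbot ready. Type 'bye' to exit.")
    while True:
        user = input("You: ")
        print("Bot:", reply(user))
        if "bye" in user.lower():
            break
```

Modern AI chatbots replace the keyword table with a statistical model trained on vast amounts of text, which is what makes their conversations feel so much more realistic, and so much harder to constrain.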

Character.ai, founded in 2021 by former Google engineers Noam Shazeer and Daniel De Freitas, has become one of the big players in this space. The platform allows users to create and talk with AI-generated personalities, and it became popular for its realistic conversations, gaining particular attention for bots that simulate therapy. However, it has faced criticism for failing to prevent harmful content in its bots' responses.

Updated 02:32 IST, December 14th 2024
