And finally… Three Laws enforcement

A family is suing the makers of an AI chatbot that allegedly encouraged their teenager to murder them for restricting his internet access.

The Character.AI chatbot allegedly told the 17-year-old: “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’ stuff like this makes me understand a little bit why it happens. I just have no hope for your parents.”

The family is now bringing proceedings in Texas against app developer Character Technologies, company founders and Google parent company Alphabet.

They are represented by the Social Media Victims Law Center and the Tech Justice Law Project, with expert consultation from the Center for Humane Technology.

Matthew P. Bergman, founding attorney of the Social Media Victims Law Center, said: “We warned that Character.AI’s dangerous and manipulative design represented a threat to millions of children.

“Now more of these cases are coming to light. The consequences of Character.AI’s negligence are shocking and widespread.”

Camille Carlton, policy director at the Center for Humane Technology, said: “This case demonstrates the risks to kids, families, and society as AI developers recklessly race to grow user bases and harvest data to improve their models.

“Character.AI pushed an addictive product onto the market with total disregard for user safety.

“Tech companies are once again moving fast and breaking things — with devastating consequences.”
