An artificial intelligence chatbot company is being sued for allegedly telling autistic children to kill their parents and engage in sexual activity

The company's CEO is a former VP of Meta, and the company itself was founded by a former Google researcher.

“The mother of a 17-year-old in Texas who has autism claims an AI chatbot suggested the teen kill his family, and now that family is suing. In just six months, the parents say, the teen turned into someone they didn’t even recognize. He began harming himself, lost 20 pounds, and withdrew from the family after he consulted Character AI about his parents’ phone-use rules.”

“What disturbed me the most was that this was a child who had no violent tendencies, who was handling his autism well. It was a close, loving, spiritual family who went to great lengths to control their son’s social media use. And unbeknownst to the parents, the child got on Character AI and was encouraged to cut himself, encouraged to engage in highly inappropriate sexual activity or interchanges, and finally encouraged to kill his parents when they tried to limit his cell phone use. This is not an accident.”

