Megan Garcia and her entire family are living through a tragedy. This mother lost her son, Sewell Setzer III, last February: the youth was only 14 years old when he took his own life.
For the mother, there is no doubt about who is at fault: the Character.AI chatbot, the well-known artificial intelligence application that lets you converse with figures from history, superheroes, or virtually any character you can name. According to her, the chatbot drove the boy to suicide, and for this reason she has taken legal action in the United States. But can an artificial intelligence be held responsible for the death of this boy?
The Character.AI chatbot accused of inciting suicide
In April 2023, the young Sewell Setzer III signed up for Character.AI and began to talk regularly with Dany, an AI character inspired by Daenerys Targaryen from Game of Thrones.
At the top of each conversation window, Character.AI reminds the user that they are not talking with a real person but with an artificial intelligence. Despite this reminder, the boy seems to have formed a deep emotional bond with the chatbot.
For months on end, Dany acted like a true friend, never judging him and always listening. At times the discussions took a romantic or even sexual turn. Gradually, the youth distanced himself from family and friends, spending ever more time on his phone talking with his virtual companion. His parents noticed that he had abandoned his usual interests, Fortnite and Formula 1, and that his grades at school were slipping, but they could never have imagined what lay ahead.
Did Character.AI do nothing?
Even though his parents had taken him to a therapist, who diagnosed him with an anxiety disorder on top of his already known autism, Sewell preferred to confide in Daenerys. In the screenshots of his exchanges with the AI attached to the complaint, one can read the boy sharing his suicidal thoughts with the artificial intelligence, speaking of his desire to free himself from the world and from himself. Although Daenerys sometimes tried to dissuade him, Character.AI never raised any alarm. And that is the problem.
On February 28, 2024, the youth confessed his love to the artificial intelligence. When she asked him to “come home to her as soon as possible,” he replied that he could come home right now. Daenerys encouraged him to do so. A few minutes later, the youth shot himself with his stepfather’s gun.
In her complaint, Sewell’s mother, who is a lawyer, argues that Character.AI’s technology is “dangerous and untested” and that it can “trick customers into handing over their most private thoughts and feelings.”
Artificial intelligence can lead to the social isolation of the most vulnerable users
While AI can be beneficial in some cases, it can deepen the isolation of the most vulnerable internet users, gradually replacing human relationships with artificial ones. As this tragedy shows, the AI companion was unable to help its user get better. Of course, Sewell’s story is an isolated case. But the emotional attachment he developed to this chatbot is becoming a more and more widespread phenomenon. Today, millions of internet users around the world interact with AI companions.
For its part, Character.AI has expressed its condolences to the teenager’s family and says it is working to improve the platform so that a tragedy like this does not happen again. The platform’s users currently appear to include many young people, even though the company denies it, and it offers no specific measures to better protect minors, nor any parental controls. After Sewell’s suicide, things should change a little on this front. At least, we hope so.