A 16-year-old boy died by suicide after falling victim to an AI-driven sextortion scam, leaving his family in mourning over his untimely death.
Elijah Heacock received a message threatening to publish a nude photo of him generated by artificial intelligence unless he paid $3,000.
Overwhelmed by the threat, he took his own life soon afterward.
Elijah’s parents, John Burnett and Shannon Heacock, say they knew nothing about AI-driven sextortion scams before their son’s death.
“We had no idea what was happening,” Burnett said, describing the criminals as “organized and relentless.”
Generative AI has put young people at heightened risk, as scammers can now produce fake explicit images with ease.
The National Center for Missing and Exploited Children has received more than 500,000 reports of sextortion, reflecting a sharp rise in such incidents.
According to the FBI, at least 20 young people have died by suicide since 2021 as a result of these scams.
In response, Elijah’s parents are calling for stricter legislation, such as the “Take It Down” Act, which makes it illegal to share explicit images without consent, whether they are AI-generated or real.
They hope that their tragedy will spur immediate action to better protect kids from predators on the internet.