
YouTube kids’ videos have suicide tips hidden inside

Do you know a kid who watches cute cartoons on YouTube? They might be getting lessons and tips for committing suicide.

A Florida doctor is warning parents about how cartoons on YouTube are being used to target and expose children to suicidal ideas and tips.

Dr. Free Hess, a pediatrician based in Gainesville, recently took to her blog to warn about a cartoon posted on the Google-owned, user-generated video platform that had a clip spliced into the middle, in which an adult man demonstrates for kids how to cut their wrists.

“Remember kids: sideways for attention, longways for results,” the man says in the clip.

He then points at the camera and shouts, “End it!”

The issue is finally getting attention from national media outlets such as CBS News and ABC.

Hess had previously spoken out about the same clip being spliced into a cartoon on the YouTube Kids app several months earlier. That clip was removed from the app after she pointed it out.

But Hess was more recently alerted to another cartoon video, this time posted to YouTube itself, with a similar clip spliced in. Comments below the video showed that viewers had been complaining about it for as long as eight months. At the time of Hess’ blog post, YouTube still had not removed the video.

In the days following the blog post and the ensuing media attention, however, YouTube removed the video, saying it violated the platform’s terms of service.

“Exposure to videos, photos, and other self-harm and suicidal promoting content is a huge problem that our children are facing today,” Hess wrote. “Suicide is the SECOND leading cause of death in individuals between the ages of 10 and 34, and the numbers of children exhibiting some form of self-harm is growing rapidly.”

Hess pointed out that a nationwide survey of high school students in the U.S. found that more than 15 percent have seriously considered suicide. According to the Centers for Disease Control and Prevention, the percentage of high schoolers who have seriously considered suicide has risen 25 percent since 2009.

“Every year 157,000 young people between the ages of 10 and 24 present to Emergency Departments for self-inflicted injuries and/or suicide attempts,” Hess stressed. “Many experts believe that access to self-harm and suicide-promoting content is making the situation worse. There have been several recent reports of teens committing suicide after viewing self-harm and suicide material online and on social media platforms.”

“More and more researchers are starting to look into how access to this type of material is linked to self-harm and suicide in adolescents,” she added. “One such study has just been commissioned and will hopefully give us some good insight into this issue.”

Hess warned that the removal of those two videos does not mean similar videos aren’t still available online.

She also said that the suicide-promoting content is not limited to a single channel; she has seen similar material spliced into cartoons across multiple YouTube channels.

“I’ve seen many different videos on both YouTube and YouTube Kids,” Hess said in an interview with Fox4 in Southwest Florida. “Things like self-harm, cutting, suicide, shooting and violent videos.”

In her blog post, Hess asserted that something must be done “now” to prevent the spread of the videos.

“We should start by educating ourselves, educating our children, and speaking up when we see something that is dangerous for our children,” Hess wrote. “We also need to fight to have the developers of social media platforms held responsible when they do not assure that age restrictions are followed and when they do not remove inappropriate and/or dangerous material when reported.”

A YouTube spokesperson said in a statement shared with ABC News that the company works hard to ensure that its platform is not used “to encourage dangerous behavior.”

“[W]e have strict policies that prohibit videos which promote self-harm,” the statement reads. “We rely on both user flagging and smart detection technology to flag this content for our reviewers.”

Every quarter, YouTube removes millions of videos and channels that “violate our policies,” according to the spokesperson.

“[W]e remove the majority of these videos before they have any views,” the statement assured. “We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report and give users a dashboard showing the status of videos they’ve flagged to us.”
