Children’s charity the NSPCC has accused YouTube of failing to tackle dangerous content on its youth channel.
YouTube Kids, billed as a safer, child-friendly version of the video-sharing site, has been criticised by parents for failing to remove cartoons containing clips that depict suicide methods.
The clips show a YouTuber demonstrating a suicide method.
Google told the BBC it works hard to remove such content.
“We have strict policies that prohibit videos which promote self-harm. We rely on both user-flagging and smart-detection technology to flag this content for our reviewers,” the firm said in a statement.
“We are always working to improve our systems and to remove violat[ing] content more quickly.”
It is unclear how or why the clips depicting suicide methods were embedded in children’s cartoons. The BBC has received no response from the YouTuber.
It also asked Google, which owns YouTube, if it had spoken to him directly but did not get a reply.
The story was first broken by BuzzFeed, which spoke to a mother who had seen such a clip in a cartoon her son was watching last summer. The video was taken down but has since reappeared.
In a blogpost she described what she saw: “Four minutes and forty-five seconds into the video, a man quickly walked on to the screen, held his arm out, and taught the children watching this video how to properly kill themselves. What did I just see? Did I really just see that? I immediately turned off the video,” she wrote.
“I’m a paediatrician, and I’m seeing more and more kids coming in with self-harm and suicide attempts. I don’t doubt that social media and things such as this are contributing,” she later told CNN.
Tony Stower, head of child safety online at the NSPCC, told the BBC: “Tech giants have a responsibility to protect children on their platforms, but YouTube and YouTube Kids keep failing to tackle disturbing videos like this.”
The charity added that such content “could be extremely dangerous if children copy what they see”.
The charity is urging the UK government to crack down on social networks by introducing a new law that forces them to keep children safe.
In its latest transparency report, Google said it had removed more than 7,845,000 videos from its platform between July and September 2018, 74% of which were removed before they had received any views.