YouTube has announced new plans to protect child users of the video site, after a wave of predatory content and behaviour has been unearthed.
YouTube has built its brand on its family-friendly nature, with parents comfortable leaving their kids to browse the site and watch Let’s Play and Minecraft videos, among others. Indeed, YouTube’s minimum age to create an account is 13, with teen vloggers and fans a key part of how the site has grown.
In order for that to be possible, though, a number of safety measures have to be in place to make sure that children can access a version of YouTube free of inappropriate content and behaviour. An investigation by the BBC this month revealed a loophole in one of those systems: YouTube’s process for reporting sexualised comments on children’s videos relies partly on reports of abuse by the public, but volunteer moderators told the BBC that this has not been working correctly for more than a year. While YouTube reviews the majority of reports within 24 hours, the links attached to the reports are not always correctly included, which means that the offending comment cannot always be precisely located. Moderators told the BBC there could be up to 100,000 predatory accounts leaving indecent comments on videos.
The comments range from the sexually explicit to phone numbers posted by adults, and have been left on innocuous, everyday videos uploaded by young people on the site.
This is not the first time YouTube has come under fire for inappropriate content, with videos found earlier this year that were designed to look like popular kids cartoons but contained disturbing or adult content not suitable for children. Some were parodies or clearly intended for adults, but concerns were raised about the risk of children accidentally watching them unawares.
In response to this latest scandal, a number of large brands have pulled advertising from the site over concerns that their ads were being displayed alongside content being exploited by paedophiles. Mars, Cadbury, Lidl, Deutsche Bank and Adidas are among the corporate giants who have removed their advertising.
A Mars spokesperson told The Guardian: “We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content. We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally. Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
YouTube has responded by beginning to purge channels and videos it deems predatory and announcing a new wave of protective measures. Johanna Wright, VP of Product Management, wrote on the YouTube blog that the company was aware of “a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not”.
“While some of these videos may be suitable for adults, others are completely unacceptable, so we are working to remove them from YouTube,” she added.
Wright said YouTube is implementing tougher application of its Community Guidelines and faster enforcement through technology.
“We have always had strict policies against child endangerment, and we partner closely with regional authorities and experts to help us enforce these policies and report to law enforcement through NCMEC. In the last couple of weeks we expanded our enforcement guidelines around removing content featuring minors that may be endangering a child, even if that was not the uploader’s intent,” she wrote.
In the last week, more than 50 channels have been terminated and thousands of videos have been removed.
YouTube has also implemented policies to age-restrict content that features family entertainment characters but contains mature themes or adult humour, making it available only to logged-in users aged over 18. The site is using machine learning technology and automated tools to find such videos and escalate them for human review.
Back in June, YouTube posted an update to its advertiser-friendly guidelines making it clear that it would remove ads from any content depicting family entertainment characters engaged in violent, offensive, or otherwise inappropriate behaviour, even if done for comedic or satirical purposes. Since June, Wright said, ads have been removed from 3 million videos under this policy, and the site has further strengthened its application to remove ads from another 500,000 videos that violated it.
On blocking inappropriate comments on videos featuring minors, Wright wrote:
“We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors. Comments of this nature are abhorrent and we work with NCMEC to report illegal behavior to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
YouTube also re-emphasised the importance of its YouTube Kids platform, announcing plans to release a comprehensive guide to creating content suitable for the kid-friendly app.
The site is also doubling the number of Trusted Flaggers it partners with to help find and assess videos that may not be appropriate for children.
“These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge,” said Wright. “We’re wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right. As a parent and as a leader in this organization, I’m determined that we do.”