YouTube removes 8 million videos in 3 months as crackdown continues
David Farnor | On 25, Apr 2018
YouTube removed 8 million videos in just three months, as it steps up its crackdown on videos violating its terms and conditions.
The video giant has been under increasing pressure to tighten its standards, after a number of advertisers were alarmed to discover their ads had played in front of videos with extremist or other inappropriate content. With vloggers such as Logan Paul also uploading controversial videos and bringing the site’s reputation into question, YouTube has been investing in both people and computers to detect and review problematic uploads that violate its community standards. As part of this push, YouTube has confirmed that it removed over 8 million videos in the final quarter of 2017. The majority of these were spam or attempts to upload adult content, and they represent a fraction of a percent of YouTube’s total views during that three-month period.
The increase in scale is notable: at the beginning of 2017, 8 per cent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views, but with machine learning introduced in June 2017, more than half of the videos removed for violent extremism now have fewer than 10 views. In the last three months of 2017, 76 per cent of the 6.7 million videos flagged for review were removed before they received a single view.
YouTube’s crackdown is unlikely to resolve complaints from creators who say that the rules are being haphazardly applied and penalising people with non-violative content, who are finding it harder to monetise their work. YouTube, however, insists that its system uses algorithms to flag videos, before a panel of human reviewers assesses each one.
“Our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam),” says the video giant in a blog post.
Last year, it committed to bringing the total number of people working to address violative content to 10,000 across Google by the end of 2018. This is being achieved by hiring full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and by expanding regional expert teams. The majority of the additional roles needed to meet that goal have already been filled. YouTube is also investing in the network of over 150 academics, government partners, and NGOs that help its work, including the International Center for the Study of Radicalization at King’s College London, the Anti-Defamation League, and the Family Online Safety Institute, as well as child-safety focused partners, such as Childline South Africa, ECPAT Indonesia, and South Korea’s Parents’ Union on Net.
YouTube’s steps are being outlined in a new quarterly Community Guidelines Enforcement Report, which details its efforts and their outcome – a move towards public disclosure that, at a time of data privacy scandals, will put pressure on Facebook and Twitter to be equally transparent.
“This regular update will help show the progress we’re making in removing violative content from our platform,” says YouTube. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons.”
A Reporting History dashboard is also being introduced to allow each YouTube user to see the status of videos they’ve flagged for review against YouTube’s Community Guidelines.
“We are committed to making sure that YouTube remains a vibrant community with strong systems to remove violative content and we look forward to providing you with more information on how those systems are performing and improving over time,” adds the site’s blog post.