YouTube Removed 8.3 Million Videos For Violating Guidelines In Q4 Of 2017
YouTube has published its first-ever community guidelines enforcement report. According to the report, YouTube removed 8.3 million videos between October and December 2017. Along with the report, it also launched a Reporting Dashboard that lets users see the status of videos they’ve flagged for review.
This report comes after the company promised more transparency about how it handles abuse and decides which videos will be removed. In a blog post, the video-streaming giant said:
This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal and policy removal reasons.
Google-owned YouTube came under the cosh last year when it was reported that a lot of disturbing videos were masquerading as kid-friendly content. Advertisers, in turn, were upset that their commercials had played before videos with violent extremist content. The report also highlights the effectiveness of the platform’s automated systems: 6.7 million of the 8.3 million removed videos were first flagged for review by machines rather than by people.
The company’s AI-driven method of flagging videos has been criticised time and again, as many content creators have complained that they are unable to monetize their videos despite abiding by the guidelines set by the company. YouTube’s report, however, paints a different picture. In early 2017, about 8% of removed videos were taken down before anyone viewed them; after YouTube implemented machine learning, that figure rose to more than 50%.
The report seems like YouTube’s way of appeasing upset advertisers after the controversy over child-exploiting videos broke out. It is, however, an insight of sorts into how YouTube handles offensive content, and it should prove useful to everyone in the long run.