YouTube deleted 5 million inappropriate videos in three months
YouTube removed nearly 8.3 million videos from its platform between October and December 2017 for violating its content policy. Almost 5 million of these videos were removed before anyone saw them, the video-sharing website said on Monday, stressing that automated flagging was “paying off” by helping remove videos more quickly.
The website has released such a report for the first time, after facing criticism in recent years from governments and advertisers for not doing enough to remove extremist and inappropriate content.
The company introduced “machine learning flagging” – deploying software to help identify inappropriate videos – in June 2017. Since then, the share of removed videos taken down before they reached 10 views has risen to more than half, compared with 8% at the beginning of 2017, YouTube said in a blog post.
“Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed,” YouTube said. “And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).”
About 1.6 million videos were removed after users, activist organizations or governments flagged them. Users in India flag the most videos, followed by those in the United States, the company said.
Almost 31% of all videos flagged by humans are reported for sexual content, 26% are flagged as misleading or spam, and about 16% are reported as hateful or abusive.