YouTube deleted over 58 million videos and 224 million comments from its platform during the third quarter of 2018 (between July and September), according to a post on its official blog. The videos and comments were removed for violations of YouTube’s community guidelines.
YouTube says it is serious about accurately detecting and quickly removing content that violates its Community Guidelines, so much so that in April 2018 it launched the quarterly YouTube Community Guidelines Enforcement Report.
Recently, the scope of the report expanded to include additional data like channel removals, the number of comments removed, and the policy reason why a video or channel was removed.
To cope with the monumental volume of content uploaded to YouTube every day, the platform uses automated detection tools in addition to human reviewers to help quickly identify spam, extremist content, and nudity. It currently employs over 10,000 reviewers.
From July to September 2018, 7.8 million videos were deleted from YouTube. Of these, 81% were first detected by machines, and 74.5% of the machine-detected videos had never received a single view.
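To put those percentages in absolute terms, here is a quick back-of-the-envelope calculation based on the rounded figures above (the exact counts in YouTube's report may differ slightly from these products):

```python
# Figures reported for Q3 2018 (July-September)
total_removed = 7_800_000      # videos deleted from YouTube
machine_share = 0.81           # share first flagged by automated systems
never_viewed_share = 0.745     # share of machine-flagged videos with zero views

machine_detected = total_removed * machine_share
removed_before_any_view = machine_detected * never_viewed_share

print(f"Machine-detected removals: {machine_detected:,.0f}")
print(f"Removed before a single view: {removed_before_any_view:,.0f}")
```

In other words, roughly 6.3 million of the removed videos were first flagged by automated systems, and about 4.7 million of those were taken down before anyone watched them.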
In September alone, over 90% of both the nearly 10,400 videos removed for violent extremism and the roughly 279,600 videos removed for child safety issues received fewer than 10 views.
The vast majority of attempted abuse comes from bad actors trying to upload spam or adult content: over 90% of the channels and over 80% of the videos removed in September 2018 were taken down for violating YouTube's policies on spam or adult content.
As with videos, the combination of automated detection and human reviewers is also used to flag, review, and remove spam, hate speech, and other abuse in YouTube's comments section.
In the same quarter, YouTube removed over 224 million comments for violating its Community Guidelines, the vast majority of them spam.