According to The Verge, between June and August YouTube removed 500 million comments, 100,000 videos, and over 17,000 channels. This is the result of the popular platform's battle with hate speech: after the site's rules were tightened, the number of interventions rose to several times the level of previous months.
YouTube estimated that the number of interventions against video content was twice as high as in the first quarter of 2019, and five times as high for user comments. Much of the removed content involved ethnic hatred or was classified as inappropriate for minors. According to The Verge, YouTube also highlighted the performance of the self-learning systems it uses: thanks to them, as much as 80 percent of the removed content was taken down before it was ever viewed by users.
Google changes YouTube’s regulations
YouTube began changing its rules for published content in mid-2017. This was a response to the criticism the company had faced for several years over content appearing on YouTube: videos, comments, and entire channels, run, among others, by so-called patostreamers.
In mid-June, Google CEO Sundar Pichai admitted that the company seriously intends to change this situation. He emphasized that fighting hate speech on the popular platform is one of the most difficult tasks he has ever faced. In an interview with CNN, he also admitted that YouTube is currently too big to completely eliminate hateful content.
It is estimated that nearly 2 billion registered users visit YouTube every month, and more than 500 hours of video are uploaded to the platform every minute.