YouTube recently issued an update on its efforts to crack down on hate speech. The video-streaming platform stated that it has removed over 100,000 videos and 17,000 channels for hate speech.
These are huge numbers by any stretch of the imagination. According to YouTube’s policies, groups that project superiority over others to justify discrimination, segregation, or exclusion based on age, gender, race, caste, religion, sexual orientation, or veteran status can be classified as hate speech groups, and the platform will clamp down on them.
In addition, YouTube will also remove videos glorifying Nazi ideology and violent events such as the Holocaust.
However, the large number is also indicative of the problem – YouTube has repeatedly been accused of having a shoddy approach towards hate speech on its platform. The company claims that it has 10,000 people monitoring the platform for hate speech. It also says that it is improving its policies to take videos down before they reach a wide audience.
But the crackdown still falls short. Hate speech has not disappeared from the platform, and there is a lot that remains to be done. As Sundar Pichai, CEO of YouTube’s parent company Alphabet, has put it, YouTube’s scope spans “all of the internet.”
The videos and channels taken down seem to be ones targeted at an English-speaking American audience. YouTube, however, is available worldwide, and uploaded videos come in many languages, not just one. The company has announced no comparable effort to crack down on hate speech in other languages and geographies. It has picked the low-hanging fruit.
To expand on that, consider the unique case of Pakistan. It is the country where YouTube was banned for more than a year after the protests against “Innocence of Muslims.”
It is also the country where, last year, a fundamentalist cleric, Khadim Hussein Rizvi, protested against the release of Asia Bibi, a Christian woman accused of blasphemy. The courts found her innocent and ordered her release, which triggered huge protests helmed by Rizvi, who called for the judges to be killed for passing such a judgement.
He was then placed under house arrest for some time. Since his release, he has regularly posted videos calling for the killing of non-Muslims, especially Hindus, who form a minority in Pakistan and a majority in neighbouring India.
At the time of writing, his channel has 207,760 subscribers and 1,131 videos.
This is just one instance of the company’s policies floundering across geographies and languages. It exposes the limits of YouTube’s content monitoring, despite its large team of reviewers.
YouTube’s policies will similarly need to be applied across different geographies and languages to be effective.
Furthermore, the definition of what constitutes hate speech needs to be expanded. Specifically, the company has yet to confirm what it will do when criminal intent is expressed in a video, as it was by the Christchurch attacker. Too often, the company reacts after the fact, as it did in Christchurch.
It also needs a better reporting mechanism: a user who reports a video should be told the grounds on which the report is accepted or rejected, and what action is being taken if it is accepted.
While banning a large number of videos is a start, much remains to be done before the video-streaming platform gets a handle on hate speech.