
Hindu Nationalist Groups Exploiting YouTube to Target Muslims, Women: Report

The report says the "targeting of Muslims" by backers of the BJP and other right-leaning Hindu nationalist groups is the "most troubling abuse of YouTube" in the country.
Silhouettes of laptop and mobile device users are seen next to a screen projection of the YouTube logo in this picture illustration. Photo: Reuters/Dado Ruvic

New Delhi: Social media has intensified religious intolerance in India, says a report by the NYU Stern Center for Business and Human Rights which highlights the “targeting of Muslims” by backers of the ruling Bharatiya Janata Party and other right-leaning Hindu nationalist groups as the “most troubling abuse of YouTube” in the country.

The report, titled ‘A Platform “Weaponized”: How YouTube Spreads Harmful Content – And What Can Be Done About It’, outlines YouTube’s role in spreading political disinformation and public health myths, and in inciting violence. Though it largely focuses on developments in the US – such as misinformation regarding the COVID-19 pandemic and the election fraud conspiracy theories promoted by Donald Trump – the report also has smaller sections on India, Brazil and Myanmar.

It says “organised misogynists in South Korea, far-right ideologues in Brazil, anti-Muslim Hindu nationalists, and supporters of Myanmar’s oppressive military regime have all exploited YouTube’s extraordinary reach to spread pernicious messages and rally likeminded users”.

The NYU Center for Business and Human Rights has been publishing reports on the effects of social media on democracy since Russia’s alleged attempts to meddle in the 2016 US presidential campaign through Facebook, Twitter, and YouTube. The new report says YouTube “served as a megaphone” for Vladimir Putin’s disinformation about Ukraine and its relations with the West for years before Russia invaded its western neighbour.

In the India section, the report flags conspiracy theories spread at the beginning of the coronavirus pandemic claiming that Muslims were purposefully spreading the virus as a form of ‘jihad’; the targeting of Muslim vendors; and the presence of several channels that thrive on Islamophobic content which demeans Muslims and incites violence against them.

“Religious intolerance long predated the arrival of YouTube in India, but widespread social media use has intensified the hostility,” the report says.

India is YouTube’s biggest market.

Left and right: examples of ‘coronajihad’ cartoons circulated by Hindutva groups; centre: a Nazi propaganda poster blaming Jews for the spread of lice. Source: Lakshmi Murthy, ‘The Contagion of Hate in India’ and US Holocaust Museum

The report also says “menacing online attacks on women often blend with anti-Muslim themes in India”, saying a spate of misogynistic rants by “nationalistic Indian YouTube influencers have made such invective popular on the platform”. It links to a report about popular YouTube personalities who issued physical threats to women. “When YouTube deleted some of the misogynistic accounts, creators simply started new ones. Another genre of hate videos features photos of Muslim women gleaned from public sources and facetiously puts the subjects ‘up for sale’, sometimes leading to abusive comments and talk of rape,” it says, referring to the Sulli Deals and Bulli Bai apps.

Prateek Waghre, a researcher with the Takshashila Institution, told the authors of the report that YouTube faces a difficult task in monitoring and taking action against accounts which spread pernicious messages.

“You will rarely find a YouTube content creator who sticks to just one language, or especially just English. They are most likely switching between at least English, Hindi, probably a bunch of other languages as well,” Waghre said. The problem is further complicated because these linguistic jumps occur not just between videos but “sometimes within a single sentence”.

The report also says the Union government’s new IT Rules, which give it the authority to remove content on social media platforms, complicate these challenges.


The researchers say that less is known about YouTube than about Facebook or Twitter, mostly because it is more difficult and expensive to analyze a large volume of videos than it is to search for words or phrases in a text dataset of Facebook or Twitter posts. They also say the company makes itself “almost inscrutable”.

The report discusses the “rabbit hole” effect – people finding more and more extreme videos through YouTube’s recommendations. While the researchers cite an NYU study to say that YouTube recommendations did nudge both Republicans and Democrats toward more conservative and more homogenous content, this push was “very mild on average”.

“In a small fraction of cases,” the researchers say, “users did ‘fall down a rabbit hole’ in which they were only shown relatively extreme recommendations for the duration of their time on the platform.”

Another researcher, also cited by the new report, said, “The danger lies not in the average user experience but the ability of people inclined toward extremism to easily find what they’re looking for or project violent intentions.”

Brendan Nyhan, a Dartmouth political scientist, told the report’s authors he was less worried about unwitting users “tumbling down rabbit holes than about small numbers of extremists who know what they’re looking for and find it by means of YouTube’s search function or by subscribing to channels dedicated to conspiratorial or hateful content”.

The report makes several recommendations – to YouTube and the US government – to counter these problems. It asks YouTube to disclose more information about how the platform works; facilitate greater access to data that researchers need to study YouTube; expand and improve human review of potentially harmful content; and invest more in relationships with civil society and news organisations.

It also recommends that the US government allocate political capital to reducing the malign side effects of social media and enhance the Federal Trade Commission’s authority to oversee social media platforms.
