New Delhi: Facebook brought in emergency measures to curb misinformation towards the end of the 2019 Lok Sabha elections, according to internal company documents reviewed by The Wire.
These measures are known internally as ‘break-the-glass’ steps, a reference to how fire alarms are often housed in glass boxes that must be broken in an emergency. They were introduced partly in response to a rise in user reports.
Internal documents also describe how, ahead of the last round of polling, the company saw what it called an “escalation” from Bengal: videos that spoke of an “alleged Hindu exodus from certain areas under threat from Muslims.”
“A few days before round six of polling, an out of context West Bengal road accident video exhibited signs of virality. Captions in the violating posts depicted Bangladeshi and Rohingya migrants in West Bengal as ‘terrorists’ and ‘intruders’ and claimed they were attacking Central Forces,” Facebook employees wrote in an internal report titled ‘India Elections: A Case Study, Part 2’.
“IPOC UXR reinforced how harmful the video is as it was being taken out of context in order to target vulnerable populations.”
These insights and more come from documents which are part of disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of Facebook whistleblower Frances Haugen. The redacted versions received by Congress were reviewed by a consortium of news organizations, including The Wire.
The documents constitute an extensive array of internal research reports and corporate communications that offer an unparalleled look at how Facebook and WhatsApp serve as the canvas on which deep-rooted problems of conflict play out in a country.
‘Break-the-glass’ measures
Over the last few years, Facebook has developed specific measures to curb misinformation during emergency situations. These steps form a key part of its “election playbook” and are based on the company’s experiences from multiple countries including the United States and Brazil.
These ‘break-the-glass’ measures have since been used in conflict-stricken countries to stop bloodshed, and in the United States to curb the spread of content falsely declaring that Donald Trump had won the 2020 presidential election.
The ‘India Elections’ case study report noted that for the 2019 polls, two specific measures were deployed by the company.
First, Facebook downranked “all civic posts in India with a re-share depth of >=2”. Second, it reduced the “thresholds” for engagement classifiers in Hindi, English, Tamil and Bengali.
The first measure, The Wire understands, essentially involves ‘demoting’ certain types of content that have been shared heavily across the platform. In practical terms, this curbs the spread of such posts by ensuring they appear less often in a user’s News Feed.
The second step increases the likelihood that the company’s algorithms will act on specific content in those languages.
“We did not expect any significant publisher impact but lowered the thresholds gradually to monitor the effects,” the case study notes.
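The documents reviewed by The Wire do not spell out how these adjustments work under the hood. A minimal sketch of the general shape of the two measures, in which every name, threshold and penalty value is an illustrative assumption rather than anything drawn from Facebook’s systems, could look like this:

```python
# Hypothetical sketch of the two 'break-the-glass' measures described above.
# All names and values here are illustrative assumptions, not details taken
# from Facebook's internal documents.

from dataclasses import dataclass

# Assumed lowered classifier thresholds for the four languages named in
# the case study (illustrative values only).
LOWERED_THRESHOLDS = {"hi": 0.6, "en": 0.6, "ta": 0.6, "bn": 0.6}
DEFAULT_THRESHOLD = 0.8

DEMOTION_FACTOR = 0.5  # assumed strength of each demotion


@dataclass
class Post:
    is_civic: bool        # flagged as civic/political content
    reshare_depth: int    # how many re-shares deep this copy is
    language: str         # e.g. "hi", "en", "ta", "bn"
    misinfo_score: float  # output of an engagement/misinfo classifier
    base_rank: float      # score the feed would otherwise assign


def rank_with_btg(post: Post) -> float:
    """Apply the two break-the-glass adjustments to a post's feed score."""
    score = post.base_rank

    # Measure 1: demote civic posts with a re-share depth >= 2, so that
    # deep re-share chains surface less often in News Feed.
    if post.is_civic and post.reshare_depth >= 2:
        score *= DEMOTION_FACTOR

    # Measure 2: use a lowered classifier threshold for the named languages,
    # making the system act on borderline content it would normally pass.
    threshold = LOWERED_THRESHOLDS.get(post.language, DEFAULT_THRESHOLD)
    if post.misinfo_score >= threshold:
        score *= DEMOTION_FACTOR

    return score


# A deeply re-shared civic post in one of the four languages is penalised
# twice, so it surfaces far less often than an ordinary post.
post = Post(is_civic=True, reshare_depth=3, language="hi",
            misinfo_score=0.65, base_rank=1.0)
print(rank_with_btg(post))  # 0.25
```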
The same internal report says that Facebook demoted “all content” from 583 top “civic misinforming” groups in India, although it does not specify when this was done.
The action taken against top misinforming groups, the report noted, would reduce “3% of all known misinformation” in India.
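The report does not say how such a group-level demotion is implemented. One simple way to picture it, again purely as an illustrative sketch with assumed names and values, is a blanket ranking penalty on any post originating from a flagged group:

```python
from typing import Optional

# Stand-ins for the 583 flagged groups; the real identifiers are not
# named in the documents reviewed by The Wire.
FLAGGED_GROUP_IDS = {"group_a", "group_b"}
GROUP_DEMOTION_FACTOR = 0.3  # assumed penalty, not a documented value


def apply_group_demotion(score: float, source_group_id: Optional[str]) -> float:
    """Apply a blanket demotion to posts originating in a flagged group."""
    if source_group_id in FLAGGED_GROUP_IDS:
        score *= GROUP_DEMOTION_FACTOR
    return score
```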
Facebook response
When The Wire reached out to Facebook for comment on these emergency measures, a company spokesperson said only that the company was committed to “protecting and preserving the integrity of elections around the world”.
The spokesperson also directed The Wire to a section of the company’s website describing the measures it undertook for the 2019 general elections. That page, however, contains no details of the emergency measures.
When asked which groups make up the 583 top civic misinforming groups in India, the company did not provide a specific response.
Ad circumvention
One section of Facebook’s case study on the 2019 Indian elections also makes clear that the company is aware political parties may be using proxies to get around the limits placed on advertisement spending.
The study cites investigations by Huffington Post and Quartz into an organization called the ‘Association of Billion Minds’ (ABM) and its links to pages that appear to be run by ordinary fans of the Prime Minister but are, in fact, run by ABM.
In the internal report, Facebook’s employees admit that these media investigations showed the company’s “limitations in uncovering who is really behind an entity”, but add that the news organizations used the company’s transparency tools to help raise these questions.
While the BJP has officially distanced itself from ABM, Huffington Post India in 2019 described it as “Amit Shah’s personal election unit”.