Internal Facebook documents showed that while the company enjoyed a boom of popularity in India in 2019, researchers were warning that its services were filled with religious hate speech pitting the nation’s Hindu and Muslim populations against each other.
That year, researchers monitored a test account from February to March and watched it quickly become awash with bigotry, misinformation and celebrations of violence, content that one report would eventually link to the deadly February 2020 religious riots in Delhi that killed 53 people, The Washington Post reports.
‘The test user’s News Feed has become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore,’ one Facebook researcher wrote in the report.
‘I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total.’
Yet the researcher’s recommendations to fix the problems were allegedly ignored because of ‘political sensitivities’: the groups behind the content had ties to India’s ruling party.
The internal documents were a part of a large cache of files collected and released by Facebook whistleblower Frances Haugen.
India is Facebook’s biggest market, with more than 300 million users, and its WhatsApp service has more than 400 million users there.
The deadly Delhi riots took place from February 23 to 29 in 2020 and left 53 dead
The riots came as the nation’s Hindu majority clashed with its Muslim minority
Facebook researchers said the growing amount of hate speech on the platform may have fueled the fires of the riots as it polarized its Hindu and Muslim users
Equality Labs found that more than a third of the hate speech on Facebook in 2019 was directed at Muslims. Researchers claim that Facebook was well aware of the problem
The internal documents revealed that Facebook saw a 300 per cent spike in inflammatory content leading up to the riots, with increasing calls to violence flooding Facebook and WhatsApp, its international text and calling service.
The documents echoed a 2019 study by Equality Labs, an international nonprofit that studies the causes of racial inequality, showing that Facebook was fully aware of the polarizing effect it was propagating in India.
Equality Labs’ research found that more than a third of the hate speech permeating Facebook was directed at India’s Muslim minority.
Facebook’s own interviews with users bore this out: Hindu users said they frequently saw posts vilifying their Muslim neighbors.
Similarly, Muslim interviewees said they had begun fearing for their lives because of all the hatred on Facebook.
‘It’s scary. It’s really scary,’ one Muslim man said in the report.
Many users told the researchers that it was ‘Facebook’s responsibility to reduce this content.’
India will be a ‘very difficult place to survive for everyone,’ another Muslim interviewee warned. ‘If social media survives 10 more years like this, there will be only hatred.’
Family members mourned those who died in the 2020 riots
The riots came as Muslims protested the nation’s new citizenship law. The protestors were attacked by Hindu counter-protestors
A man is pictured crying over the death of Mohammad MudAsir, who led one of the protests
Through their work, the Facebook researchers found that two Hindu nationalist groups with ties to India’s ruling party were leading the wave of anti-Muslim posts.
But when the researchers recommended that one of the groups be banned from Facebook, nothing happened, according to one report.
The other group, researchers said, promoted violence against Muslims and compared the minority group to ‘pigs’ and ‘dogs.’
That group also remains active on Facebook and was not labeled as ‘dangerous’ due to ‘political sensitivities,’ the report read.
The documents also included reports on bots and fake accounts tied to the country’s ruling party, which were sowing chaos and division on the platform.
Indian Prime Minister Narendra Modi has often made anti-Muslim claims as nationalist groups, like the ones mentioned in the report, use his speeches to propagate violence against the minority group.
Modi’s citizenship law, which excludes Muslim migrants from a fast-track path to citizenship, ignited the 2020 riots, as it was seen as a political move targeting Muslims.
Indian Prime Minister Narendra Modi, left, pictured hugging Facebook CEO Mark Zuckerberg. Facebook researchers said that Hindu nationalist groups linked to Modi were allowed to remain on the platform despite their use of it to promote hate and violence against Muslims
Facebook has several offices in India, like the one in Gurgaon, pictured. Employees in India were reportedly threatened with jail time by Modi’s administration
The Modi administration had even threatened to jail Facebook and Twitter employees earlier this year if they did not comply with take-down requests as the nation sought to crush political protest on social media, The WSJ reports.
Another researcher noted in a report that Facebook was ill-equipped to take down posts from the Hindu nationalist group Rashtriya Swayamsevak Sangh (RSS), because the group posted in Hindi and Bengali, which its systems had trouble translating.
RSS’s posts make claims that Muslims are to blame for COVID-19’s spread in India and that Muslim men try to lure Hindu women into marriage to convert them to Islam.
The documents said that RSS would not be removed due to ‘political sensitivities,’ the New York Times reports. Modi had worked for the RSS for decades.
Another Hindu nationalist group linked to Modi, the Bajrang Dal, was reported to have used WhatsApp to ‘organize and incite violence,’ but the group remains active on Facebook.
Researchers said the company balked at removing Bajrang Dal despite the warnings because doing so might have endangered Facebook’s staff and business prospects, as well as infuriated Modi’s party.
Facebook spokesman Andy Stone did not comment on the Hindu nationalist groups but said the company bans groups or individuals ‘after following a careful, rigorous, and multidisciplinary process.’
He told the WSJ that the research mentioned in the reports consisted of working documents, not complete investigations meant to be used for policy recommendations.
He added that Facebook has invested significantly in technology to find hate speech across the world, and that such posts have been declining on the platform.
Facebook did not immediately reply to DailyMail.com’s request for comment.
This is the second time Facebook has been accused of allowing hate speech against a Muslim minority to grow on its platform.
A Facebook post calling for the mass killings of Muslims in India similar to those taking place at the time in Myanmar, formerly known as Burma
An anonymous whistleblower complained that Facebook had not been aggressive enough when it came to military officials in Myanmar using the platform to spread hate speech during the mass killings of the Rohingya ethnic group in 2017.
In the most dramatic line of the whistleblower’s affidavit, the former employee anguished over Facebook’s failure to respond quickly to the killings of the Muslim group as military officials used the site to spread hate speech.
‘I, working for Facebook, had been a party to genocide,’ the whistleblower wrote.
Although Facebook had previously acknowledged its failure to act swiftly in the mass deaths of the Rohingya people, the company said it no longer makes such mistakes.
‘Facebook’s approach in Myanmar today is fundamentally different from what it was in 2017, and allegations that we have not invested in safety and security in the country are wrong,’ Facebook spokesperson McPike said in a statement.
Haugen (pictured testifying in Congress on October 5), who claims Facebook puts ‘profits before people,’ earlier this month released tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit
Facebook Whistleblower Frances Haugen’s testimony to Congress
During a Senate Commerce subcommittee hearing on October 5, whistleblower Frances Haugen called for transparency about how Facebook entices its users to keep scrolling on its apps, and the harmful effect it can have on users.
‘As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable,’ said Haugen, a former product manager on Facebook’s civic misinformation team. She left the nearly $1 trillion company with tens of thousands of confidential documents.
‘The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed,’ Haugen said.
Haugen revealed she was the person who provided the documents used in a Wall Street Journal investigation and in a Senate hearing on Instagram’s harm to teenage girls. She compared the social media services to addictive substances like tobacco and opioids.
Before the hearing, she appeared on CBS television program ’60 Minutes,’ revealing her identity as the whistleblower who provided the documents.
‘There were conflicts of interest between what was good for the public and what was good for Facebook,’ she said during the interview. ‘And Facebook over and over again chose to optimize for its own interests like making more money.’
Haugen, who previously worked at Google and Pinterest, said Facebook has lied to the public about the progress it made to clamp down on hate speech and misinformation on its platform.
She added that Facebook was used to help organize the Capitol riot on January 6, after the company turned off safety systems following the U.S. presidential elections.
While she believed no one at Facebook was ‘malevolent,’ she said the company had misaligned incentives.
In response to Haugen’s bombshell comments, a Facebook executive accused her of stealing company documents and claimed she is ‘not an expert’ on the company’s content algorithms.
Facebook Vice President of Content Policy Monika Bickert spoke out in an interview with Fox News, slamming Haugen a day after she testified to Congress.
Bickert said that Haugen ‘mischaracterized’ the internal studies regarding the harmful impacts of content on Facebook, Instagram and WhatsApp, which Haugen presented to Congress.
Haugen testified before Congress in early October, claiming Facebook promoted divisiveness as a way to keep people on the site and saying the documents showed the company had failed to protect young users.
The documents also showed that the company knew Instagram harmed young girls’ body image and had even tried to brainstorm ways to appeal to toddlers by ‘exploring playdates as a growth lever.’
Haugen, who anonymously filed eight complaints about her former employer with the US Securities and Exchange Commission, told 60 Minutes earlier this month: ‘Facebook, over and over again, has shown it chooses profit over safety.’
She claimed that a 2018 change prioritizing divisive posts, which made Facebook users argue, was found to boost user engagement.
That in turn helped bosses sell more online ads, fueling the social media giant’s rise past a $1 trillion valuation.
‘You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media,’ Haugen said.
She also blamed Facebook for spurring the January 6 Capitol riot.
Meanwhile, the senator leading a probe of Facebook’s Instagram and its impact on young people is asking Zuckerberg to testify before the panel that has heard far-reaching criticisms from a former employee of the company.
Sen. Richard Blumenthal, D-Conn., who heads the Senate Commerce subcommittee on consumer protection, called in a sharply worded letter Wednesday for the Facebook founder to testify on Instagram’s effects on children.
‘Parents across America are deeply disturbed by ongoing reports that Facebook knows that Instagram can cause destructive and lasting harms to many teens and children, especially to their mental health and wellbeing,’ Blumenthal said in the letter addressed to Zuckerberg.
‘Those parents, and the twenty million teens that use your app, have a right to know the truth about the safety of Instagram.’
In the wake of Haugen’s testimony early this month, Blumenthal told Zuckerberg, ‘Facebook representatives, including yourself, have doubled down on evasive answers, keeping hidden several reports on teen health, offering noncommittal and vague plans for action at an unspecified time down the road, and even turning to personal attacks on Ms. Haugen.’
Blumenthal did offer, however, that either Zuckerberg or the head of Instagram, Adam Mosseri, could appear before his committee.
‘It is urgent and necessary for you or Mr. Adam Mosseri to testify to set the record straight and provide members of Congress and parents with a plan on how you are going to protect our kids,’ he told Zuckerberg.
A spokesman for Facebook, based in Menlo Park, California, confirmed receipt of Blumenthal’s letter but declined any comment.
Haugen, who buttressed her statements with tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit, accused Facebook of prioritizing profit over safety and being dishonest in its public fight against hate and misinformation.
‘In the end, the buck stops with Mark,’ Haugen said in her testimony. ‘There is no one currently holding Mark accountable but himself.’