| 24 May 2024, Friday |

U.S. Court orders Facebook to release anti-Rohingya content records for genocide case

A federal court in the United States has ordered Facebook (FB.O) to release records of accounts linked to anti-Rohingya violence in Myanmar that the company had shut down, dismissing the company’s privacy argument as “rich with irony.”

According to a copy of the judgment, a court in Washington, D.C., chastised Facebook for failing to hand over evidence to authorities seeking to prosecute Myanmar for international crimes against the Muslim minority Rohingya.

Facebook had refused to hand over the information, claiming that doing so would violate a U.S. statute prohibiting electronic communication services from disclosing their users’ communications.

However, the judge ruled that the deleted posts were not covered by the legislation, and that withholding the material would “compound the tragedy that has befallen the Rohingya.”

“The irony of Facebook taking up the banner of privacy rights is palpable. News outlets have dedicated whole sections to Facebook’s tumultuous history of privacy problems,” he wrote.

Facebook is reviewing the decision, a spokeswoman said, adding that the company had previously made “voluntary, lawful disclosures” to another U.N. body, the Independent Investigative Mechanism for Myanmar.

More than 730,000 Rohingya Muslims fled Myanmar’s Rakhine state in August 2017 after a military crackdown that refugees said included mass killings and rape. Rights groups documented killings of civilians and the burning of villages.

Myanmar authorities say they were battling an insurgency and deny carrying out systematic atrocities.

The crackdown by the army, during the rule of Nobel laureate Aung San Suu Kyi’s civilian government, did not generate much outcry in the Buddhist-majority nation, where the Rohingya are widely derided as illegal immigrants from Bangladesh.

Gambia wants the data for a case it is pursuing against Myanmar at the International Court of Justice (ICJ) in The Hague, accusing Myanmar of violating the 1948 U.N. Genocide Convention.

In 2018, U.N. human rights investigators said Facebook had played a key role in spreading hate speech that fueled the violence.

A Reuters investigation that year found more than 1,000 examples of hate speech on Facebook, including posts calling Rohingya and other Muslims dogs, maggots and rapists, suggesting they be fed to pigs, and urging that they be shot or exterminated.

Facebook said at the time it had been “too slow to prevent misinformation and hate” in Myanmar.

In Wednesday’s ruling, U.S. magistrate judge Zia M. Faruqui said Facebook had taken a first step by deleting “the content that fueled a genocide” but had “stumbled” by not sharing it.

“A surgeon that excises a tumor does not merely throw it in the trash. She seeks a pathology report to identify the disease,” he said.

“Locking away the requested content would be throwing away the opportunity to understand how disinformation begat genocide of the Rohingya and would foreclose a reckoning at the ICJ.”

Shannon Raj Singh, human rights counsel at Twitter, called the decision “momentous” and “one of the foremost examples of the relevance of social media to modern atrocity prevention & response.”

  • Reuters