Rohingya refugees have launched two class action lawsuits valued in excess of $150 billion against Meta, formerly known as Facebook, for its role in sparking the Rohingya genocide in Myanmar.
The lawsuits, which were filed anonymously in the UK and US, allege that the Facebook platform widely disseminated anti-Rohingya hate speech and misinformation while also helping users incite violence, which culminated in the rape, death, and torture of hundreds of thousands of Rohingya.
The Rohingya people are a Muslim minority that has historically lived in present-day Myanmar, but has faced persecution and human rights violations from the local government and extremist Buddhists in the country.
The UK lawsuit was brought on behalf of all non-US resident Rohingya survivors around the world, while the separate US claim will be for the Rohingya community based in the US.
In the originating complaint [PDF] for the US lawsuit, the refugees allege that Facebook executives allowed posts by the Myanmar government ordering hits on Rohingya people to remain on the platform, despite having known about their existence for years.
The complaint claims that Facebook’s algorithms recommended that susceptible users join extremist groups, where they were conditioned to post ever more inflammatory and divisive content, which in turn inflated the user engagement figures Facebook presented to financial markets. In doing so, Facebook’s News Feed allegedly prioritised and rewarded the radical users who sparked the Rohingya genocide, as such negative content generated the most engagement on the platform.
“To maximise engagement, Facebook does not merely fill users’ News Feeds with disproportionate amounts of hate speech and misinformation; it employs a system of social rewards that manipulates and trains users to create such content,” the plaintiffs wrote in the complaint.
“At the core of this complaint is the realisation that Facebook was willing to trade the lives of the Rohingya people for better market penetration in a small country in Southeast Asia.”
According to the complaint, Facebook only conceded in 2018 that its platform was used to foment division and incite offline violence after the Rohingya genocide had already happened. This was despite Facebook being warned from around 2013 onwards about extensive anti-Rohingya posts, groups, and accounts on its platform, the complaint said.
The complaint also states that the social media platform has continued to fail to curb misinformation in Myanmar, with many fake accounts allegedly still in existence that create inauthentic activity supporting the Myanmar military.
As redress for these alleged harms, both class action lawsuits are seeking more than $150 billion in reparations for the wrongful death, personal injury, pain and suffering, emotional distress, and loss of property suffered by the Rohingya people.
Although one of the cases has been filed in the US, the refugees are seeking to have that matter tried under Burmese law, so that Meta cannot invoke Section 230 of the Communications Decency Act, which could otherwise allow the tech giant to avoid liability for content posted by users on Facebook.
Under Burmese laws, companies can be liable for content on their social media platforms if it incites violence and contributes to genocide.
The lawsuits follow Facebook whistleblower Frances Haugen's criticism of Facebook over its alleged use of opaque algorithms to spread harmful content. Her accusations that these algorithms could trigger a growing number of violent events, such as the attack on the US Capitol Building last January, were cited in the lawsuit.
Similar arguments have been made in the UK lawsuit. For that matter, the Rohingya refugees' legal representatives have so far given Facebook formal notice of their intention to initiate proceedings.