Dozens of Rohingya refugees in the UK and US have sued Facebook, accusing the social media giant of allowing hate speech against them to spread.
They are demanding more than $150bn (£113bn) in compensation, claiming Facebook’s platforms promoted violence against the persecuted minority.
An estimated 10,000 Rohingya Muslims were killed during a military crackdown in Buddhist-majority Myanmar in 2017.
Facebook, now called Meta, did not immediately respond to the allegations.
The company is accused of allowing “the dissemination of hateful and dangerous misinformation to continue for years”.
In the UK, a British law firm representing some of the refugees has written a letter to Facebook, seen by the BBC, alleging:
- Facebook’s algorithms “amplified hate speech against the Rohingya people”
- The company “failed to invest” in moderators and fact checkers who knew about the political situation in Myanmar
- The company failed to take down posts or delete accounts that incited violence against Rohingya
- It failed to “take appropriate and timely action”, despite warnings from charities and the media
In the US, lawyers filed a legal complaint against Facebook in San Francisco, accusing it of being “willing to trade the lives of the Rohingya people for better market penetration in a small country in Southeast Asia.”
They cite Facebook posts that appeared in an investigation by the Reuters news agency, including one in 2013 stating: “We must fight them the way Hitler did the Jews.”
Another post said: “Pour fuel and set fire so that they can meet Allah faster.”
Facebook has more than 20 million users in Myanmar. For many, the social media site is their main or only way of getting and sharing news.
Facebook admitted in 2018 that it had not done enough to prevent the incitement of violence and hate speech against the Rohingya.
This followed an independent report, commissioned by Facebook, that said the platform had created an “enabling environment” for the proliferation of human rights abuse.
Meta haunted by past mistakes
What happened in Myanmar was one of Facebook’s first red flags.
The social media site was hugely popular there – but the company didn’t fully understand what was happening on its own platform. It wasn’t actively moderating content in local languages such as Burmese and Rakhine.
If it had, it would have seen anti-Muslim hate speech and disinformation about supposed terrorist plots by the Rohingya. Critics say this helped fuel ethnic tensions that spilled over into brutal violence.
Mark Zuckerberg has personally admitted to mistakes in the run-up to the widespread violence there.
That’s what makes this lawsuit particularly interesting – Facebook isn’t denying that it could have done more.
Whether or not that means they are legally culpable is a very different question though. Could this lawsuit get anywhere? It’s possible, though unlikely.
But as its parent company, Meta, tries to turn the focus away from Facebook, it finds itself still haunted by past mistakes.
The Rohingya are seen as illegal migrants in Myanmar and have been discriminated against by the government and public for decades.
In 2017, the Myanmar military launched a violent crackdown in Rakhine state after Rohingya militants carried out deadly attacks on police posts.
Thousands of people died and more than 700,000 Rohingya fled to neighbouring Bangladesh. There are also widespread allegations of human rights abuses, including arbitrary killing, rape and burning of land.
In 2018, the UN accused Facebook of being “slow and ineffective” in its response to the spread of hatred online.
Under US law, Facebook is largely protected from liability over content posted by its users. But the new lawsuit argues the law of Myanmar – which has no such protections – should prevail in the case.
The BBC has asked Meta for comment.
Source: BBC