
New SEC Complaint Says Meta Misled Shareholders over Myanmar Hate

Rohingya Muslim refugees look at a cellphone at the Kutupalong refugee camp in Bangladesh on January 14, 2018. © Manish Swarup/AP

Brutal killings. Torture and abuse. Villages set on fire and destroyed. The vicious campaign launched by the armed forces of Myanmar against Rohingya Muslims in August 2017 resulted in an estimated 700,000 people fleeing as refugees across the border into neighboring Bangladesh.

The military officer who commanded that ethnic cleansing—General Min Aung Hlaing—is now head of Myanmar’s military government. He is also the subject of a prosecutorial request for an arrest warrant for crimes against humanity, currently pending before judges at the International Criminal Court.

Separately, in 2022 the United States government made a rare formal determination that the 2017 campaign against the Rohingya, which followed decades of persecution, constituted genocide.

But the question of responsibility for what happened to the Rohingya in 2017 extends beyond the crimes directly carried out by the military: it must include those responsible for accelerating and magnifying the wave of public hatred and incitement to violence launched online against the Rohingya both before and during the military atrocities.

A significant portion of this material was transmitted on Facebook, then and now the dominant social media platform in Myanmar; the platform has around 18 million users out of roughly 24 million total internet users in the country today. A 2018 report by the UN Independent International Fact-Finding Mission on Myanmar found that, at the time of the atrocities in 2017, Facebook was synonymous with the internet in Myanmar and was therefore the main source of online news and information.

Amnesty International’s research concluded that the algorithms deployed by Meta to recommend content to users had the effect of maximizing attention to posts that incited hate and violence against the Rohingya—even though those posts violated Meta’s own community standards. In addition, hate speech against the Rohingya was rampant on Facebook in the lead-up to 2017, and yet, even though Meta’s policies prohibit this type of content, Meta’s willful blindness, or at a minimum its careless ignorance, meant that not enough was done to remove such content from the platform. As a result, Amnesty International found that Meta’s “content-shaping algorithms fueled the mass violence perpetrated against the Rohingya.”

Maung Sawyeddollah, a Rohingya youth activist and refugee who reported posts using slurs about Muslims to no effect, told Amnesty International he “realized that it is not only these people—the posters—but Facebook is also responsible. Facebook is helping them by not taking care of their platform.”

Since 2021, several legal actions brought on behalf of Rohingya in courts in the United States and Europe have highlighted the role of Meta in spreading this wave of hate. These cases have included a class-action lawsuit in San Francisco arguing that Meta’s design contributed to the harm suffered by the Rohingya; that case is currently before the US Court of Appeals for the Ninth Circuit, on appeal from the district court’s dismissal on statute of limitations grounds.

Now, the Open Society Justice Initiative has joined Amnesty International and Victims’ Advocates International in bringing a new complaint—this time to the United States Securities and Exchange Commission (SEC), the U.S. regulator responsible for enforcing securities laws to protect investors and to maintain fair, orderly, and efficient markets.

The three organizations have filed a whistleblower complaint with the SEC on behalf of Maung Sawyeddollah, alleging that Meta misrepresented its role in the attacks against the Rohingya in statements to shareholders made between 2015 and 2017. Even though civil society actors warned Meta on numerous occasions about the way Facebook was fueling hate in an already volatile environment, the company either withheld the full truth or at times lied outright to its investors.

This is not the first complaint to the SEC over the failures of Meta’s content moderation. In 2021, Frances Haugen, a former Meta employee turned whistleblower, filed a complaint based on a cache of internal company documents. But the current filing is the first that puts the situation of the Rohingya and the role of Meta in Myanmar at its center. 

Clearly, the changing political environment in the United States will be a factor in how the SEC responds, including the increasing pushback by some in corporate America against applying business and human rights standards to corporate behavior. But in the eyes of the Rohingya community, Meta has yet to face consequences for its role in fomenting and contributing to mass violence, and the SEC is one of the few bodies with the power to make this happen. Amid a rise in lawsuits against social media platforms generally, this complaint aims to put another chink in the legal armor that has shielded Meta from accountability to date.

What is at stake goes beyond the specifics of the violence unleashed upon the Rohingya, horrific as that was. In a world increasingly susceptible to atrocities fueled by social media giants, the question of Meta’s responsibility looms large.
