Facebook’s Monopoly Power Threatens Democracy Itself
By Laleh Ispahani
Earlier this month, Facebook’s Mark Zuckerberg came to Washington. He was there to testify about Libra, Facebook’s cryptocurrency initiative. But he faced a barrage of withering criticism on issues related to Facebook’s reign as the premier social media platform of our time, criticisms that he, as chief executive, should have been fully equipped to answer.
He was roasted about his company’s policy of accepting and distributing political ads, even those with demonstrably false content. He took flak for the company’s failure to police hate speech, the slow pace at which it responded to Russian interference during the 2016 elections, and its diversity policies. Even as he struggled in Washington, his company was under fire in some 47 states across the country, where Democratic and Republican attorneys general have joined forces to accuse Facebook of violating antitrust law.
Not long ago, Zuckerberg and his fellow tech platform pioneers were hailed as heroes. But the immense and unchecked power of these platforms has become untenable, as evidenced by the growing concern on both sides of the aisle. As Cory Doctorow, the famed blogger, author, and special advisor to the Electronic Frontier Foundation, describes it, what could have been a technological democracy has become more of a constitutional monarchy. Through acquisitions and market consolidation, a handful of companies have become behemoths too big to be effectively regulated: online giants that function as de facto state monopolies, blinded by their successes to the impact of their actions.
Today, merely logging onto a website means surrendering to a multitude of potential harms. Few bother to wade through the sites’ impenetrable terms of service, which Doctorow has dubbed “sprawling novellas of garbage.” Once on a site, the architecture makes it difficult to leave. A visitor is battered with misinformation and disinformation, content that incites violence, and discriminatory content that amounts to equal-opportunity abuse, targeting people by race, gender, religion, ethnicity, economic status, and sexual identity.
But perhaps the most pressing problem that has arisen as these platforms have amassed such market power is the epistemological crisis they have spawned: people have no way of knowing whether something they are reading is true.
These harms undermine the values we work to uphold at the Open Society Foundations—supporting the growth and development of inclusive and accountable democracies. What once was seen as a tool for opening up societies is in danger of becoming a vehicle for closing them.
At Open Society, we’ve experienced these dangers firsthand.
Following revelations that Facebook had hired a propaganda firm to discredit our founder and chair, George Soros, Open Society President Patrick Gaspard wrote an open letter to Facebook asserting that such underhanded ploys to avoid accountability “threaten the very values underpinning our democracy.” This past summer, Tom Perriello, executive director of Open Society-U.S., took on the transnational dimensions of the problem of platform power in an op-ed rebutting Facebook’s fear-mongering claim that breaking up big tech would open the door to digital domination by China.
These challenges affect us in significant ways, but they cast a far longer shadow on the communities we support across the globe. Facebook’s policies have enabled grave human rights abuses, including the Myanmar military’s use of the platform to incite genocide against Rohingya Muslims. An internet blackout imposed in Kashmir by the government of India this summer cut people off from the outside world, dividing families, interrupting medical care, and causing deaths.
Facebook is constantly having to adjust its policies and products to deal with a blizzard of complaints. These “fixes” seldom solve the problems they are meant to address. To address criticism of its handling of free speech, and of its role in amplifying violent rhetoric here and abroad, Facebook announced plans to establish an independent oversight board. But as Gideon Rachman of the Financial Times wrote, “it is surely beyond the capabilities of a single company to make nuanced judgments about political debate in hundreds of countries and languages.” Rachman went on to observe that Facebook, now 15 years old, has become “the world’s most powerful adolescent. Like many teenagers, it could benefit if adults set some limits.”
For 15 years, Open Society has maintained an abiding commitment to issues of democracy and technology. In the United States, we have focused on tech policy and surveillance; with our global colleagues, we work to challenge the surveillance-based business model of the dominant platforms, to counter disinformation, and to expose the ways algorithms perpetuate discrimination.
We were proud to participate in a recent gathering of NetGain, an initiative launched in 2015 by the nation’s leading foundations, which joined together to address the evolving challenges of the digital age. This joint commitment, the first of its kind in the philanthropic sector, underscores the importance of charting a more just digital future for the communities we champion, the grantees we support, and, indeed, the very future of the democratic experiment itself.
The time to address the many challenges of platform power is now. And, judging by Mr. Zuckerberg’s reception on Capitol Hill, concerns about allowing this power to go unchecked span the ideological spectrum. The NetGain gathering was a powerful reminder that we have the ability to fix information politics today by leveraging the power of movements and knitting together diverse constituencies that share a common concern about the corruption and inequality that have been the byproducts of monopoly since time immemorial. Together, we can forge the reforms we need to transform this tech monarchy into a true tech democracy.
Laleh Ispahani is managing director of Programs at the Open Society Foundations.