
Preserving Open Society in a World Run by Algorithms

The City of London provides public Wi-Fi within the Square Mile financial district. But what information is collected by the city and its Wi-Fi provider, the Cloud, about the users of that service? How might the data on location and activities of Wi-Fi users be exploited? New research by Privacy International, one of 12 winners from 10 countries in the Quantified Society call for proposals, will investigate these questions.

As a society, we are increasingly aware of the specter of constant surveillance as an invasion of privacy. But what are the implications of large-scale data collection, retention, and analysis—by corporations or governments—for open society?

Today, data is being used to algorithmically determine an increasing number of aspects of our lives, from what we see in our digital news feeds, to the way law enforcement operates, to how public services are managed and allocated, to how well we, as a society, ensure protections against discrimination for the vulnerable and marginalized.

Researchers, journalists, and activists around the world are only beginning to grapple with the social justice and civil liberties implications of living in an increasingly “quantified” society. They are also just starting to document and respond to actual harms—many of which fall beyond the purview of existing safeguards such as data protection laws and government regulators. Ethical frameworks to guide us in a quantified world are rare to nonexistent.

In an effort to shed light on these emerging concerns, in February 2015, the Open Society Foundations, the Ford Foundation, and the Media Democracy Fund launched a call for proposals to investigate the implications of algorithmic decision making for open society issues. Intentionally broad in scope, the call sought applicants working on a range of issues from outside North America; we hoped to attract researchers, investigative journalists, and activists looking at these questions from their own cultural, political, and regulatory contexts. 

The response, from teams and individuals operating around the world, was both heartening and sobering: an increasing number of actors are seeking to understand the new pressures on efforts for social justice in an era of mass quantification. But we also saw, from the range of proposals, the scale of the questions being asked.

In Poland, the Panoptykon Foundation will investigate the implications of algorithmically driven categorization and resource distribution to the country’s 1.8 million unemployed citizens. In Argentina, Fundación Vía Libre will explore the impact of algorithmic decision making on the right to education, the right to privacy, and control of personal data through an examination of the public school placement system in Buenos Aires. And LIRNEasia of Sri Lanka will scrutinize the use of “data exhaust,” or behavior data, to influence public policy on a range of issues.

Few aspects of life will remain untouched by this fundamental shift in how the world around us is rendered as information. The 12 projects from 10 countries supported through this call—funded from a $270,000 pool—will give us only a window onto a much broader and still emerging field of work.

