
Polling and Democracy in Georgia

Public opinion polling plays a major role in democracies. It gives governments information about the public’s needs and shows citizens where they stand relative to one another. It is employed all over the world, and it sometimes creates problems for the public and for pollsters alike.

In the Republic of Georgia’s 2012 parliamentary elections, what appeared to be competing poll results confused voters and damaged the research industry. With a presidential election scheduled for later this year, repairing that damage has become critical.

Georgia’s historic 2012 parliamentary elections involved an intense campaign between the government and a single opposition party. Election polls differed widely from one another throughout the campaign, and from the election’s final results. In the wake of the election, the polls came under intense criticism over their accuracy, their methodology, and the motives of those who conducted them.

Some examples of the confusion: one summer poll showed 42 percent of voters undecided about the October election; another found almost no one who had yet to make up their mind. One poll put the incumbent United National Movement (UNM) ahead by more than 20 points; another showed a one-point gap between UNM and the opposing Georgian Dream party. The polls themselves became a campaign issue, crowding out substantive concerns.

The Open Society Georgia Foundation and the Think Tank Fund supported a project to improve Georgian polling. They asked ESOMAR, the world research organization, and WAPOR, the World Association for Public Opinion Research, to appoint a panel of international experts to look at the pre-election polls and make recommendations for practitioners and poll users.

I chaired that panel, joined by another American, Professor Michael Traugott of the University of Michigan, and two Europeans, Emmanuel Riviere of TNS-SOFRES in France, and Professor Miroslawa Grabowska of Warsaw University, the head of CBOS. Professor Grabowska and I journeyed to Georgia in April to interview pollsters, journalists, and academics. 

All in all, we heard from more than 40 individuals connected to the 2012 polling, from the Georgian firms that interviewed respondents to the American advisers to the candidates. Many of those involved in public opinion research were concerned about the controversy. We heard accusations of intimidation of respondents (from both sides) and worries about whether interviewers’ own preferences affected the responses they received (for which there is evidence from countries as different as the United States and Nicaragua), and we discussed technical issues where improvements might be made.

There were a few things we noticed in the data. Although UNM (the incumbent party) was ahead in nearly all the polls, it never reached majority status unless the sizable group of undecided voters was left out of the calculations. In fact, in most polls UNM support hovered around 40 percent. While that would be sufficient to win in a multi-party election, in a two-way race that suggests problems for an incumbent.
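To see why dropping undecided voters changes the picture, here is a minimal arithmetic sketch in Python. The 25 percent undecided figure is purely illustrative, not drawn from any 2012 poll:

```python
# Illustrative only: how excluding undecided respondents inflates a party's
# apparent share. The undecided figure below is hypothetical.

def share_among_decided(party_pct: float, undecided_pct: float) -> float:
    """Recompute a party's share using only decided respondents as the base."""
    return 100.0 * party_pct / (100.0 - undecided_pct)

# A party polling 40 percent of all respondents, with 25 percent undecided,
# looks like a 53 percent "majority" once undecideds are excluded:
print(share_among_decided(40.0, 25.0))  # 53.33...
```

The same 40 percent thus reads as either a minority or a majority depending on the base used, which is the ambiguity visible in the published numbers.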

We also noted that a serious prison abuse scandal was reported two weeks before the October 1 election. In Georgia, polls are conducted in person, and a law restricts poll reporting in the final days of a campaign, so the window for polling after the scandal broke was very small. Only one company conducted public polls both before and after the scandal emerged, documenting a 16-point gain for Georgian Dream and an 18-point drop for UNM. The final estimate in this poll, while overly optimistic for the challengers, showed the impact of the scandal. The panel also learned of three private polls, not publicly reported before the election, which showed similar shifts in support.

Our report, Making Public Polling Matter in Georgia [pdf], was presented at a press conference in Tbilisi on June 19. There was widespread media interest, with coverage on five television stations, in major newspapers, and on multiple online sites. Polling organizations were asked to review their methods, to reconsider the role of the interviewer, and to poll closer to election day where possible. Most important, they were asked to adopt a code of disclosure, something that might have mitigated the uproar when the summer polls disagreed with one another: the differences were easily explained once the polls’ differing methodologies were compared.

We also recommended training to help journalists understand polls. Had journalists known what questions to ask (such as how questions were worded and who paid for the poll), there would have been less confusion and better reporting (see “20 Questions a Journalist Should Ask about Opinion Polls”).

Within a few weeks of the report’s release, donor organizations expressed interest, including potential support for training journalists in poll reporting and help with writing and publicizing a disclosure code for polling practitioners. All recognize the need for, and the urgency of, these improvements. The expert panel remains available to help with these programs, and we are hopeful that opinion polling will become an even better tool for democracy in Georgia.
