Showing posts with label Opinion Poll.

Monday, April 08, 2019

The election environment in minority areas of Georgia is getting worse

[Note: This article was published together with OC-Media. It was written by Dustin Gilbreath. The views presented in this article do not necessarily represent the views of CRRC-Georgia. The views presented in this article do not represent the views of the National Democratic Institute or any related entity.]

Post-election polling by CRRC-Georgia suggests that not only are elections most problematic in Georgia’s ethnic minority regions, they are also getting worse.

The 2018 presidential elections, and particularly the events surrounding the second round, have come to be considered a setback for Georgia’s democratic trajectory. Between the first and second rounds, it was announced that 600,000 voters would receive debt relief immediately following the elections, leading some to suggest this was a form of vote buying. A number of instances of electoral fraud were also alleged, and the use of party coordinators around election precincts was widely condemned.

Elections in minority regions have generally been of worse quality than in regions populated by ethnic Georgians. Some statistical evidence suggests irregular voting behaviour, if not outright fraud, in these regions. Moreover, these regions of the country consistently vote for whichever party is in power.

The situation appears to be getting worse, at least when compared with the parliamentary elections of 2016.

CRRC-Georgia and the National Democratic Institute’s 2016, 2017, and 2018 post-election polling asked voters, ‘Thinking back to the situation when you voted in the polling station/place (either in the 1st or 2nd round), please say whether you agree or disagree with the following?’

  • It was well ordered;
  • It was overcrowded;
  • It was intimidating;
  • The election officials were well prepared.
Respondents were also asked whether they noticed party coordinators around the polling station asking for personal information.

The results suggest that people in predominantly minority settlements were about three times more likely to report seeing party coordinators collecting personal information outside polling places. People in minority areas were 2.5 times as likely to report that the polling place was intimidating and four times more likely to report the polling station was overcrowded. They were 14 percentage points less likely to report that election officials were well prepared, and 13 percentage points less likely to report that the polling place was well ordered.


All respondents were also asked ‘Please tell us whether [each of the following] occurred or not during the election process’:
  • People voting more than once (including carousel voting);
  • Intimidation of voters or party representatives;
  • Use of administrative resources to benefit a campaign;
  • Bribing of voters;
  • Pressure to donate or not donate to certain candidate/party;
  • Mobilising state employees to participate in campaign/vote for a certain candidate.
A similar pattern holds for these questions, with respondents consistently reporting most of the above problems more often in predominantly minority settlements than in ethnic Georgian settlements. The only activity that was not reported statistically significantly more often in minority settlements than in ethnic Georgian ones was pressuring people to donate.


The data suggest that problems with elections in minority regions are on the rise. The share of individuals in predominantly minority settlements reporting that the polls were intimidating more than tripled between 2016 and 2018. The share reporting it was overcrowded more than doubled between 2016 and 2018. The share of individuals in predominantly minority settlements reporting that the election precinct was well ordered also declined between 2016 and 2017.

In predominantly ethnic Georgian settlements, there was a decline between 2017 and 2018 in terms of how well prepared election officials were perceived to be. There was also a slight decline in terms of people reporting that the polling station was overcrowded. However, there was no change in the share reporting it was well ordered or intimidating.

The 2018 elections had problems. While the conduct of elections in predominantly minority areas of Georgia has historically been problematic, these problems appear to have gotten worse, at least by comparison with the elections of 2016 and 2017.

Monday, January 16, 2017

Developing the “culture of polling” in Georgia (Part 1): Survey criticism in Georgia

[Note: This is a guest post from Natia Mestvirishvili, a Researcher at International Centre for Migration Policy Development (ICMPD) and former Senior Researcher at CRRC-Georgia. This post was co-published with the Clarion.]

Intense public debate usually accompanies the publication of survey findings in Georgia, especially when the findings are about politics. The discussions are often extremely critical or even call for the rejection of the results.

Normally, criticism of surveys would focus on shortcomings in the research process and help guide researchers towards better practices, making surveys a better tool for understanding society. In Georgia, unfortunately, most current criticism of surveys is counterproductive, driven mainly by an unwillingness to accept findings the critics do not like. This blog post outlines some features of survey criticism in Georgia and highlights the need for constructive criticism aimed at improving research practice, since such criticism is essential for developing the “culture of polling” in Georgia.

Criticism of surveys in Georgia is often triggered by discrepancies between the findings and the critics’ own beliefs about public opinion. Hence, the critics claim that the findings do not correspond to ‘reality’. Or rather, to their reality.

But are surveys meant to measure ‘reality’? For the most part, no. Rather, public opinion polls measure and report public opinion, that is, the views and opinions that people hold, which are shaped not only by perceptions but also by misperceptions. There is no ‘right’ or ‘wrong’ opinion. It is equally important that these are opinions people feel comfortable sharing during interviews, while talking to complete strangers. Consequently, and leaving aside deeply philosophical discussions about what reality is and whether it exists at all, public opinion surveys measure perceptions, not reality.

Among the many assumptions that may underlie criticism of surveys in Georgia, critics often suggest that:

  1. They know best what people around them think;
  2. What people around them think represents the opinions of the country’s entire population. 

However, both of these assumptions are wrong, because, in fact:

  1. Although people in general believe that they know others well, they don’t. Extensive psychological research shows that there are common illusions which make us think we know and understand other people better than we actually do – even when it comes to our partners and close friends;
  2. Not only does everyone have a limited choice of opinions and points of view in their immediate surroundings compared to the ‘entire’ society, but it has also been shown that people are attracted to similarity. As a result, primary social groups are composed of people who are alike. Thus, people tend to be exposed to the opinions of their peers, people who think alike. There are many points of view in other social groups that a person may never come across, not to mention understand or hold; 
  3. Even if a person has contact with a wide diversity of people, these contacts will never be enough to be representative of the entire society. And even if they were, individuals lack the ability to judge how opinions are distributed within a society.


To make an analogy, assuming the opinions we hear around us can be generalized to the entire society is very similar to zooming in on a particularly large country, like Canada, on a map of a global freedom index, and assuming that since Canada is green, i.e. rated as “Free”, the same is true for the rest of the world. In fact, if we zoom out, we will see that the world is anything but uniformly green. Rather, it is very colorful, with most countries shaded in colors other than green, and “big” Canada is no indication of the state of the rest of the world.



Source: www.freedomhouse.org

People who think that what people around them think (or, to be even more precise – who think that what they think that people around them think) can be generalized to the whole country make a similar mistake.

Instead of objective and constructive criticism based on unbiased and informed opinions and professional knowledge, public opinion polls in Georgia are mostly discussed based on emotions and personal preferences. Professional expertise is almost entirely lacking in those discussions.
Politicians citing questions from the same survey in either a negative or a positive context, depending on whether they like the results, is a good illustration of this claim. For example, positive public evaluations of a policy or development are often proudly cited by political actors without any doubt being raised about the quality of the survey. At the same time, low and/or decreasing public support for a particular party in the same survey is “explained away” by the same actors as poor data quality. Subsequently, politicians may express distrust in the research institution that conducted the survey.

In Georgia and elsewhere, survey criticism should focus on the research process and aim at its improvement, rather than rejecting the role and importance of polling. It is the duty of journalists, researchers and policymakers to foster healthy public debate on survey research. Instead of emotional messages aimed at demolishing trust in public opinion polls and pollsters in general, what is needed is rational and careful discussion of the research process and its limitations, of research findings and their meaning and significance, and, where possible, of ways to improve survey practice.

Criticism focused on “unclear” or “incorrect” methodology should be further elaborated by professionally specifying the aspects that are unclear or problematic. Research organizations in Georgia will highly appreciate criticism that asks specific questions aimed at improving the survey process. For example, does the sample design allow for the generalization of the survey results to the entire population? How were misleading questions avoided? How have the interviewers been trained and monitored to minimize bias and maximize the quality of the interviews?

This blog post argued that survey criticism in Georgia is often based on inaccurate assumptions and conveys messages that are not helpful for research organizations from the point of view of improving their practice. These messages are also often dangerous as they encourage uninformed skepticism towards survey research in general. Rather than these unhelpful messages, I call on actors to engage in constructive criticism which will contribute to the improvement of the quality of surveys in Georgia, which in turn will allow people’s voices to be brought to policymakers and their decisions to be informed by objective data.

The second part of this blog post, to be published on January 23, continues the topic, focusing on examples of misinterpretation and misuse of survey data in Georgia.

Monday, June 06, 2016

Attitudes towards public opinion polls in Georgia (Part 2)

CRRC/NDI’s public opinion polls become the subject of intense discussions after the results of every wave of the survey are released, with politicians from various political parties criticizing the polls. Such a situation, though, is not unique to Georgia. As Professor Arthur Lupia recently put it, pollsters are a “popular whipping boy in politics”, yet they also “can give people a stronger voice”. In a previous blog post, we showed that attitudes toward public opinion poll results are mixed in Georgia, with nearly equal shares of the population trusting, distrusting, and neither trusting nor distrusting the results. This blog post shows that even though public opinion polls are regularly criticized in Georgia, there is still a public demand for them. 

CRRC’s 2015 Caucasus Barometer survey asked respondents to rate the level of their agreement or disagreement with the following statements:

“Public opinion polls help all of us get better knowledge about the society we live in”;
“Ordinary people trust public opinion poll results only when they like the results”; 
“Public opinion polls can only work well in developed democratic countries, but not in countries like Georgia”;
“The government should consider the results of public opinion polls while making political decisions”;
“Politicians trust public opinion poll results only when these are favorable for them or for their party”;
“I think I understand quite well how public opinion polls are conducted”.

Those who, while answering the previous question about trust in polling results, reported that they did not know anything about public opinion polls were not asked these questions.

Two-thirds of the population agrees with the statement that the government should consider the results of public opinion polls while making decisions, and nearly half agrees that polls help everyone to better understand the society they live in.

Note: A 10-point scale was used to record answers to these questions. On the original scale, code ‘1’ corresponded to the option “Completely disagree” and code ‘10’ corresponded to the option “Completely agree”. For the charts in this blog post, the answers were grouped as follows: codes ‘1’ through ‘4’ were labeled “Disagree”; codes ‘5’ and ‘6’ were labeled “Neutral”; codes ‘7’ through ‘10’ were labeled “Agree”. Options “Don’t know” and “Refuse to answer” aren’t shown on the charts.
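For readers who work with the raw data, the recoding described in the note above is straightforward to reproduce. The following is a minimal sketch in Python/pandas; the column name `q_agree` is hypothetical, standing in for any of the 10-point agreement items:

```python
import pandas as pd

# Hypothetical responses on the original 1-10 agreement scale
df = pd.DataFrame({"q_agree": [1, 3, 5, 6, 7, 10]})

# Group codes 1-4 as "Disagree", 5-6 as "Neutral", 7-10 as "Agree"
df["q_agree_grouped"] = pd.cut(
    df["q_agree"],
    bins=[0, 4, 6, 10],
    labels=["Disagree", "Neutral", "Agree"],
)

print(df["q_agree_grouped"].tolist())
# ['Disagree', 'Disagree', 'Neutral', 'Neutral', 'Agree', 'Agree']
```

`pd.cut` uses right-closed intervals by default, so bins of [0, 4, 6, 10] yield exactly the 1–4, 5–6, and 7–10 groupings used in the charts.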

The share of the population who disagree with the statement that “polls can only work in developed democratic countries, but not in countries like Georgia” is almost twice as large as the share of those who agree with this statement.


At the same time, people don’t feel they have a good knowledge of how public opinion polls are conducted. Only 36% report believing they have a good understanding of it. 45% also report that ordinary people trust the results of public opinion polls only when they like them, and 62% report the same in the case of politicians. Increasing knowledge of and trust in polls are clear challenges for pollsters in Georgia.

Whether people trust them or not, polls are important for society, and the results presented in this blog post show that people do acknowledge this importance. Polls help everyone grasp what society thinks, and the majority of the population thinks the government should consider poll results when making decisions.

To learn more about public opinion polls, take a look at earlier blog posts, including Attitudes toward public opinion polls in Georgia, Ask CRRC | Survey vs Census, and Pre-Election Polls | what would be needed. To learn more about how CRRC collects data, take a look at this video or read CRRC-Georgia’s Research Guidelines.




Monday, April 11, 2016

Attitudes toward public opinion polls in Georgia

In his book Polling and the Public Herbert Asher notes that findings of public opinion polls have significant effects on citizens’ attitudes and behavior. This is clearly true in Georgia where public opinion polls (especially those focused on political attitudes) are widely discussed by politicians, experts, and the media. Using CRRC’s 2015 Caucasus Barometer (CB) data, this blog post examines attitudes towards public opinion polls in Georgia.

Generally, the public’s trust in the results of public opinion polls is mixed in Georgia. One-third of the population reports trusting poll results, another third reports a neutral attitude, and 21% reports distrusting them. A small share of the population either does not know anything about the polls, answers “Don’t know” or refuses to answer this question.


Note: A 10-point scale was used to record answers to the question: “Generally speaking, to what extent would you say you trust or distrust the results of public opinion polls conducted in our country?” On the original scale, code ‘1’ corresponded to the option “Do not trust at all” and code ‘10’ corresponded to the option “Completely trust”. For this blog post, the answers were grouped as follows: codes ‘1’ through ‘4’ were labeled as “Distrust”; codes ‘5’ and ‘6’ were labeled as “[In the middle]”; codes ‘7’ through ‘10’ were labeled as “Trust”. Options “Don’t know” and “Refuse to answer” were combined. 

Reported trust in the results of public opinion polls varies in different demographic groups. Tbilisi residents tend to report slightly higher trust compared to those living in other urban and rural settlements. Those who are younger (18 to 35 years old) also report higher trust than those who are 56 years old or older. A slightly greater share of those with higher than secondary education reports trusting poll results compared to those with secondary or lower education.


Note: Only shares of those who reported trusting public opinion poll results are shown in the chart. The answer options for the question on education level were grouped as follows: options “No primary education”, “Primary education (either complete or incomplete)”, “Incomplete secondary education” and “Completed secondary education” were grouped into “Secondary or lower”. Options “Incomplete higher education”, “Completed higher education (BA, MA, or specialist degree)” and “Post-graduate degree” were grouped into “Higher than secondary”.

Interestingly, nearly half (46%) of those who report trusting the media also report trusting poll results, and the correlation between answers to these two questions is statistically significant. By comparison, only a quarter (26%) of those who distrust the media report trusting poll results.


Note: A 5-point scale was used to record answers to the question, “Please tell me how much do you trust or distrust Georgia’s media?” For this blog post, answer options "Fully trust" and "Rather trust" were combined into "Trust media"; ”Rather distrust" and "Fully distrust" were combined into "Distrust media". Options "Don't know" and "Refuse to answer" are not shown on the chart.

Attitudes toward public opinion poll results in Georgia are mixed, and nearly equal shares of the population trust, distrust or neither trust nor distrust the results. There are, however, some differences between those living in different settlement types, as well as between representatives of different age groups, and those having different levels of education. Generally, those who report trusting the media tend to trust the results of public opinion polls.

To learn more about public opinion polls, take a look at earlier blog posts, including Ask CRRC | Survey vs Census and Pre-Election Polls | what would be needed. To learn more about how CRRC collects data, take a look at this video or read CRRC-Georgia’s Research Guidelines.

Saturday, February 18, 2012

Leaving Thoughts by British Political Officer in Georgia

David Gale, who had served as Political Officer at the British Embassy since 2007, recently wrote down some of his thoughts upon leaving Georgia, after covering a turbulent time. It was refreshing to read a direct and evenhanded take on a number of issues, from a diplomat who has been following events very closely.

One aspect we especially liked in David's reflections is that he repeatedly highlights polling, as a way of understanding the preferences of the Georgian electorate. To read David's thoughts click here.

Thursday, June 09, 2011

What’s behind the May 2011 protests in Georgia?

There have been three main protests since 2007 demanding that the Georgian president, Mikheil Saakashvili, step down. These three protests have been described in detail in international and qualified local coverage, such as that on EurasiaNet and Civil.ge. So what, from a research point of view, are the main differences between the protests of 2007, 2009, and 2011?

There are a number of striking differences. First, the most recent protest was the least attended. Reuters reported that around 10,000 people protested at the peak of the May 2011 protests. According to CRRC estimates, subsequently picked up widely by the media, up to 60,000 people protested in 2009, and between 50,000 and 100,000 people participated in the 2007 protests.

Second, this decline in protest numbers is mirrored by a change in national sentiment about the direction in which Georgian politics is going. A March 2011 survey conducted by CRRC for the National Democratic Institute (NDI) shows that the percentage of people who say Georgian politics is going in the wrong direction has diminished since 2007.


In 2007, 40% of respondents thought that politics in Georgia was going mainly or definitely in the wrong direction, whereas in March 2011 that figure was 19%. In other words, since 2007 the share of people who think Georgia is going in the wrong direction has more than halved. As the slide shows, many more people now also think that Georgia is definitely going in the right direction.

Third, several media outlets have highlighted that the majority of protesters in May 2011 seemed to be above 50 years old. As Koba Turmanidze, CRRC Georgia Country Director, notes, the results from a 2011 CRRC media survey show that older people (who are not retired) continue to be less employed than younger people, and are less happy with their own and Georgia’s economic conditions than younger people. Their unhappiness may also reflect that they have been less engaged in post-Rose Revolution Georgia.

Moreover, results from the 2010 CB also show that the poorer a person considers the economic level of his/her household to be compared to most of the households around them, the more they agree that people should participate in protest actions against the government since the people should be in charge.

2010 CB-Georgia. Note: The numbers do not sum to 100% because the ‘don’t know’ and ‘refuse to answer’ categories have been removed. Few people characterize their own economic condition as “very good”, so those results are less representative. More details on request.


This may be important in a country where 91% of the population considers rising prices to be a worse problem than in 2008.

Next to the differences that we have identified, do you think there was anything that made the 2011 protests different to previous waves of demonstrations? (Note that you can also do some analysis yourself on our data interface ODA).

Friday, January 29, 2010

Reporting Data in the Media

This was recently on PhD Comics. So true.




What would one add for the Caucasus? Any ideas? We should translate this and circulate it.

Monday, September 07, 2009

"Is Georgia a Democracy?" | Recent Publication

Is Georgia a democracy? In previous blog posts we tracked various indicators, including the Freedom House Index. But what do Georgians themselves think?

Koba Turmanidze, Director of CRRC Georgia, and Hans Gutbrod from the CRRC Regional Office have written a short chapter discussing poll findings on this question. It is part of a broader publication by the Foreign Policy Centre, a UK Think Tank.


The publication also includes essays by Peter Semneby (EU Special Representative), Giorgi Gogia (Human Rights Watch) and Giorgi Chkheidze (Georgian Young Lawyers/Ombudsman's office). It also has fascinating electoral maps that we mashed up for NDI.



To read, click here.

Monday, August 17, 2009

ECFR report: Befuddling data

Public opinion found its way into a major report by the European Council on Foreign Relations (ECFR), but through the back door. The chart below, from page 28 of the report, appears to compare support for integration with Russia/CIS versus EU integration in the six EU “neighborhood” states.


But the footnote reveals that this data is a pastiche from a number of national opinion surveys that asked questions about attitudes toward EU integration. This type of data presentation can lead us astray, for a few reasons:

1. Comparability: A footnote claims that the variously worded questions are nevertheless “roughly comparable.” However, the subjects of the questions range from actual political integration, to foreign policy alignment, to “strategic partnership.” Some are concrete (“If a referendum were held next Sunday…”), others abstract (“With which of the following does Armenia’s future most lie?”). The form of the questions also varies: some explicitly offer a choice between Russia and the EU, others probe attitudes about the EU alone, and still others offer unspecified options for partnership. More fundamentally, many respondents in the “neighborhood” countries may not believe that EU integration is actually a feasible option. Asking about preferences for integration with the CIS versus the EU is meaningful only if people feel this is a realistic choice.

2. Unknown Sources: There is no indication of the survey sources. Even if the data itself is of high quality, the methodology certainly differed across the polls. Timing is a particular concern. Commendably, the authors of the report do note that the survey in Georgia was carried out in 2007, while the others were conducted in 2008. But political events at various points during those years (the Russia-Georgia conflict, the economic crisis, gas disputes between Russia and Ukraine, among others) could have influenced responses.

3. Presentation: For some countries, the combined responses total nearly 100% (Belarus, Ukraine); for others, the total is far less (Georgia, at around 50%) or far more (Moldova, at 120%). Presumably this reflects the different types of questions asked, or possibly missing values. But the chart does not tell us which responses account for these discrepancies.

This kind of data presentation is a little disconcerting. Although it is very encouraging to see public opinion data in a major report, one would wish for a slightly more cautious presentation. To be able to draw powerful conclusions, a more consistent approach to gathering the data would be required.

In the coming days, we’ll put up a follow-up post presenting CRRC’s data on attitudes toward cooperation with EU and Russia in the South Caucasus countries.

Friday, October 03, 2008

Polling Data on Turkish-Armenian Bilateral Relations

Recently, as a result of the football diplomacy between Armenia and Turkey, an opinion poll was conducted in both Turkey and Armenia to gauge reactions to the new gestures in the Turkish-Armenian relationship. The poll was carried out by MetroPoll in Ankara (Turkish-only website) and by the Armenian Center for National and International Studies, run by Raffi Hovannisian, an American diaspora Armenian now resident in Yerevan and involved in Armenian politics.

Unfortunately, neither the original questions nor the sample sizes are available online. However, the findings are indicative of the opinions of the winning and losing countries (Turkey the winner, Armenia the loser).

In Turkey, almost 70 percent of the population found the Turkish president Abdullah Gül's trip successful and presumably supported the normalization of relations with Armenia. What would have been more interesting to ask, however, is how Turks view the importance of normalizing relations with Armenia. I would hypothesize that the majority of Turks, particularly those living far from Eastern Anatolia, do not see the current position as hurting their economic interests and do not see the issue as vital, particularly if it would require any change in Turkey's stance on the genocide issue. With Armenia's limited purchasing power, Turkey stands to gain little economically from opening its border. Furthermore, Turkey already exports to Armenia through Georgia, and it is presumably Armenia that pays the higher costs for goods, not Turkey.

Interest in Armenia may be more pronounced among Turks who live in Kars and other settlements bordering Armenia. However, while these places stand to gain the most from cross-border trade, they may also have much stronger feelings about how the opening of the border would affect their lives, and potential worries about attempts by Armenians to reclaim or purchase property in the area.

Given the deep and continuing melancholy that permeates much of Armenian society's consciousness as a result of the slaughter and expulsion of hundreds of thousands of Armenians from Eastern Turkey, and the central role that the genocide plays in Armenian political culture, Armenians show much more skepticism towards normalized relations with Turkey, though the news is not all bad. Only 11 percent of respondents said they were against all cooperation with Turkey, albeit 76 percent were willing to normalize relations only after certain preconditions were met. Ostensibly, these preconditions revolve around recognition of the Armenian genocide.

However, we would expect that more thorough probing of Armenian citizens' perceptions might reveal a more nuanced understanding of the policy trade-offs involved in preconditions. Many more Armenians may be willing to engage in some compromise if it meant more sustainable economic growth. Unlike Turkey, Armenia stands to reap large economic benefits from the opening of the border with Turkey. Transport costs would drop significantly for the many Turkish products that already wend their way through Georgia to Armenia; moreover, Armenia would have a readier export market for the finished goods it produces, particularly if the "Caucasian Tiger" becomes more of a reality than a simulacrum.

Whatever the future for relations between Turkey and Armenia may hold, it is important to continue to provide open and reliable data on the process.

Friday, September 05, 2008

Georgia Post-Conflict Phone Survey | maybe a first glance?

The Georgian firm IPResearch (the first time we have heard of them, actually) conducted a phone poll between 25 August and 2 September. 450 respondents were questioned countrywide. While we have strong reservations about such telephone polls (they are biased towards people who have phones, pick up calls from strangers, and are bored enough to chat), they may serve as a preliminary indication. Here are the results:

1. Was the international community active in stopping the Russian aggression?

Yes 73.3%
No 24%
NA 2.7%


2. Which state was the most active in stopping Russian aggression?

USA 64.2%
France 20.9%
Poland 4.9%
Baltic states 2.2%
Germany 1.6%
UK 1.1%
Ukraine 0.7%
None 1.3%
NA 3.1%

3. Which organization was the most active in stopping Russian aggression?
EU 26.7%
UN 17.6%
NATO 14%
CoE 3%
OSCE 3.1%
All 5.1%
None 4.0%
NA 26.2%

4. Which is the most positive politician in stopping Russian aggression?
Nicolas Sarkozy 35.6%
George W. Bush 17.8%
Condoleezza Rice 12.0%
Angela Merkel 7.8%
John McCain 5.6%
Lech Kaczynski 4.2%
Bernard Kouchner 3.3%
Matt Bryza 1.8%
David Miliband 1.6%
Barack Obama 0.9%
Other 2.8%
None 1.3%
NA 5.3%


5. Which is the most friendly country for Georgia?
USA 72.0%
France 42.9%
Ukraine 20.9%
Germany 13.6%
Poland 12.0%
Baltic states 11.6%
UK 7.8%
Turkey 2.2%
Israel 0.7%
Azerbaijan 0.4%
Other 0.6%
None 1.6%
NA 8.9%

(Respondents were asked to name a maximum of two states, so the totals exceed 100%)

6. Do you think that the Georgian government could have avoided Russian aggression?

Yes 42.4%
No 46.7%
NA 10.9%

7. Will Georgia receive NATO MAP in the near future?
Yes 80.9%
No 6.9%
NA 12.2%


Perhaps the most interesting part is that more than 40% of respondents think the Georgian government could have avoided the confrontation. This means the country might very well be split, with a lot of people thinking that what happened was a big mistake. But, as mentioned, we cannot vouch for the quality of this survey.
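A quick back-of-the-envelope check helps put such a small phone poll in perspective. The sketch below computes a naive 95% margin of error for the 42.4% "yes" share among 450 respondents, assuming (generously, since a phone poll of this kind is almost certainly not a random sample) simple random sampling; the function name is ours, not IPResearch's:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Naive 95% margin of error for a proportion under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(0.424, 450)
print(f"+/- {moe * 100:.1f} percentage points")  # roughly +/- 4.6 points
```

Even under that generous assumption, the 42.4% "yes" and 46.7% "no" shares each carry a margin of roughly ±4.6 percentage points, so the poll cannot distinguish an evenly divided public from a modest majority on either side.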

It will be interesting to examine whether indeed so many people are critical, and who precisely they are. We are planning to conduct our own survey in the next few weeks. If you want to suggest questions that you think should be asked, let us know as soon as possible.

Monday, June 30, 2008

European Cup Craze : Who Supports Whom in the Caucasus?

Given the recent craze over UEFA football and the large number of diehard football fans across the Caucasus, I think the question about the politics of support is worth addressing. It can provide interesting insights into both cultural and political affinities -- much like Eurovision support -- except with a different demographic. We have limited information here, so the blog cries out for help!

Georgians, fans in the titan of South Caucasus football, will only support teams with a tried and true record of success, according to one colleague who is a football maven. In practice, this means Georgians generally support Germany. Historical connections and affinities also play an important role here.

But what about Russia facing off against Spain? Here the jury is more mixed. According to one Georgian television poll (which isn't the best source, but it's all we have), 45% of Georgians supported Russia in the Russia-Spain game. If true, this, like our Data Initiative data, shows that Georgians are still very open to Russians, even if they may have strong feelings against the Russian government.



As for Azerbaijan, the picture is clear. It's Turkey all the way (see picture above taken in Baku). But what happens when Turkey loses to Spain? I don't have the answer, but I encourage our Azerbaijani readers to chime in.



Armenians, well, I don't have any info here. Please comment.

Thursday, June 12, 2008

Georgia post-Election Phone Survey | Quick Review

Yet another survey has been sent around as a PDF in Georgia. The survey attempted to measure the post-election mood in Tbilisi. According to the information provided in the PDF, 503 respondents were randomly selected and interviewed by telephone. According to the results, 46.92% of respondents say they "fully disagree with the announced results of the 2008 parliamentary elections", while 25.65% say they totally agree with the announced results. We have been asked to comment, and some of the things we have to say will sound pretty obvious.

First, Tbilisi itself is a biased sample. All of the survey work so far has shown that Tbilisi residents are more skeptical of the government than residents of any other part of the country. While the views of Tbilisi residents may matter in terms of who will demonstrate on the streets, they are not an accurate assessment of the mood of the country. Moreover, and this is a standard objection even if it is not familiar to people who don't know much about surveys, telephone surveys have limited reliability. You will only reach people who have a telephone, are at home (or willing to pick up calls from random strangers on their mobile), and are willing to trust the interviewer and take some time to talk. Each of these qualifications serves as a filter, and the people who actually respond may well not be representative.

As the saying goes, telephone surveys only capture the views of those who are bored at home. That is fine for exploring some issues ("What television programming would you like to see?"), but not for measuring attitudes towards complex political questions, especially not in Georgia, where there is little experience in weighting to counter this implicit bias.
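To illustrate what such weighting would involve, here is a minimal post-stratification sketch. All the numbers are hypothetical (invented group shares and approval rates, not from any real survey): the idea is simply that each group's answers are reweighted to match known population shares.

```python
# Hypothetical illustration: a phone sample over-represents urban
# respondents. Post-stratification reweights each group so the
# weighted estimate matches known population shares.
sample = {"urban": {"n": 350, "approve": 0.30},
          "rural": {"n": 150, "approve": 0.60}}
population_share = {"urban": 0.45, "rural": 0.55}  # assumed census figures

total_n = sum(g["n"] for g in sample.values())

# Naive estimate: just pool the sample as collected.
unweighted = sum(g["n"] * g["approve"] for g in sample.values()) / total_n

# Weighted estimate: each group counts by its population share instead.
weighted = sum(population_share[k] * g["approve"] for k, g in sample.items())

print(round(unweighted, 3))  # 0.39  (dragged down by the urban-heavy sample)
print(round(weighted, 3))    # 0.465 (after matching population shares)
```

Of course, weighting can only correct for characteristics you measure and have reliable population figures for; it cannot fix the deeper problem that people without phones never enter the sample at all.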

Then again, you could say that surveys like this only attempt to capture the general mood, and that in the absence of funding for face-to-face interviews across the country, such a survey still gives us a quick glimpse. There is some merit to this view, but the problem is that reporting the views of 46.92%, with the precision of 0.92% tacked on, suggests a level of representativeness that this survey simply does not have.
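For a sense of scale: even under ideal simple random sampling (which a phone poll is not), a sample of 503 carries a margin of error of more than four percentage points, so two decimal places of precision are meaningless. A quick back-of-the-envelope check:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# With n = 503, even an ideal random sample is only good to about
# +/- 4.4 percentage points, so "46.92%" really means "somewhere
# between roughly 42% and 51%".
print(round(100 * margin_of_error(503), 1))  # 4.4
```

And that is the best case; the self-selection filters described above would widen the real uncertainty further.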

And there's another problem with the question: what exactly does "agreeing with the announced results of the 2008 parliamentary elections" mean? Did respondents like the results? Even if you voted for the winning party, you could fundamentally "disagree" with the results, since, knowing the final results, you would have preferred a lot more diversity in parliament. We ultimately just don't know what precisely "agreeing" means here.

Still, on one issue the telephone respondents seem to have been fairly prescient, at least as things have developed so far:


That's pretty much what happened in the meantime.

Tuesday, May 27, 2008

What do Georgian Troops Think about the Iraq War?

Recently, the Georgian Times published an article on a poll conducted by GORBI among Georgian troops in Iraq. According to the article, this is the first poll conducted amongst these soldiers.

The article highlights that troops continue to have very positive feelings towards their tenure in Iraq and agree with Saakashvili sending them there. According to the poll, 89% are satisfied with the current conditions and 93% are satisfied with their training.

This data may point to the professionalization of the Georgian army. Interestingly, according to a poll done by the Military Times (we have no way to rate its quality) at the end of 2007, 80% of American troops still say they are "somewhat" or "completely" satisfied with their jobs. This is despite the fact that more than half of troops now believe that America should not have gone to war.

Georgian citizens, however, like American citizens and unlike the Georgian troops, generally do not support troop involvement in Iraq, and often voice cynical views of the "America is buying cheap Georgian cannon fodder" type.

The article, however, opens several interesting research questions.

  • Why do Georgian troops have such a positive attitude towards serving in Iraq? I think there may be several unexpected answers to this question, which involve exposure to different troops (i.e. Americans and Brits) and the benefits and salaries these soldiers receive compared to what they receive back home. Or maybe they just feel that they are serving a useful purpose. Feedback welcome.
  • Interestingly, the questionnaire used by GORBI was a self-completion questionnaire. Our experience is that this type of questionnaire works poorly in Georgia, as there is no tradition of filling them out. We wonder how this worked with the Georgian troops.
If the GORBI dataset were publicly available, interesting analyses could be done on it.

Tuesday, May 06, 2008

Diversity Polling on the Caucasus | Ask500

Sometimes it's worth clicking on those Gmail links. "Ask 500" is a website in beta, the web version of a straw poll. Polling? Surveys? Obviously I wanted to know more. To say it up front: it's about as unrepresentative as you can get, since it assembles those who suffer from terminal curiosity.

Playing around with it and discovering that so far this is still a small community, I posted a question on people's feelings about the Caucasus. I wanted to know whether people have positive associations (mythical, attractive) or rather negative ones (messy, dangerous, uncomfortable), with some options in between. I also wanted to see whether this question would go anywhere, or whether tabloid interests would prevail.

It certainly is an attractive interface for seeing where the votes are:


Also, the comments function is particularly useful, a sort of focus group of the electronically vociferous.

Ask500 could become incredibly powerful in doing a quick review of an idea, checking it for mistakes. Put more succinctly, where a diversity of viewpoints is more important than representativeness, this approach could be a BIG THING (maybe not THE, but certainly A). It's interesting to see how the founders explicitly invoke Surowiecki's The Wisdom of Crowds as a starting point for their work.

What never ceases to amaze me is how technology DOES flatten the world. An instrument can still be under development in the US, and as long as you have an Internet connection, you already can take part. (Obviously, turning electronic into economic opportunity to alleviate poverty is a very different challenge. Unless you are a programmer.)

In the meantime, check Ask500 to see how responses to the Caucasus develop while the poll is open. Note that the Vote button is quite small: top right.

Saturday, May 03, 2008

Exit Polls | Take Two

Readers may recall that we voiced some concern with regards to exit polls. Here is a fascinating first-hand account by a reputed pollster of what they describe as an "Adventure in Baku". It is a salutary tale, and again shows that exit polls are not the quick fix they are often believed to be, even when organisations such as Mitofsky International, which bring extraordinary experience, get involved. As the authors conclude:

"One should never go through an experience like this without taking away something for the future. The number one lesson here is that public polling is difficult to do for organizations other than the media and for organizations that have a long history of publication of survey results, regardless of the direction of the findings. This criterion is met in the United States by foundations that sponsor polls, many government agencies, and private companies. However, if one chooses to work, as we did, for organizations with no known record for open availability of the survey findings, caveat emptor."
Well, OK. And who outside the US (and in a transition context) meets these criteria? At a very minimum, the old virtues of total transparency are critical for getting it right. But even then, huge challenges remain that cast serious doubt on the accuracy of any such enterprise, especially in a really competitive environment. Who in their right mind would have serious confidence in nuanced district-level results, given the extensive problems described?

Highly recommended (and entertaining) reading; you can find the article here.

Tuesday, December 18, 2007

Pre-Election Polls | what would be needed

With the election in Georgia approaching fast, polls are beginning to appear every week. Unfortunately, many of these polls are taken at face value. The reality is that at this point not a single pre-election poll has demonstrated credibility. This does not necessarily mean that polling firms and newspapers are simply fabricating their data; it means that if they were, it would be very difficult for anyone to know.

So how can we be confident that a poll is credible? There are a number of basic stipulations:

1. Reveal the sampling methodology. How, in other words, do the pollsters ensure that interviewing a few thousand people is representative of the entire electorate? Choosing respondents requires a) knowing where most people live, and b) having a very strong theory about which people are likely to turn out to vote on election day. This is very difficult stuff, and even tiny errors here can have tremendous consequences.

2. Tell us about the field work. Were the interviews done face to face or by telephone? When and how? Did the survey enumerators explain who they were working for, and is it possible that the respondents knew that they were looking for certain answers?

3. Publish the questionnaire. What exactly was asked, how, and in what sequence?

4. Document the non-response rate. How many people refused to answer? There are plenty of people who don't pick up the phone, or who don't have 30 minutes to talk to pollsters...and in this country, many of those people will vote.

5. Allow peer review. PowerPoint presentations for nonspecialists are fine, but make the data set available to peers for professional scrutiny (and of course you can restrict usage). If you really are confident in what you're doing, this is the way to go.

If polls do not meet these standards, they really do not deserve to be taken seriously.

Too many commentators forget that the burden of proof is on the polling firms, not on the public. We seem to be entering a dangerous cycle, where there is a lot of awful information floating around and no one has the ability to sort the good from the bad. This is as much a problem with what the public is demanding as with what the firms are supplying. The public should beware, and commentators should be very cautious about taking firms' PowerPoint slides at face value until some basic methodological questions are answered transparently.

Monday, December 03, 2007

Exit Polls | a good idea?

With upcoming elections in Georgia, the attention is back on a theme that otherwise often gets neglected: what does the Georgian electorate want?

One of the ideas is to conduct an exit poll, to track the scale of any potential manipulation. You ask a representative sample coming out of the polling station who they voted for, and that should give you a good idea about the electoral results.

According to several people, the Georgian government would very much like such an exit poll. One reason, it is said, is that at least one opposition candidate is considering financing his own exit poll, and a large exit poll supported by international donors may counterbalance any biased results paid for by a single candidate.

At face value, this seems like an attractive idea and it has a number of supporters. After all, all you're doing is triangulating, helping to verify what actually happened on E-Day and more information always seems better than less.

But where trust in the election administration is limited, the risks of exit polls far outweigh any potential benefit. As the head of one organization working in the elections field pointed out, it would "be like fighting fire with fire".

Voters exiting a polling station may not tell the truth about whom they voted for. There can be various reasons for this: the social acceptability of their choice, fear for their jobs, the first impression the interviewer makes, or plain intimidation.

Exit polls can therefore easily be off by 5% or more. Now imagine one candidate legitimately wins the first round with 52%, but the exit polls show only 47% support because of such bias or skewed sampling. The opposition will believe that the election has been stolen, although the true result was simply within the poll's margin of error.
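The mechanics are easy to simulate. In the sketch below, the numbers are assumed purely for illustration: true support of 52% and a 10% chance that any given supporter conceals their vote. The measured share lands near 47% no matter how large the sample is, because systematic misreporting, unlike sampling error, does not shrink as the sample grows:

```python
import random

def exit_poll(true_support=0.52, misreport=0.10, n=2000, seed=1):
    """Simulate an exit poll in which a fraction of the winner's
    voters conceal their vote (social desirability, intimidation)."""
    rng = random.Random(seed)
    said_yes = 0
    for _ in range(n):
        voted_yes = rng.random() < true_support
        # Some genuine supporters report the opposite choice.
        if voted_yes and rng.random() >= misreport:
            said_yes += 1
    return said_yes / n

# Expected measured share is 0.52 * 0.9 = 0.468, i.e. below 50%,
# even though the candidate actually won the round.
print(round(exit_poll(n=20000), 3))
```

A bigger or better-drawn sample fixes none of this; only reducing the misreporting itself would.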

Ultimately there is no substitute for a regular, disciplined conduct of elections, with citizens actively participating to guard their own vote. If polls were an alternative, there would be no need for the entire elections rigmarole.

Friday, May 11, 2007

Armenian Election Polling

Pre-election polling has become an increasingly big business in the South Caucasus. The Armenian elections, scheduled for May 12, again illustrate this. Much of the polling appears like a quick job with little attention to scholarly rigor. However, the results from these polls are often presented as gospel, particularly in local media outlets. The article quoted below was put on the wire by ARKA, an Armenian news agency.

The findings themselves could be interesting. But there are many problems in talking about "Armenia's population", and these data should be taken with more than a grain of salt. I personally would like to know what "excessively tense" means and what this actually tells us about the election. As a rule, it would be great if journalists asked who funded this research.

43.2% OF ARMENIA'S POPULATION ESTIMATE ELECTION CAMPAIGN AS EXCESSIVELY TENSE | 18 April 2007, ARKA News Agency (Armenia), Yerevan

43.2% of Armenia's population estimate the election campaign as excessively tense, said Director of Independent Sociological Centre "Sociometer" Aharon Adibekyan, when introducing the results of the sociological research. He said that 34% of the respondents think that the political propaganda is conducted in the usual regime without deviations. According to the survey, 9.7% of the electorate thinks that the propaganda is conducted coarsely and importunately, and 8.3% - lower of the moral norms.

The survey was conducted in 19 big cities of Armenia and 8 Yerevan communities. The total number of respondents made 1,500, statistical error is not more than 1%. Centre "Sociometer" intends conducting three more sociological surveys on the parliamentary elections in Armenia - in Yerevan, in the rural regions of Armenia and the final survey, including the voters throughout the country.
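As an aside, the claimed "statistical error is not more than 1%" does not square with a sample of 1,500. A quick check, assuming simple random sampling (a clustered city sample like this one would have a larger design effect, making the real margin wider still):

```python
import math

# 95% margin of error at p = 0.5 with the claimed n = 1,500:
moe = 1.96 * math.sqrt(0.5 * 0.5 / 1500)
print(round(100 * moe, 1))  # 2.5 percentage points, not 1

# Sample size actually needed for a true +/- 1 point margin:
n_needed = (1.96 / 0.01) ** 2 * 0.25
print(round(n_needed))  # 9604
```

So the headline "43.2%" should be read as roughly "40% to 46%", which is rather less precise than the agency's framing suggests.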

Local media would probably increase their authority if they contextualised data for their readers. EurasiaNet carries a comprehensive article highlighting the lack of professional polling, and contrasting it with widespread apathy. Surely that apathy in part is also a result of the lack of any reliable information.

Friday, March 23, 2007

IRI Release New Georgian Poll | Questions and Highlights

IRI has released a new poll chock full of data (to see all their polls, click here). They interviewed 1500 Georgian voters over the age of 18 in February 2007. This yields an image of political developments, but also provides suggestions to political parties on what the electorate cares about.

We have some misgivings on this poll as a research tool (see below), but let me present some brief highlights.

Conflict


  • Relations with the CIS remain cold. 60% think that Georgia should not remain a member of the organization.
  • Optimism about resolving the armed conflicts has dropped significantly. In the recent poll, 32% of respondents don't know if South Ossetia will return to Georgian control, and 17% believe it will take more than six years. In presumably the same poll done in 2004, 21% thought that South Ossetia would return to Georgian control within a year; now only 5% believe this will occur.
  • Over 90% still oppose using force in resolving the separatist conflicts.

Politics

  • In data supported by CRRC’s Data Initiative, trust in the judicial system remains low, according to the poll. After unemployment, the judiciary is the field most in need of reform, according to respondents. Additionally, 78% of respondents are not satisfied with the Girgvliani court decision.
  • Other things being equal, 50% of voters would choose a male candidate over a female, whereas only 6% would prefer the female (42% claiming indifference). 8% believe that women have too much power in Georgia (we'd love to see what these 8% think about other issues; are they the male part of the 16% who would like to see a return of constitutional monarchy, for example?).
  • 42% of respondents believe abortion should be made illegal, while 38% believe it shouldn't. (Again, we'd like to see how that breaks down into male and female respondents.)
  • In an interesting statistic about participatory democracy, 59% of respondents did not know the name of the majoritarian MP of their rayon. Nevertheless, half thought that their MP was doing a bad job (whoever that person is). If IRI made the data set publicly available, lots of interesting comparisons could be made here. What is the relationship between knowledge of the majoritarian MP and judgment of their job? This, among other indicators, could show how effectively politicians make themselves stand out as individuals. Another reason to promote open data sources!
  • In a statistic that may break with impressions created by public discussion, 56% of respondents support the new statue of St. George on Freedom Square. Again, the breakdown of who these supporters are would be interesting to know, as I have a feeling it correlates with other attitudes.
  • 37% believe that many people are afraid to openly express their views. This number has been climbing slowly but steadily from a baseline of 22% in October 2004. Only 8% believe that the government fully respects citizens' human rights. At the same time, 48% believe the country is developing in the right direction, up from 39% in April 2006.
  • Unsurprisingly, unemployment and relations with Russia were seen as the government's biggest failures, while electricity and road paving were its biggest successes. Respondents also thought that unemployment is the largest problem Georgia is facing.
  • Impressively, only 2% reported having to pay a bribe in the last 12 months to get a service or decision, and 78% believed that the criminal situation had improved (showing a consistent trend over the last few years).

Data, data, data!!!

For this survey to become a research tool, it would be desirable if IRI would

  • make the raw data set available, allowing researchers to look for correlations;
  • tell us about sampling, and non-response, to give a better understanding what type of data we are looking at;
  • release the Georgian questionnaire, so that one can find out exactly what was asked.
It would be desirable if such basic transparency requirements became a standard for surveys financed by international donors.