Showing posts with label Crowdsourcing. Show all posts

Sunday, October 17, 2010

Crowdsourcing: Lessons Learned

We have previously posted on some of our crowdsourcing work, also here. This may seem like a niche interest, but it is part of our broader approach: making sure that the voices of ordinary citizens in the Caucasus are heard. Surveys are one way of doing that; crowdsourcing is another tool that we are working with.

On October 13, we presented some of the lessons we learned at a conference on Social Media, at the Frontline Club in Georgia.



The video quality is not great, and we spoke to the crowd rather than the camera, but you will get the idea. We summarize the six main lessons we drew from the project, and here is the link to the presentation. Yes, they are pretty obvious in retrospect, but not all of them were so clear to us at the time. The talk, given primarily by Jonne Catshoek, starts at 1:16:00. If you want to hear more, let us know.

Monday, May 31, 2010

SMS Survey | First Insights

So! Our SMS project worked quite well. Critical to its success was systematic error control early in the day. Our interviewers still made a fair number of mistakes in the early morning: it was the first time we had used this system, and transferring responses correctly to SMS requires significant attention to detail.



Whenever the system flagged mistakes, we called the interviewer to check, and implicitly to remind them to get it right the next time. The entire CRRC team took turns, with two colleagues per shift, in addition to the other people in our E-Day room (the cheerful morning group pictured above). Note that we chose codes that made transposition errors unlikely: since "Yes" was 1 and "No" was 3, you'd have to reach across the keypad to get that wrong.
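The keypad reasoning behind that code choice can be sketched in a few lines. This is a hypothetical illustration (the keypad table and helper name are ours, not part of the actual system); it only mirrors the post's choice of "Yes" = 1 and "No" = 3:

```python
# Hypothetical sketch: checking that two answer codes sit far apart on a
# standard phone keypad, so a slip onto a neighbouring key is unlikely
# to turn one valid answer into the other.
KEYPAD = {  # (row, col) positions on a standard 3x4 phone keypad
    "1": (0, 0), "2": (0, 1), "3": (0, 2),
    "4": (1, 0), "5": (1, 1), "6": (1, 2),
    "7": (2, 0), "8": (2, 1), "9": (2, 2),
    "0": (3, 1),
}

def key_distance(a, b):
    """Chebyshev distance between two keys; adjacent keys are 1 apart."""
    (r1, c1), (r2, c2) = KEYPAD[a], KEYPAD[b]
    return max(abs(r1 - r2), abs(c1 - c2))

# "Yes" = 1 and "No" = 3 are two columns apart; 1 and 2 would have been
# risky neighbours.
assert key_distance("1", "3") == 2
assert key_distance("1", "2") == 1
```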


The chart shows the error rates. In the morning, we quickly manage to cut errors almost in half within the first hour. In the afternoon, some exhaustion sets in, but the interviewers recover in the early evening. In the last hour, however, error rates almost quadruple -- our interviewers are well and truly tired.

More detail on this to follow. We will also enter the paper questionnaires, and then compare with the SMS results, to see whether there actually were any transposition errors. For us, this was an exciting day. 

Sunday, May 30, 2010

Testing Mobile Innovation in our Surveys

In running an election-day survey (not an exit poll, which we are not so enthused about), we have decided to attempt something new: we are now aggregating the information via SMS. This gives us the information in real time, and the data will be available for immediate analysis the moment the last SMS has been received. The image below shows how this looks on our screen. Interviewers send in coded responses in which letters (A, D, G, and so on) identify the question and digits the chosen answer. The letter M, for example, stands for the question "Did you, or anyone from your family, have any problems with the voter's list?", and "3" stands for "no".
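The letter-plus-digit scheme above can be decoded with very little code. This is a minimal, hypothetical sketch, assuming one letter per question and one digit per answer; the exact wire format used on E-Day may well differ:

```python
import re

def decode_sms(text):
    """Turn an interviewer's SMS like "A1 D2 M3" into a question->answer map.

    Each letter names a question, the digit after it the chosen response.
    """
    pairs = re.findall(r"([A-Z])(\d)", text.upper())
    return dict(pairs)

# E.g. question M ("problems with the voter's list?") answered "3" ("no"):
print(decode_sms("A1 D2 G1 M3"))  # {'A': '1', 'D': '2', 'G': '1', 'M': '3'}
```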



To be sure, this only works with short questionnaires, and has forced some compromises in the answer options, which we have to keep simple. However, it is an interesting step forward. While there are solutions with Personal Digital Assistants, these are expensive and would make our interviewers walk around with expensive, possibly distracting gadgets. So SMS seems a good alternative for now. Implementing this project ended up being a lot of work: creating a Virtual Private Network connection with the main mobile providers (Magti and Geocell were both helpful), programming the software to disaggregate messages, writing manuals, training interviewers, testing whether it all works, building redundancies so that it becomes a robust system, and creating a system for checking errors. Errors are flagged automatically in red (see below), and we call interviewers back to correct them.
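The automatic error check boils down to validating each answer against the set of codes that are legal for its question. A minimal sketch, with an illustrative codebook (the question letters and valid codes here are our own examples, not the actual E-Day questionnaire):

```python
# Hypothetical codebook: for each question letter, the set of response
# codes the questionnaire allows.
VALID = {
    "A": {"1", "2", "3"},
    "D": {"1", "3"},  # e.g. a yes/no question coded 1/3
    "M": {"1", "3"},  # "problems with the voter's list?"
}

def flag_errors(answers):
    """Return the (question, answer) pairs that fail validation.

    Anything returned here would be highlighted in red, prompting a
    call back to the interviewer.
    """
    return [(q, a) for q, a in answers.items()
            if q not in VALID or a not in VALID[q]]

print(flag_errors({"A": "2", "D": "3", "M": "5"}))  # [('M', '5')]
```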



We are also trying this SMS technology with the election monitors of ISFED, since it offers a rapid way of aggregating their results. At this point it is run as a pilot, to test the technology and see how it is adopted by monitors. Parts of the software have been provided by the National Democratic Institute, and the Open Society Foundation Georgia has generously funded this effort. A lot of effort has been invested by the entire team, but especially by Irakli Naskidashvili, Tbilisi's IT wizard, and Jonne Catshoek, our Crowdsourcing Project Manager.

This will all happen today, and we will let you know how it worked, and where we want to take these applications in the future. One of the great things at CRRC is that we have this chance to goof around and try new things. 

Election Day Portal

To track what is going on during election day, Georgia's leading monitoring organizations, the International Society for Fair Elections and Democracy (ISFED), the Georgian Young Lawyers Association (GYLA) and Transparency International (TI), have created a joint portal, VoteGeorgia.ge.






This will join the feeds from the three organizations, while also giving you a map with region-specific information. The website has been designed by TI. NDI provided critical coordination, as well as access to survey results. CRRC is providing the maps for the effort, working with GeoCommons to provide the data on them. Below is a snapshot of a pre-election complaint.




Note that the maps take time to load. They are not as fast and snazzy as Flash-based maps would be, since they get populated by data that is continuously updated, during the election day and after. Here the technology is still catching up -- we are also using this as an exercise to learn how best to make mapped information available over the Internet. 


We know there are some snags -- it does not work too well with the Safari browser, for example. Further suggestions gratefully received in the comments. This is a pilot, and we want to use this opportunity to get everything right in the future. 


On CRRC's side, David Sichinava, our GIS and Database Analyst, and Jonne Catshoek, our Crowdsourcing Project Manager, were the ones who made this happen. This work has been generously supported by the Open Society Foundation Georgia.