
Monday, May 11, 2020

AI and Russian propaganda: it’s not what it looks like

[Note: This article was originally published in the On Think Tanks Annual Review. It was written by David Sichinava and Dustin Gilbreath. David Sichinava is the Research Director of CRRC Georgia. Dustin Gilbreath is the Deputy Research Director of CRRC Georgia and the Communications Manager at Transparify. The views presented in this article do not reflect the views of East West Management Institute, USAID, or any related entity.]

In the think tank world, talk about artificial intelligence (AI) is common. Using it is less common. One of the underlying causes of this may be a perceived lack of familiarity with the methods. However, AI methods – including machine learning – are probably more familiar to many thinktankers than they realise. The Russian Propaganda Barometer project, recently conducted by the Caucasus Research Resource Centers (CRRC) Georgia, demonstrates the potential of these tools in think tanks for policy insight – particularly for discourse analysis and for developing targeting strategies.

Artificial intelligence and machine learning are more familiar than thinktankers think
To say that artificial intelligence in general, and machine learning algorithms specifically, are dramatically changing industries would be an understatement. From optimising electricity usage in factories to deciding which advertisement to show you online, algorithms are in use all around us. In fact, algorithms have been shaping the world around us for decades.

The think tank and social science worlds are no exceptions to this. Indeed, most policy researchers will be familiar with, if not users of, algorithms like regression. Notably, this is a common tool in the machine learning world as well as in social science research.

Hopefully, knowing that regression is part of the machine learning toolbox will make it clear that machine learning is less foreign than many thinktankers may think.

While regression is one method in the machine learning toolbox, there are others. Although these methods are not new, this larger toolbox has only become commonly used in recent years as big data sets have become more available.

For many products and problems, machine learning solutions might be improvements on existing think tank practices. This is particularly true when it comes to developing a targeting strategy for programming, monitoring, or anything that focuses on understanding discourses.

The Russian Propaganda Barometer Project
CRRC Georgia implemented the Russian Propaganda Barometer project, funded by USAID through the East West Management Institute in 2018-2019. The project aimed to understand and monitor sources of Russian propaganda in Georgia, and to identify who was more or less likely to be vulnerable to the propaganda.

To monitor Russian propaganda, CRRC collected all of the Georgian-language posts (around 50,000 in total) from the public Facebook pages of potential sources of Russian propaganda. The pages were identified by two other organizations working on the issue, supplemented with several pages missing from their lists. The posts were then analysed using natural language processing tools such as sentiment analysis, and network analysis was conducted to understand the interlinkages between the different sources.
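The sentiment analysis step can be illustrated with a toy lexicon-based scorer. This is a deliberately minimal sketch of the general technique, not the project's actual pipeline: the word lists, the scoring rule, and the English examples are all illustrative (the project worked with Georgian-language text and more sophisticated tooling).

```python
# Toy lexicon-based sentiment scorer: count positive and negative words
# and return a score in [-1, 1]. The word lists are illustrative only.
POSITIVE = {"good", "support", "friend", "progress", "cooperation"}
NEGATIVE = {"threat", "enemy", "decline", "corrupt", "danger"}

def sentiment(text: str) -> float:
    """Fraction of positive tokens minus negative tokens in `text`."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    pos = sum(token in POSITIVE for token in tokens)
    neg = sum(token in NEGATIVE for token in tokens)
    return (pos - neg) / len(tokens)

print(sentiment("the west is a threat to us"))     # negative score
print(sentiment("good progress with our friend"))  # positive score
```

Production pipelines typically replace the hand-built lexicon with a trained classifier or a language-specific lexicon, but the shape of the computation is the same: map each post to a score, then aggregate scores per page over time.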

One of the key insights from the project is that most of the sources of propaganda identified were in fact from far right organisations. While some of these are likely tied to Russia, an analysis of how they talked about the West and Russia suggests that most actually have more negative attitudes towards Russia than the West.

The analysis also called attention to the sharp rise in interest in the far right in Georgia. The number of interactions with far-right pages had increased by roughly 800% since 2015. While overall increasing internet use in the country likely contributed to this, it seems unlikely to be the only cause of the rise.

The results were presented in this dashboard, as well as in a more traditional report. The dashboard enables users to see what the far right is talking about on a daily basis and the networks between different groups, among other metrics.



The project also aimed to inform a targeting strategy for countering anti-Western propaganda. To do so, we merged data from approximately 30 waves of CRRC and National Democratic Institute surveys that asked about a variety of preferences. From there, a ‘k-nearest neighbours’ algorithm was used to identify which groups had uncertain or inchoate foreign policy preferences. The algorithm measures how similar people are on whatever variables are included, and then predicts the outcome of interest from the most similar cases. The resulting model accurately predicted whether someone would be more or less likely to be influenced by Russian propaganda about two thirds of the time. Further research showed that the model was stable, making equally accurate predictions on data that did not exist at the time of its creation.
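For readers unfamiliar with k-nearest neighbours, a minimal sketch of the idea follows. The features (age, an education code, a settlement-type code) and the respondents are invented for illustration; the project's actual model was built on roughly 30 survey waves and additional variables.

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points (squared Euclidean distance over the features)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda row: dist(row[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical encoded respondents: (age, education code, settlement code)
train = [
    ((25, 3, 1), "pro-Western"),
    ((30, 3, 1), "pro-Western"),
    ((68, 1, 2), "at-risk"),
    ((72, 1, 2), "at-risk"),
    ((70, 2, 2), "at-risk"),
]
print(knn_predict(train, (67, 1, 2)))  # "at-risk"
```

In practice the features would be scaled first, so that age (spanning decades) does not swamp the small categorical codes in the distance calculation.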

The data analysis, while cutting edge in many respects, is not beyond the means of many quantitative researchers. Neither of us has an MA or PhD in statistics: David is a geographer and Dustin is a political scientist.

While the Russian Propaganda Barometer addressed the research goals, we’d like to highlight that AI is no panacea. The project’s success came from combining traditional think tank analysis of the situation in Georgia with AI to generate new insights.

The Russian Propaganda Barometer project is just one type of application of machine learning to policy research. There is good reason to believe more and more policy researchers will use these methods given their ubiquity in the modern world, together with the increasing availability of the large datasets needed to study these issues.  We hope that the Russian Propaganda Barometer project can serve as food for thought for others in service of this goal.

Monday, May 27, 2019

Does our algorithm still work?

Within the Russian Propaganda Barometer Project, funded by USAID through EWMI’s ACCESS program, CRRC-Georgia created a model, using a k-nearest neighbors algorithm, which attempts to predict which of three groups a person falls into: consistently pro-Western; anti-Western; or neither, and potentially at-risk of being influenced towards an anti-Western foreign policy position. The model used data from NDI and CRRC’s polling between 2008 and July 2018. It included variables for age, education level, settlement type, and when the survey was conducted.

In essence, this was done to see whether it is possible to guess a person’s foreign policy preferences using the aforementioned variables. The model successfully predicted people’s status 64% of the time, as described in this policy brief. This was a nine percentage point improvement over always guessing the most common status.
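The comparison against the naive baseline can be made concrete. The shares below are the ones reported in the study (pro-Western respondents were the largest group, at 55 percent of the sample):

```python
# Always guessing the most common group ("pro-Western") is right
# whenever the respondent actually is pro-Western, i.e. 55% of the time.
majority_baseline = 0.55  # share of the largest group in the sample
model_accuracy = 0.64     # reported k-NN prediction accuracy

improvement_pp = round((model_accuracy - majority_baseline) * 100)
print(f"{improvement_pp} percentage point improvement")  # 9
```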

But, as is well known, social scientists can play around with data to make themselves look good. In this regard, one way of testing how well a model works is to see whether it can predict observations that were inaccessible to the researcher when the model was developed. That is, does the model make accurate predictions about people it does not yet have information about?

To test the model, CRRC-Georgia used the same algorithm to predict people’s status in the December 2018 and April 2019 NDI surveys. Neither dataset was available to the researchers at the time the model was developed.
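This out-of-sample check amounts to freezing the model and scoring it on a survey wave it never saw. The sketch below uses a hypothetical threshold rule standing in for the trained k-NN model, and invented respondents; only the workflow is the point.

```python
def holdout_accuracy(predict, new_wave):
    """Share of respondents in an unseen survey wave whose group the
    frozen model predicts correctly."""
    return sum(predict(x) == y for x, y in new_wave) / len(new_wave)

def predict(r):
    # Stand-in for the trained model: flag older, less-educated
    # respondents as at-risk (thresholds are illustrative only).
    return "at-risk" if r["age"] >= 65 and r["edu"] <= 2 else "pro-Western"

new_wave = [  # hypothetical respondents from a later survey wave
    ({"age": 70, "edu": 1}, "at-risk"),
    ({"age": 24, "edu": 4}, "pro-Western"),
    ({"age": 67, "edu": 2}, "at-risk"),
    ({"age": 40, "edu": 3}, "pro-Western"),
]
print(holdout_accuracy(predict, new_wave))
```

A model that only memorised its training data would score well in-sample but poorly on a wave like this; stable accuracy on genuinely new data is the evidence that the model generalises.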

The results suggest the model works in near identical form, predicting 65% of responses accurately. As in the policy brief, the new data suggest that people in predominantly ethnic minority settlements, people with lower levels of education, and older people are significantly more likely to be at risk of being influenced by anti-Western propaganda in Georgia.


These results lead to a number of conclusions. First, the model does appear to work: it can guess someone’s foreign policy position correctly two thirds of the time, knowing only their age, education level, and the type of settlement they live in. Second, it re-affirms the recommendation in the policy brief that people working to counter anti-Western propaganda in Georgia should prioritize working with ethnic minority communities, older people, and those with lower levels of education. These groups are relatively difficult for NGOs to reach: NGOs often focus on working with young people and the highly educated, and rarely have the capacity to work in Armenian and Azeri communities, aside from organizations based in those communities.

Despite the challenge of reaching these populations, the need is clear: if Georgia is to prevent anti-Western propaganda from dividing society, actors should work to counter Russian propaganda efforts among these groups.

The views expressed in this blog post do not represent the views of EWMI, USAID, or any related entity.

Replication code for the above data analysis is available here.


Tuesday, February 05, 2019

New Georgian study offers insights on Russian disinformation

[Note: This article originally appeared in Eurasianet.]

A study recently conducted by the Caucasus Research Resource Centers-Georgia confirmed widely held beliefs that pensioners and those with low levels of education are most susceptible to media manipulation. The findings suggest that Western efforts to counter Russian disinformation should focus on those groups in Georgia.

Another major finding of the study is that a solid, growing economy is perhaps the best antidote against disinformation.

The study, which was funded by USAID, was designed to enable policymakers to gain a better understanding of who in Georgia is susceptible to believing anti-Western disinformation. During the post-Soviet era, Georgia’s steadfast efforts to move closer to Western institutions, including NATO and the EU, have been a major source of tension in its relations with Russia. The two countries fought a brief, and from Tbilisi’s standpoint, disastrous war in 2008.

The CRRC-Georgia study can, in turn, help policymakers lay the groundwork for better-targeted Western initiatives to counter Russian disinformation, with the aim of reinforcing public support for Tbilisi’s embrace of Western values and institutions. Another aim is to foster a better understanding of attitudes and trends in order to reduce the odds that any new initiatives misfire and stoke the polarization of society.

CRRC-Georgia researchers pored over demographic data and developed an algorithm to predict whether individuals were at risk of being influenced by anti-Western disinformation, already held pro-Russian or isolationist views, or held pro-Western views.

The results showed that 55 percent of the sample held pro-Western views, 36 percent were ambivalent, uncertain, or inconsistent in their views, and 9 percent held pro-Russian or neutral opinions.

The only mild surprise in the ensuing analysis was that a citizen’s residence in the capital Tbilisi was “no longer a significant predictor of at-risk status.”

Age is a major factor when it comes to consuming and believing disinformation: the older an individual is, the more susceptible he or she is to fake news.

“The results suggest that while one in five 18-24-year-olds are at-risk of being influenced by anti-Western propaganda, one in three people over the age of 65 are,” according to the findings.

Ethnicity also appears to have important implications for the effectiveness of disinformation. “Slightly under one in five people in predominantly ethnic Georgian settlements are at-risk of being influenced by anti-Western propaganda, while one in three are in predominantly minority settlements,” the report stated.

The findings suggested that those with a secondary education or better tended to be relatively impervious to disinformation’s influence on their attitudes about public affairs.

Of those in the sample who were found to be at risk of being influenced by disinformation, many were worried about economic developments. “The economy may be a slightly more important issue for those who are at-risk, suggesting that messaging about the economy and actual economic improvement are likely to be important for this population,” CRRC-Georgia researchers wrote.

Russia’s weaponization of information has disrupted political processes in the West in recent years, including the 2016 U.S. presidential election and the Brexit campaign in the UK.

Policymakers in the West have only recently started to focus on crafting strategies that address Russian digital mischief-making.

The EU has invested in strategic communications aimed at countering Russian disinformation in Georgia and elsewhere. The CRRC-Georgia findings may help Western policymakers tweak initiatives so that they are more targeted, and thus, stand a better chance of achieving strategic objectives.

Efforts to counter Russian propaganda can take two broad forms – demand-side and supply-side. A supply-side strategy involves blocking disinformation at its source via the disabling of the source’s ability to distribute content. A demand-side strategy, meanwhile, aims to inoculate news consumers from the potentially pernicious effects of disinformation.

When it comes to the use of supply-side tactics, there are troubling ramifications for democratic societies that are built upon fundamental rights such as freedom of speech and access to information.

Given the supply-side dilemmas, developing demand-side initiatives that address issues relating to Russian state-sponsored disinformation would seem to offer a better, although potentially more difficult way forward.

Dustin Gilbreath is the deputy research director of CRRC-Georgia. The views expressed in this article represent the views of the author alone. The article was written within the auspices of the Russian Propaganda Barometer Project funded through the East-West Management Institute’s ACCESS program.