Executive summary

The aim of the project is to describe and reconstruct the information campaign carried out by Russia and pro-Russian activists on the internet, and to reconstruct the representations and frames of the Ukrainian-Russian conflict emerging from internet comment sections and social media posts. Textual and visual analyses reveal the tools and methods used by pro-Kremlin commentators to build representations of the annexation of Crimea and of the Ukrainian-Russian conflict.

The subject of analysis is the framing of how the Ukraine–Russia conflict played out on internet portals (DELFI, korrespondent.net, pravda.com.ua, kyivpost.com and onet.pl) and on social media (Facebook, Vkontakte) in the period from 1 April to 31 December 2014. The effectiveness of influence on the internet was also analysed, particularly in mobilizing internet users to engage in communication.

The information warfare conducted around Russia’s annexation of Crimea continues to this day, in both traditional and new media spaces. Consequently, war on the internet has become a permanent front in the information war – it is waged not only in times of military confrontation, but also in times of peace, as an element of state information policy. Long before the conflict in Ukraine, internet and news outlets were used to disseminate disinformation aimed at moulding Western public opinion in favour of the pro-Russia narrative. Even if these actions are called preventive measures and responses to “information aggression” by the West, they reflect a doctrine aimed at developing a favourable image of Russia abroad.

The analysis of internet content allows the reconstruction of propaganda objectives and of frames in which to portray current and past events. Frames are understood here as means – structures, forms and schemes that influence individuals’ interpretations of issues, facts, groups and ideas and ‘determine’ the choices people make. Frame analysis also enables future actions to be foreseen and a country’s strategic and operational objectives to be reconstructed. In the case of Russia, they remain the same: to rebuild the Russian empire while also exposing the decadence of democratic Western societies. These messages justify the necessity for ‘civilization change’ and Russia’s defensive actions. 

In internet discussions, several frames in which to place the current Ukrainian-Russian conflict recur continuously. The fundamental frame, describing Russia’s relationship with the outside world, is that of a decadent trans-Atlantic civilization trying to impose its liberal values on the whole world. This, the frame holds, has led to civilization’s regression, barbarity and the spilling of blood.

Russki mir is supposed to be the answer to the West’s ideological expansion, which has made puppets out of Eastern and Central European countries. ‘Being on a short leash from the West’ not only proves the intellectual feebleness of European leaders, but also damages their own national interests. Ukraine, in this frame, is a country that has degenerated socially, systemically and politically. Fascism flourishes there, and primitive barbarism and cruelty toward other nationalities prevent constructive dialogue.

The report identifies different methods of influence in news portals’ comment sections and in social media – building frames that organize discussions, communication techniques that maximize influence over internet users, and publishing visuals that work on the conscious and unconscious mind. Target audiences also differ; the groups under influence analysed here are the Russian-speaking audiences in Russia itself, Ukraine, the Baltic States and Poland.


The report comes to several fundamental conclusions. There is a strong relationship between the content of articles and the comments posted on them. Even though the discussion itself may deviate from the storyline of the source, the starting points are media reports. The correlation concerns topics, not the perspectives from which they are interpreted.

In the comments, images of the participants in the conflict are continually built: Russia is a superpower – a country determined to defend its interests, able to achieve its goals through political and military measures; a peaceful country that does not react to aggressive Western policies. Ukraine is a country deprived of its roots, a fascist country, unable to survive by itself.

The number of comments is linked to the content of articles and depends on internet troll activity. Photographs of people displaying negative emotions are commented on more ‘eagerly’. General images of destruction, death and weapons result in a fall in the number of comments. The number of comments increases when content can be easily used by trolls to incite political antagonism (the political activity of the conflicting sides, their definition of what is happening in Ukraine and what role Russia has in all this) and social antagonism (dissatisfaction, protests, breaking of the law and ethnic conflict). 
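A relationship of this kind can be approximated with simple content analysis. Below is a minimal sketch, assuming articles have already been hand-coded with topic labels and paired with their comment counts; the data, labels and function name are all hypothetical illustrations, not the report’s actual method:

```python
from statistics import mean

# Hypothetical hand-coded records: (topic label, number of comments)
articles = [
    ("political antagonism", 240), ("political antagonism", 310),
    ("social antagonism", 190), ("social antagonism", 225),
    ("destruction/death imagery", 40), ("destruction/death imagery", 55),
]

def mean_comments_by_topic(records):
    """Average comment count per hand-coded topic label."""
    totals = {}
    for topic, count in records:
        totals.setdefault(topic, []).append(count)
    return {topic: mean(counts) for topic, counts in totals.items()}

by_topic = mean_comments_by_topic(articles)
# In this invented sample, antagonism-themed articles attract far more
# comments than general destruction imagery, mirroring the pattern above.
```

A real study would of course control for portal, time period and troll activity rather than comparing raw averages.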

A much larger number of comments – both validating Russia’s actions (justifying the separatists’ military actions and Russia’s involvement) and blaming the West and Ukrainians (their aggression and fascist government) – was also observed when articles portrayed Russia’s actions negatively.

Even though the narratives – the frames in which Russia, Ukraine and the West are portrayed – are evoked constantly in comments, they vary between audiences (targets). Although frames of anti-Russian phobia appear everywhere, they occur more often in the Baltic States and Poland than on Russian- and Ukrainian-language websites in Ukraine. ‘Fascist Kiev’ is referred to everywhere, but in Poland, Wołyń and the genocide of Polish people by Ukrainians are evoked more often in this context. In other countries, Ukraine’s cooperation with fascist Germany is evoked more frequently.

Organized troll activity in news portals and social media is coordinated and their audience-influencing techniques are advanced. A scheme for troll activity can be described in three phases: luring, taking the bait and hauling in. The coordinated and massive character of troll activity indicates that we are dealing with the phenomenon of (social) media weaponisation. However, it seems that the nature of the internet and Web 2.0 technologies mean that the effectiveness of this influence may be less than is supposed. First of all, because every propaganda action triggers counter-propaganda, which is obvious in the analysed material. Secondly, because there is no way to eliminate alternative sources of information (such as TV, radio or newspapers). There is no question, however, that the internet is a perfect tool for disinformation, not only on its own, but in combination with traditional media.

Analysing organized trolling is not straightforward: it requires high sensitivity on the part of the researcher, an understanding of the context of statements, and familiarity with the different communication techniques used by trolls. Even then, the 100% identification of trolls cannot be guaranteed.

In order to understand the effectiveness of internet comments, linguistic analysis of statements is indispensable. Language defines our personal image of the world. Despite differences between languages, trolls use some universal communication instruments: categorization into ‘us’ and ‘not us’, the use of metaphors and idioms, and the building of neologisms with which people and events are described through stereotypes. These are adapted to the linguistic levels of users in the different languages.
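The ‘us’/‘not us’ categorization described above could, in principle, be detected automatically with lexicon matching. A rudimentary sketch follows; the word lists are hypothetical placeholders, whereas a real analysis would need curated, language-specific lexicons compiled by native-speaker analysts:

```python
# Hypothetical lexicons standing in for curated, language-specific lists.
US_MARKERS = {"we", "our", "brothers", "motherland"}
THEM_MARKERS = {"fascists", "junta", "puppets", "banderites"}

def categorize_comment(text):
    """Label a comment by which in-group/out-group lexicon it draws on."""
    words = set(text.lower().split())
    us = len(words & US_MARKERS)
    them = len(words & THEM_MARKERS)
    if us and them:
        return "polarizing"       # frames 'us' against 'not us' at once
    if them:
        return "out-group attack"
    if us:
        return "in-group appeal"
    return "neutral"
```

For example, `categorize_comment("we defend our motherland against the junta")` yields `"polarizing"`, since the comment draws on both lexicons at once.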

Contemporary conflicts and propaganda are highly visualized, above all in social media. Images are more easily perceived than text in articles, but have similar functions. Photographs differ from articles in terms of the emotion they generate and their potential to evoke positive and negative connotations of the objects they portray.

In this way, Ukraine and Ukrainians are often portrayed in contexts of fascist symbolism and violence, while Russia and Russian soldiers (‘little green men’) are shown in contexts of security and military professionalism. 

In a viewer’s consciousness, photographs and other imagery are perceived as reality, and the emotional system always considers visual experience to be real. This conviction of the ‘truthfulness’ of images was eagerly exploited in the framing of the Ukrainian-Russian conflict. At the same time, many examples of falsified reality were observed, not only by means of Photoshop, but also through untruthful comments and the manipulation of images. Dates, locations and objects in photographs are manipulated to ‘prove’ unambiguously Russia’s and Russians’ innocence, the ‘lies’ of Ukrainian soldiers and the ‘brutality’ they inflict on civilians, and that the West is to blame.

Internet memes (digitalized units of information – text, image, film or sound – that are copied, processed and, in this processed form, re-published on the internet) have become a widely used instrument for portraying conflict. Just as in textual discussions, image ‘exchanges’ also see storylines develop, with elements and means of demeaning enemies and motivational elements of conflict. These enhance the frames analysed on the internet and reinforce the myths popular in public discourse in Russia: the Myth of fighting for a new world order based on humanitarian values and the Myth of Great Russia.

Although this report covers a variety of content, it focuses on the internet discourse stemming from Ukrainian-Russian antagonism. It therefore does not attempt to present the conflict in its military and political dimensions. The consequence is a fragmentary image, in linguistic and symbolic terms, of the conflict between followers of the Kremlin and of Kiev. A case-study analysis of the convergence between different kinds of media in constructing a coherent image of the world in line with propaganda objectives would be of great interest.


The weaponisation of social media by Russia should be the subject of continuous in-depth analysis and monitoring by NATO’s command structures and its allies. This would require employing specialists with excellent Russian-language skills and the cultural awareness to be able to pick up on particular keywords, messages, historic links and interpretations. Similarly, it is important to measure the resonance and effectiveness of Russia’s propaganda activities in social media by using network analysis and testing the influence of different content on target audiences.
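Measuring resonance via network analysis could start from something as simple as computing how far a message travels through a repost graph. The sketch below illustrates that idea with a breadth-first search; the graph data and account names are invented for illustration, not drawn from the report’s material:

```python
from collections import deque

# Hypothetical repost graph: account -> accounts that reshared its post
reposts = {
    "seed_account": ["a", "b"],
    "a": ["c", "d"],
    "b": [],
    "c": ["e"],
    "d": [],
    "e": [],
}

def reach(graph, source):
    """Number of distinct accounts a message reaches from `source` (BFS)."""
    seen = {source}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return len(seen) - 1  # exclude the source itself

# reach(reposts, "seed_account") == 5: the message spreads to all five
# downstream accounts in this toy graph.
```

Reach is only one of many possible resonance measures; centrality or cascade-depth metrics over the same graph would be natural extensions.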

It is important to ensure the pluralism of information, opinions and voices speaking on behalf of NATO, the ‘West’ and also on behalf of the Kremlin and Russia. Varied information about the same events results in the mutual reduction of the influence of different senders. The Allied governments and NATO have to empower non-government voices such as journalists, experts, social activists and reputable NGOs by providing them with timely information on issues of importance, ensuring active feedback loops and identifying new information-sharing platforms.

Young audiences in the Allied countries, and also in Russia, may be internet-savvy but at the same time lack awareness of propaganda and other influencing techniques. Moreover, all members of society are susceptible to Russian propaganda, as it resonates with their fears, needs and motivations. School education programmes on (digital) media literacy and social-awareness campaigns on the impact of propaganda on society should be introduced to mitigate the effects of hostile information campaigns, particularly online ones. Particular attention should be paid to the potential for manipulation with imagery, as it is one of the most effective and widely used online propaganda methods.

It appears that the online-journalist community (both professional and non-professional) also lacks awareness of propaganda and other influencing techniques at times, or does not devote enough effort to checking and analysing sources. Since the media still remains an authority in the eyes of most people, it can unintentionally amplify rumours and propaganda messages as content is shared. Closer cooperation with journalists as regards information is needed, by supplying them with materials and content based on facts, and organizing workshops on the significance of what they publish during particular information-war campaigns.

A major component of combating internet trolls should be unmasking them and exposing their activities. Because, in this type of conflict, the volume of posts matters (even the most intelligent argumentation disappears in an abundance of less sophisticated, but more numerous, messages from the opponent), different institutions should activate internet users so that organized masses of troll posts can be opposed by organized groups of citizens aware of trolling. Combating trolls should operate at two levels: responding to comments, and exposing falsities. The first requires short, coherent, logical and, above all, numerous comments. It is important to block the propaganda effects of pluralistic ignorance, the spiral of silence and the bandwagon effect (see Section 3.3), which are inherent in the internet. The second level requires cooperation between internet users and researchers who are able to expose and compromise trolls.

Trolls extensively employ personal attacks rather than argument; hence their comments often contain ‘hate speech’ (text that threatens, insults or attacks a person or group on the basis of national origin, ethnicity, race or religion). Whilst respecting freedom of speech, administrators of websites and social media portals should be more active in monitoring content for hate speech, and in blocking and reporting it, as required by law.
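Automated monitoring of the kind recommended here is often bootstrapped with simple pattern matching before heavier classifiers and human review are applied. A deliberately simplified sketch follows; the term list is a hypothetical placeholder, not a real moderation lexicon, and any production system would pair such matching with human moderators:

```python
import re

# Placeholder patterns only. Production systems use curated multilingual
# lexicons plus statistical classifiers, with human review of matches.
HATE_PATTERNS = [re.compile(p, re.IGNORECASE)
                 for p in (r"\bsubhuman\b", r"\bvermin\b")]

def flag_for_review(comment):
    """Return True if a comment matches any pattern and needs human review."""
    return any(p.search(comment) for p in HATE_PATTERNS)
```

Keyword matching alone produces both false positives and misses; it serves only as a first filter that routes candidate comments to moderators.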