The first part of the report looks at whether it is possible to evaluate the impact of debunking by discussing the major differences between fact-checking efforts, the objectives behind countering mis- and disinformation and the risks of a “backfire effect”. The second section is a guide to best practice and compares different types of fact-checking and debunking initiatives. The third part gives recommendations and describes future prospects for improving practice, assessing impact and developing policy solutions. 

The annex of the report gives a comprehensive overview of available resources on the topic, including handbooks, OSINT tools, video and image search tools, and social media monitoring and network analysis tools. The report combines academic debates on the topic with real-life practice drawn from the authors’ own experience, interviews with several expert organisations in the field (such as the Atlantic Council, the National Democratic Institute and the EU’s East Stratcom Task Force) and discussions with relevant officials from different EU and NATO countries. More than 200 fact-checking and debunking initiatives from across the globe were researched and assessed.

Misinformation and disinformation disseminated online are relatively recent phenomena, as are the initiatives developed to limit the effect of such content. Questions remain over the effectiveness of two key counter-measures, fact-checking and debunking. This report makes a start in examining best practice: what it is, who does it and how it might be evaluated.

WHAT ARE DISINFORMATION AND MISINFORMATION?

Today’s information environment is increasingly characterised by the spread of misinformation and disinformation. Misinformation refers to verifiably false information that is spread without any intent to mislead. Disinformation refers to the creation, presentation and dissemination of verifiably false information for economic gain or to intentionally deceive the public. Whether published in a news article or an online blog, or broadcast from a newsroom or a government press conference, misleading and false information is frequently produced and reproduced, both intentionally and unintentionally.

IS IT POSSIBLE TO EVALUATE DEBUNKING EFFORTS?

Fact-checking is the long-standing process of checking that all facts in a piece of writing, news article, speech, etc. are correct. It derives from a need to hold those in power to account for their claims, and is traditionally conducted by journalists, newsrooms and political analysts. 

Debunking refers to the process of exposing falseness or showing that something is less important, less good or less true than it has been made to appear. The overall objective is to minimise the impact of potentially harmful mis- and disinformation. The main goals of organisations that debunk include: to assert the truth, to catalogue evidence of false information, to expose false information and conspiracies, to attribute the sources of disinformation, to build capacity and to educate.

Although there is overlap between debunking and traditional fact-checking, there are several differences to note:

  • Debunking is often partisan. It can be done by governments to expose a ‘hostile’ actor, and sometimes takes the form of a ‘campaign’ or ‘initiative’. In contrast, fact-checking is conducted in the spirit of impartiality.
  • Debunking is targeted on a specific actor or topic. While fact-checking is broad in scope, debunking often begins with a decision about whose information should be corrected, based on an overall assessment of their intent and behaviour.
  • Debunking is strategic. Unlike fact-checking, it does not give all falsehoods equal attention.
  • Debunking is focused on solving a strategic problem to reduce harm, and initiatives often ignore mis- and disinformation that is unlikely to have a high impact on their priority issues. 

RECOMMENDATIONS

  1. Improving practice
    1. Understand the information environment. Assess existing initiatives you can learn from/collaborate with, and your target audience (to determine the actors, messages and channels they deem credible).
    2. What are you trying to protect? By clarifying purpose (e.g., safeguarding an election vs building wider media literacy) you can determine how you will work and who you will work with.
    3. Different tactics and tailored messaging for different audiences. Framing should vary depending on the topic and audience. Target audiences may have different expectations, so you need to carefully tailor a communication strategy to the context.
    4. Audience engagement strategy. Audiences most vulnerable to mis- and disinformation are unlikely to engage with long or overwhelming articles. Look to innovative formats, such as gamification.
    5. Assess your own vulnerabilities. Once you have identified a credible voice to do the debunking, what are the risks or vulnerabilities your organisation could face?
    6. Pick your battles. Resource and time limitations mean you must be selective in the issues and actors you choose to engage.
    7. Responses should balance between countering messages, countering narratives and countering brands. 
  2. Assessing impact
    1. Measurement and evaluation (M&E) of mis- and disinformation countermeasures is still in its very earliest stages. Some of this research tests the wording of corrections, to provide empirical evidence of counter-messages that are likely to stick. Other research tests recall of facts after exposure to corrective information.
    2. Experiments can help to explain the human psyche and hone messaging aimed at specific behaviour change. This can be reflected in the indicators where appropriate. Many other objectives are political in nature, as they seek to signal intent, impose costs and disrupt adversary capabilities, support ally capabilities, reduce societal vulnerabilities and reassure the public.
    3. These indicators should be reflected in any assessment of what impact an initiative might have. But each of these indicators can look very different depending on the country, issues and adversaries in question.
  3. Developing policy solutions
    1. Debunking is not a standalone solution. Debunking should be positioned alongside coherent legislation, deterrence and resilience-building measures. Networks and alliances provide important direction, protection and support. Coordination with the likes of governments, intergovernmental organisations and philanthropists is important for coherence in the field. This pertains to clarifying mandates, objectives, subject-matter and geographical markets.
    2. There are opportunities to use funding to drive shared standards. More can be done to establish shared standards, norms and practices in order to create a more consistent and aligned product across organisations.
    3. There are significant problems accessing data from closed groups, chatrooms and messaging services. Digital platforms have the opportunity to counter mis- and disinformation through labelling, promoting and demoting content. Governments can support this through either collaboration with or regulation of digital platforms.
    4. Granting organisations the mandate to achieve a specific goal is a crucial step. Providing a mandate that is specific enough to drive debunking activity but flexible enough to meet future challenges is an important problem for policymakers to resolve.

You can watch #StratComTalks about the effectiveness of fact-checking and debunking here: