Introduction
Here are steps to develop a risk mitigation plan from the Disinformation Toolkit 2.0: How Civil Society and Non-governmental Organizations Can Combat Harmful Mis- and Disinformation by InterAction. This excerpt is from pages 17–21.
Developing a Risk Mitigation Plan
This section summarizes steps you might consider taking to develop a strategy for identifying and responding to online disinformation that could affect your organization’s operations and the safety of your staff.
Think about your strategy in five parts, which are detailed below:
- Evaluate your media and information ecosystem to determine where your disinformation risk is greatest.
- Determine who is spreading the false information about your organization, leaders, or programs and develop a hypothesis about why they are sharing this information.
- Determine what they are spreading or saying and how it is spreading.
- Determine whether and how to counter this information and work with your organization’s leaders to design workflows within your organization.
- Confer with like-minded NGOs or similar stakeholders and develop a collective understanding and response plan to disinformation attacks.
The following are approaches to taking these actions. These suggestions should be viewed as conversation starters for you and your staff that will require additional institutionalization, based on your organization’s work and structure. The steps that you decide to take should be tailored to the unique context in which your organization operates.
Steps
1. Your Media Ecosystem
Understand all forms of media in which disinformation is spread—print, websites, and social media. One of the biggest ecosystems to analyze is the online media environment in which your programs operate. One of the first questions to ask yourself is:
How vulnerable is my media environment to abuse?
Consult your national staff and learn how information flows within the communities that matter most to your organization. If context-appropriate, conduct a rapid, anonymous survey among beneficiaries to determine access to TV/radio/newspaper; access to technology including mobile phones and mobile data; adoption of messaging apps; social media; and community information hubs or influential people in your community of focus. Keep in mind that collecting or storing data about beneficiaries or affected populations digitally can be dangerous, especially as it relates to mis- and disinformation in contexts where authorities who are surveilling civil society are also the perpetrators of disinformation.
Possible discussion questions for your project or program communications staff could include:
Questions about your audience:
- How do people get information about news, politics, and their community? How does the answer change with gender, age, economic status, location, and other key demographic factors?
- What are the sources of information most important for political news (e.g., people, institutions, technology tools)?
- How does the nature of these sources affect their spread and influence in your community (e.g., information in newspapers travels much slower than on Facebook)?
- What information sources seem to matter to your core audiences?
Questions about your threats:
- Who are the distributors (i.e., who shares the posts that go viral) that affect your work or your organization? Are there specific Facebook or messaging groups that are particularly active?
- Who are the likely creators (i.e., who develops the content that goes viral) of false claims that affect your work or your organization? This refers both to individuals and organizations that may be propagating such claims.
- Do you have any hypotheses about how they disseminate their information and messages?
- What are their motivations?
2. Who Creates Disinformation? Why?
Disinformation researchers cite two primary actors that create and disseminate disinformation content:
- State or state-aligned groups and political actors with political goals. In the Philippines, the president’s office has built a propaganda machine, in the form of fake accounts and bot networks, that disparages organizations and journalists and disseminates narratives with specific political goals.
- Non-state actors, such as terrorist organizations, extremist groups, political parties, and corporate actors. During the migrant crisis in the Mediterranean, anti-immigration news outlets published a number of false stories claiming that a large international NGO was working with human traffickers as part of its search and rescue program. Although the stories were false, the NGO was forced to divert valuable resources to fighting the accusations. These groups have political aims: to recruit supporters, create confusion, or disparage groups who oppose them. Be careful to distinguish groups with politically motivated goals from individuals and groups motivated by economic incentives to create and disseminate false information.
Economically motivated actors, by contrast, have identified methods to earn a living by creating and disseminating false information; they may also support state and non-state actors in achieving their political goals. Reports of Macedonian teenagers building content farms that targeted audiences in the United States showed how these cottage industries generate revenue from the creation and dissemination of false information. On the political side, propagators aim to sow confusion or discontent among targeted communities. In Myanmar, for example, Facebook has repeatedly been flooded after major terrorist attacks with doctored photos and false information about the attacks from outside sources.
3. What are they Saying? Where is it Appearing?
Disinformation spreads across the Internet through websites; social media platforms, such as Facebook and Twitter; and messaging applications, such as WhatsApp, Facebook Messenger, Telegram, LINE, Viber, and Instagram. However, the medium used to distribute disinformation will vary depending on how actors seek to reach their intended audiences.
Commonly cited areas where disinformation has appeared include the following:
- Websites
- Facebook pages
- Messages through Facebook Messenger, WhatsApp, Viber, Telegram, LINE, Instagram, and others
- Posts in public or private Facebook groups
- Comments on highly visible news pages
- Traditional newspapers, radio, and television
Organizations may consider developing a system to record and log problematic posts, photos, or text content in a spreadsheet as they occur, and to share these materials with other groups who are experiencing attacks or observing worrisome trends. By aggregating this information, research partners may be able to identify the sources and networks driving the spread of disinformation.
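As a starting point, the logging step above can be sketched as a small script that appends each incident to a shared CSV spreadsheet. The field names here are illustrative assumptions, not from the toolkit; adapt them to your organization's context and be mindful of the data-collection risks noted earlier.

```python
# Illustrative sketch of a disinformation incident log kept as a CSV
# spreadsheet. Field names are hypothetical; adjust to your context.
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("disinfo_log.csv")
FIELDS = ["logged_at", "platform", "url", "content_type", "summary"]

def log_incident(platform, url, content_type, summary):
    """Append one problematic post, photo, or text item to the shared log."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()  # write the column headers only once
        writer.writerow({
            "logged_at": datetime.now(timezone.utc).isoformat(),
            "platform": platform,
            "url": url,
            "content_type": content_type,
            "summary": summary,
        })

# Example entry (hypothetical):
log_incident("Facebook", "https://example.org/post/123", "photo",
             "Doctored image claiming staff aided traffickers")
```

A plain CSV file keeps the log readable in any spreadsheet program, which makes it easy to share with partner organizations or research groups.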
4. Decide Whether and How to Take Further Action
Have discussions with your communications and security teams to determine whether actions need to be taken to counter disinformation. Depending on the circumstances and your organization’s goals, the following options may be appropriate for your response to disinformation attacks:
See page 21 of the Disinformation Toolkit for the full list of response options.
5. Confer with Like-Minded NGOs and Organizations
Identify and coordinate with partners who share the same vulnerabilities. Consider joining the country-level NGO forum or a relevant country-focused InterAction Working Group if your organization is not already a member. There is significant value in identifying and working with like-minded organizations to discuss vulnerabilities and attacks when they occur.
While disinformation attacks sometimes target specific individuals and institutions, they are often aimed more broadly at civil society organizations, national NGOs, or international NGOs in humanitarian settings. Given these shared interests, it is typically safer and more effective to join forces: pool resources, undertake collective analysis, develop joint strategies, and plan actions together. Speaking collectively to U.N. agencies and other international organizations, which may have access to more resources, can also strengthen a collective response strategy.
As an example of collective response, InterAction’s Together Project has developed a space for Muslim interest foundations in the U.S. to find allies who can carry important messages to different constituencies, including larger interfaith coalitions. These relationships have allowed the alliance to strategically deploy surrogates to promote positive messages at the local level (whether commemorating a holiday, supporting disaster response, or sharing content around significant political events) and to members of Congress when advocating for specific issues. Working together as a network and addressing the problem collectively has been an essential part of sharing insights and brainstorming solutions.
If your organization has experienced a disinformation attack with international media coverage, you may also consider the following actions:
- Archive social media content. If this is an area of increased vulnerability for you, consider connecting with open source investigation labs or media organizations that focus on archiving social media information (see recommendations in the Databases of Disinformation Partners, Initiatives and Tools section of this document).
- Discuss the event with partners and donors to ensure you can “pre-bunk” the narrative before they hear it from other sources. Examine what happened to you and your colleagues with critical stakeholders, including your partners and donors.
- Contact vendors. Disinformation is also an urgent issue for technology platforms to address. If you encountered problems when engaging with the platforms directly to request removal of content, inform your organization’s policy contact.
- Conduct a formal, after-event assessment. Discuss how you would have handled the event differently or resources that you wish you would have had. Examine and assess the experience and work across your organization to establish protocols to prepare you and others in your organization for the next event.
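The archiving step in the list above can be illustrated with a minimal local sketch. This is a hypothetical layout, not the toolkit's method or a substitute for the dedicated archiving services it recommends: it simply stores the captured text of a post alongside a UTC timestamp and a SHA-256 hash, so the record's integrity can be checked later when sharing it with investigators.

```python
# Minimal local-archiving sketch (hypothetical layout): save captured post
# text with a timestamp and a SHA-256 hash for later integrity checks.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

ARCHIVE_DIR = Path("disinfo_archive")

def archive_post(post_url, captured_text):
    """Save one captured post and return the path of the archive record."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    digest = hashlib.sha256(captured_text.encode("utf-8")).hexdigest()
    record = {
        "url": post_url,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "sha256": digest,  # fingerprint of the captured text
        "text": captured_text,
    }
    path = ARCHIVE_DIR / f"{digest[:16]}.json"
    path.write_text(json.dumps(record, indent=2), encoding="utf-8")
    return path

# Example record (hypothetical URL and content):
saved = archive_post("https://example.org/post/123",
                     "False claim that NGO staff aided traffickers")
```

Storing the hash separately from the platform means that even if the original post is later deleted or altered, you retain verifiable evidence of what was captured and when.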
Risk Assessment Tool
Specific factors make media more prone to abuse in areas undergoing a major transition or conflict. Assessing the presence of these factors can help you and your colleagues determine how vulnerable media might be to abuse by state and non-state actors.
InterAction’s Risk Assessment Tool helps you assess the level of vulnerability through rating:
- Social media use and access
- Traditional media institutions
- Journalists and media professionals
- Government institutions
- Civil society
- Dangerous content
See pages 23–25 of the Disinformation Toolkit.
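To show how ratings across the six categories might be combined, here is a small illustrative sketch. The 1–5 scale and the risk thresholds are assumptions made for this example, not InterAction's actual scoring method; consult the toolkit itself for the real tool.

```python
# Hypothetical aggregation of the six Risk Assessment Tool categories.
# The 1-5 scale and thresholds are illustrative assumptions, not
# InterAction's actual scoring method.

CATEGORIES = [
    "social_media_use_and_access",
    "traditional_media_institutions",
    "journalists_and_media_professionals",
    "government_institutions",
    "civil_society",
    "dangerous_content",
]

def vulnerability_level(ratings):
    """Average 1 (low risk) to 5 (high risk) ratings into a coarse level."""
    missing = [c for c in CATEGORIES if c not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    average = sum(ratings[c] for c in CATEGORIES) / len(CATEGORIES)
    if average >= 4:
        return "high"
    if average >= 2.5:
        return "medium"
    return "low"

# Example: moderate scores overall, but dangerous content rated high.
example = {c: 3 for c in CATEGORIES}
example["dangerous_content"] = 5
```

Requiring a rating for every category before producing a result helps ensure the discussion covers the full media environment rather than only the most visible risks.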
Access Full Report
Explore Further
- Five Key Takeaways from InterAction’s Disinformation Toolkit 2.0, InterAction
- Disinformation 101
- Disinformation vs Misinformation: Definitions & Types
- Disinformation and 7 Common Forms of Information Disorder
- Skill Up: Learn to Identify Disinformation with Games and Courses
- How To: Dealing with Disinformation
- Countering Disinformation Collection