Disinformation is corrosive: businesses must understand the risk and inoculate

Disinformation was once the near-exclusive province of nation-states, but not any more.

Disinformation breeds a calamitous mistrust in science and institutions, argues Hugh Taggart

Disinformation campaigns are an escalating problem for corporations, driving a new class of security threat – one that is not just technical, but psychological.

There is no shortage of case studies to point to, from the unfounded allegations that Wayfair was involved in child trafficking, to the stabbing of a BT Group employee due to conspiracy theories that wireless 5G networks were being used to spread COVID-19.

For business, this is a clarion call to take the lead – not simply to protect their own reputations, but to steward and uphold an increasingly threatened information environment.

Disinformation breeds distrust and harms society as a whole. When catastrophe strikes, it is society’s faith in traditional institutions that empowers us to persevere. Disinformation corrodes that public trust, leaving us in a calamitous mistrust of science in the midst of a global pandemic.

Clearly this issue is larger than any single institution. However, business is increasingly viewed as having an ethical responsibility to help solve society’s problems, and combating disinformation is part of that. To be effective, the private sector must deploy tools and strategies as flexible and inventive as those of the bad actors it seeks to counter.

Here’s what they can do:

Understand the threat

First, identify and unmask the operations and motives behind these campaigns: understand what they are seeking to achieve and how the disinformation is likely to spread. Bad actors convene in the dark corners of the web, on fringe and encrypted platforms, to workshop potential targets and disinformation narratives. They target communities that are already predisposed to believe the elements of the campaign. Once their disinformation reaches mainstream social media feeds, bad actors know that the behavioural-science phenomenon known as the “illusory truth effect” will kick in: when people see disinformation repeated and shared by multiple users, they become more likely to believe there is some truth to it.

Avoid 'debunking' tools

Many tools promise to “debunk” disinformation, but their effect is limited. Academic studies show that once a user has been exposed to disinformation, it is virtually impossible to dislodge it from their mind. The debunking tools currently marketed to the private sector therefore offer little real protection.

Inoculate instead

Experts in behavioural science have built an impressive body of evidence that users can be successfully “inoculated” against disinformation. Inoculation presents users with intervention messaging before the disinformation reaches their newsfeeds, and this method has been shown to trigger protective responses such as enhanced critical thinking. To achieve it, companies must deploy intervention messaging that empowers their audience to make informed decisions about whether information is genuine or manipulated.

However, to truly succeed, this effort must be the beginning of a larger collaboration.

The threat is too great to confront in silos. It is imperative that business works closely with the academic community and the public sector to address this crisis. The private sector has proved its ability to meet new challenges, such as the rapid development of COVID-19 vaccines. The business case for defending against disinformation attacks is clear, but the moral imperative must be the North Star.

Hugh Taggart is co-chief executive, Edelman UK, and global crisis chair
