A comprehensive look at the state of PR seems to reveal a profession that isn’t too alarmed about the threat of fake news even though it has an existential interest in trusted information.
Data from the North American Communication Monitor, a survey sponsored by the Plank Center for Leadership in Public Relations at the University of Alabama, revealed that 21% of all enterprises have been affected by fake news and fully 36% report that fake information traveled to their employees on company intranets.
Yet the majority of the 1,020 PR pros surveyed have no plans to adjust. Nearly six in ten (59.7%) call fake news a serious threat, but only 13% are doing anything to detect fake information.
Some 29% admit they have no plans to address the threat and only 19% think the issue has any bearing on their work.
And in what can only be interpreted as a profoundly quaint insight into current thinking, 42% seem to think personal experience is sufficient to detect what is fake.
That’s roughly the same mindset that once allowed business leaders to trust intuition over data and analytics. That time has passed, and so has the time to sit on the sidelines waiting for someone else to address fake news.
These North American findings are consistent with previous Communication Monitor studies conducted in Europe and Latin America, where only 12% of PR professionals saw fake news as an important issue for communication management. Just 12% of PR pros in Europe and 7% in Latin America were using advanced approaches to deal with fake news.
There’s a troubling gap here, one the profession has a vested interest in addressing. If information can’t be trusted, what are the implications for the institutions that provide it — business, government, media — the same clients that pay PR pros to advance their brands and the reputations of their C-suite leaders?
These studies reveal a collective passivity across the profession that’s hard to explain. It’s time for PR leaders to stop paying fake attention to fake news.
At the geopolitical level, the facts are no longer in dispute. Russia infiltrated the 2016 electoral process in the U.S., that of the U.K. pre-Brexit, and elections in Brazil, France and Turkey. In the U.S., no votes were changed, but opinions and attitudes undoubtedly were, as major social platforms were infected with propaganda.
Facebook estimates that 126 million people in America alone engaged with content generated by Russia’s Internet Research Agency. Twitter found 36,000 Russian bots that posted 1.4 million tweets, viewed nearly 290 million times between September and November 2016.
Recent data from Oxford University confirms that "junk" news is shared on Facebook and Twitter four times as often as content from reputable sources.
In a remarkably brief period, the platforms viewed as powerful new channels for reach, engagement and personalization have become open to the direct delivery of propaganda. In effect, our most trusted sources of information — what comes to us via our personal networks — have become the most serious threat to our ability to discern credible information from fabrications.
Imagine the potential brand damage from a deepfake video of retail executives describing intentional bias in hiring practices, a smartphone maker detailing child labor violations, or an energy company caught describing bribes it paid in emerging nations.
To be sure, the solution will take a village. The tech industry will have to come forward with real solutions. If the technology to deceive exists, so must the technology to detect deception.
But at the same time, all entities with a stake in trusted information can drive change. That begins with a commitment to critical thought – the actual method and discipline of evaluating situations and content. Specifically, PR leaders can:
Educate clients and their own organizations to establish standards and promote responsible information habits.
Validate and read beyond the first page of search results. Cross check sources, validate legitimacy, and seek out instances where misinformation already has been debunked.
Take a deep breath (also called click restraint). We are triggered by the odd and sensational. The government is creating gay frogs; a presidential candidate ran a child sex ring out of the basement of a pizzeria. If the item provokes you, hesitate, inspect and validate before accepting it or sharing it.
Use shame: make disseminating misinformation as socially unacceptable as polluting a waterway, and re-examine our relationships with media. We trade in real news — not the marginal or rewarmed — and certainly nothing that gives a reporter a reason to doubt the news value of what we bring.
Stand up for corporate character. We should advocate for the moral choice, nothing less. Refuse to allow businesses to make convenient, expedient decisions, and proudly advance the responsible, courageous, values-based decisions and directions of our clients and enterprises.
Finally, we should question the strategic business value of social platforms. If platforms like Facebook and Instagram are nice-to-haves that deliver questionable outcomes, consider leaving. That decision will be taken up by the Plank Center board at its summer meeting.
A decision to leave would be largely symbolic and probably won't end election meddling or disarm the conspiracy theorists. But it would make a statement about this serious threat to a free society, about trust in information and institutions, and about choosing to stand on the right side of history.
Mark Harris is a visiting professor of letters at the University of Alabama and former vice president of communications for IBM’s global consulting business. He is a member of the board of advisors at the Plank Center for Leadership in Public Relations.
Dr. Bruce Berger is professor emeritus in public relations at UA, former chairman of the department of advertising and public relations, and founding director of the Plank Center and its director of research. He previously served as global communications leader for Whirlpool, among other leadership communications roles.