Tom Johnson, co-chief executive, H/Advisors
The intersection of social media and AI poses a formidable threat to democracies around the globe as we enter a pivotal election year in 2024. It’s a threat that governments, institutions and political and corporate leaders must all play a role in helping to prevent, warns Tom Johnson, co-chief executive of H/Advisors – a leading global strategic advisory network of corporate and financial communications consultancies.
Here, Johnson highlights the potential for misinformation, fuelled by rapidly evolving technologies, to sway public opinion and exacerbate the rise of extremism in the crucial 2024 elections. As we stand at the crossroads of technological advancement and democratic integrity, Johnson urges readers to contemplate the implications of this digital era on electoral processes and the very fabric of our society.
What impact will social media and AI have on electoral votes in 2024?
Social media is having an increasingly powerful impact on public perceptions and opinions. In the business world, we’ve learned that social media and digital tools can be just as impactful in shaping views as more traditional communications channels, with the possible exception of direct one-on-one conversations.
Rapidly advancing AI technology can create a narrative and push it out to people in seconds. You could argue there is some societal benefit to such tools if they are used the right way, but they can also spread misinformation at the same speed and scale.
From a political perspective, we have over 40% of the world’s population going to the polls in 2024, and these elections are going to be incredibly consequential as to where we, as a global society, go over the next decade. The potential influence of misinformation is terrifying.
What are the risks here for democracy?
Several studies have shown that electorates are increasingly gravitating to their side of the political spectrum. Most are still roughly in the middle of that spectrum, but they are increasingly tuning out the other side. So, if you assume that people are only listening to half of the news, and that news is amplified in ways that are more one-sided, you already have an uninformed electorate. Then if you add untrue information on top of that, it becomes much harder to distinguish between fact and fiction. And it becomes harder for voters to understand what the issues are and what politicians will do about them.
At the same time, people with extreme views can get much more attention by using digital means that simply didn’t exist before. These views can then be amplified and become widespread before anyone can put a check on them. For politicians, there’s a risk they spend as much time trying to disprove negatives as they do trying to articulate a vision about something. That could be a real struggle.
Have social media and AI eclipsed the role of traditional media?
I don’t think they’ve eclipsed it yet, but it’s getting close. We still have a number of well-sourced news organisations that are trying to sift through news and misinformation, and put the right information out there. But there are fewer reporters doing it, and the extreme voices on both sides actively work to discredit the traditional news media. Their influence with the public is less than it used to be, although people who seek out these platforms can still get a balanced and accurate view of what’s happening in the world.
What can our society and democratic institutions do to adapt to this new world?
There has to be a unified push by governments, institutions and politicians to fight the plethora of misinformation that exists, for the betterment of society. Those who create the laws and shape perceptions need to make sure that they are combating misinformation and the platforms that promote this, and support organisations that will provide a balanced and nuanced view of what’s happening in covering these elections.
But the onus is also on consumers of that news to seek out correct information. As a society, we are less inclined to open ourselves up to understanding different points of view. Too many people focus only on content they think reflects their political point of view, rather than attempting to understand other viewpoints so they can make educated choices. Social media enables that to an extent because you can tune out everything else. We all owe it to society to open our minds to all points of view, and to support those we believe will continue to sustain democracy.
Can these democratic institutions also use social media to their advantage?
Governments, institutions and businesses all have to embrace where technology is going as part of their communication platforms. AI can collect and collate a vast amount of information quickly, but you still need to verify it and then include all the necessary nuances. We must all be well educated on both what AI is capable of and how it can be manipulated so we know what to look out for.
How do you see this playing out over the year ahead?
There is no way of avoiding the impact digital platforms are going to have on how news is distributed. We are facing critical elections and the world could get reshaped dramatically over the next couple of years as a result. The real threat is that misinformation will infiltrate people's views and shape their perceptions. It’s vital that we’re able to guard against that.
Tom Johnson is co-chief executive of H/Advisors. Protecting democracy in the era of artificial intelligence, social media and misinformation is the subject of the H/Advisors Breakfast Debate at Davos this year, taking place in January 2024. If you would like to attend or find out more information, please email firstname.lastname@example.org