As UK politicians prepare their report on ‘fake news’, they could do worse than take lessons from the City. The media phenomenon, popularised by the US presidential election, of using false information to manipulate sentiment came as a shock to much of the population, but is less alien to market observers.
For decades the guardians of companies’ reputations have seen such tactics play out as micro traders and short sellers are able to secure quick returns from short-term share price moves.
Perhaps the CEO is rumoured to be leaving, an acquisition in Asia has governance concerns, or there’s a union dispute. Suddenly confidence is knocked and it’s a good time to check out.
Wrong? Yes. Illegal? Yes. Effectively policed by regulators? No. Corporate and public life can attest that, as quality journalism is pared back and media organisations continue to blur the line between commercial partners and editorial functions, one faintly compelling source is enough to get a story up in pixels. These tactics are growing more effective, just as a shrill, much-shared Facebook post can undermine a presidential candidate.
So to my point on effective regulation and how it helps with fake news. The deterrent for being proved to have distorted a market is potentially severe; market participants are regulated, and spreading misinformation to make a profit is among the more heinous offences. Yet, for regulators, it is also one of the toughest to prove. Where is the line between an activist investor making an argument about a company's practices to provoke debate and someone distorting facts to move the share price?
Here lies the challenge the Culture, Media and Sport select committee faces in analysing fake news: it is fake only once the facts have been investigated and, as with all legal penalties for misinformation – be that data offences or defamation – punishment is retrospective and can lag years behind the offence.
Yet that doesn’t quite render such penalties ineffective. If the punishment is severe and foreseeable enough then, just like sanctions for manipulating markets, it serves as an effective deterrent.
That moves the question to who is being policed and how. In opening its consultation, the committee’s chair, Damian Collins MP, hinted it had already selected an answer: that tech firms bear some responsibility for fake news on social media.
As with protecting data rights through the European courts, search engines are an easy answer. They have corporate registrations, offices in jurisdictions where proceedings can conveniently be served and, most importantly, money.
Yet if we take one lesson from the markets, the only way to root out tactical misinformation is for those who instigate it and those who benefit from it to face serious sanction.
The chances of effective global news regulation and proactive management of content by search firms are slim. The burden will fall on the targets themselves: to be ready to parry and quickly disprove the attacks they face, and to police their own reputations.
If politicians want to help, they should look at the speed and availability of the weapons that let them do so.
Chris Scott is a partner at Schillings