The writer is director of the Reuters Institute for the Study of Journalism
This is set to be a big election year, including in India, Mexico, the US, and probably the UK. People will rightly be on their guard for misinformation, but much of the policy discussion on the topic ignores the most important source: members of the political elite.
As a social scientist working on political communication, I have spent years in these debates — which continue to be remarkably disconnected from what we know from research. Academic findings repeatedly underline the actual impact of politics, while policy documents focus persistently on the possible impact of new technologies.
Most recently, Britain’s National Cyber Security Centre (NCSC) has warned that “AI-created hyper-realistic bots will make the spread of disinformation easier and the manipulation of media for use in deepfake campaigns will likely become more advanced”. Warnings from many other public authorities strike a similar note, while ignoring misinformation from the most senior levels of domestic politics. In the US, the Washington Post stopped counting after documenting at least 30,573 false or misleading claims made by Donald Trump as president. In the UK, the non-profit Full Fact has reported that as many as 50 MPs, including two prime ministers, cabinet ministers and shadow cabinet ministers, failed to correct false, unevidenced or misleading claims in 2022 alone, despite repeated calls to do so.
These are actual problems of misinformation, and the phenomenon is not new. Both George W Bush and Barack Obama’s administrations obfuscated on Afghanistan. Bush’s government and that of his UK counterpart Tony Blair advanced false and misleading claims in the run-up to the Iraq war. Prominent politicians have, over the years, denied the reality of human-induced climate change, proposed quack remedies for Covid-19, and so much more. These are examples of misinformation, and, at their most egregious, of disinformation — defined as spreading false or misleading information for political advantage or profit.
This basic point is strikingly absent from many policy documents: the NCSC report, for example, has nothing to say about domestic politics. It is not alone. Take the US Surgeon General’s 2021 advisory on confronting health misinformation, which calls for a “whole-of-society” approach yet contains nothing on politicians and curiously omits the many misleading claims made by the then president during the pandemic, including his touting of hydroxychloroquine as a potential treatment.
This oversight is problematic because misinformation coming from the top is likely to have a far greater impact than that from most other sources, whether social media posts by ordinary people, hostile actors, or commercial scammers. People pay more attention to what prominent politicians say, and supporters of those politicians are more inclined to believe it and act on it.
We know this from years of research. Millions of Americans believed there was systematic voter fraud in the 2020 elections, that weapons of mass destruction were found in Iraq, that human activity played little role in climate change, and that the risks and side effects of Covid-19 vaccines outweighed the health benefits. What all these misleading beliefs have in common is that they were systematically advanced by political actors, in these cases by the right in the US. The pattern is not confined to one side: in Mexico, for example, plenty of misinformation comes from the left.
Meanwhile, the policy discussion remains bogged down in how to police AI-generated content, distracting us from the way some politicians, perhaps conscious of how tech companies eventually blocked Trump in the dying days of his presidency, are pushing for legal exemptions from content moderation.
Of course there will be examples of AI-generated misinformation, bots, and deepfakes during various elections next year. But the key question is how politicians will be using these tools. A pro-Ron DeSantis political action committee has already used an AI version of Trump’s voice in a campaign ad. This is not some unnamed “malicious actor”, but a team working on behalf of the governor of a state with a population larger than all but five EU member states. We have seen examples of similar activity in elections in Argentina and New Zealand too.
When it comes to the most serious misinformation, the calls tend to come from inside the house. Technology will not change that, so let’s stop gaslighting the public and admit clearly as we head into a big election year that misinformation often comes from the top.