Hope for resilient and connected communities

A post-truth future

Combatting disinformation's global spread

OVERVIEW

Disinformation, deepfakes and conspiracy theories are becoming harder to detect and combat as generative AI becomes easier for anyone to use and extended reality becomes more accessible. This digital splintering of truth, alongside diminishing trust in political leaders, makes it increasingly hard to establish what is real and what is false. Some even challenge the notion of objective truth. People’s lived experiences naturally differ, yet imagining shared futures – and collaborating to achieve them – is far harder if we can’t establish a shared baseline of truth. 

SIGNALS

Increasingly powerful, user-friendly generative AI is making it harder to distinguish the fake from the real. Cybersecurity experts doubt that emerging technical countermeasures to deepfakes, such as digital watermarks or detection software, will keep pace, advising instead: “assume nothing, believe no-one, doubt everything.” Even absent deliberate disinformation, the sheer volume of information available and the power of online influencers can muddy the search for facts. Misinformation – inadvertent mistakes or unchecked AI hallucinations – spreads rapidly online and further erodes trust in what we read there.

This digital confusion goes alongside diminishing trust in institutions. Over 60% of people surveyed in 28 countries believe that establishment leaders – in politics, business and journalism – are purposely trying to mislead by saying things they know are false or exaggerated. Even official efforts to counter disinformation are vulnerable to criticism, as shown by the short-lived US Disinformation Governance Board and Elon Musk’s attacks on the Brazilian Supreme Court’s social media regulation. Academic institutions are not immune; scientific publishers have retracted hundreds of fraudulent research papers and even closed journals altogether.

The very notion of establishing an objective truth can be controversial, given the importance of acknowledging people’s lived experiences. Increasing polarisation, and even the growing divergence between young men’s and young women’s perspectives, undermines the idea of common truths. Yet if we want to build resilient and inclusive societies, is there not value in establishing some starting points we can all agree on? Digital tools can help; generative AI is already assisting fact checkers. But those tools are fallible, and digital records themselves are fragile (millions of research papers are at risk of disappearing forever), so human critical thinking will be vital, too.

SO WHAT FOR DEVELOPMENT?

Evidence-based policymaking may become harder and more contentious if it cannot be grounded in a shared baseline of truth. A study of more than 70,000 people worldwide found trust in scientists to be moderately high; 75% agreed that scientific methods are the best way to find out whether something is true (though trust levels vary between countries and are linked to political orientation). Should scientists wield more influence in policymaking?

Deepfakes can be used to manipulate public opinion, further dividing us on critical issues like climate change or inequality. Even the gradual personalisation of online news feeds is exacerbating echo chambers, as AI algorithms curate content based on a user’s past consumption, interests and even location, creating “filter bubbles” where people only see information that confirms their existing beliefs. That online manipulation may partly explain the growing gulf in political views between young women and young men. Such polarisation will make it even harder for future generations to agree on historical facts and learn from the past.
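The curation mechanism described above can be illustrated with a toy sketch (all names, data and scoring logic here are invented for illustration, not any platform’s actual algorithm): when candidate articles are ranked purely by topical overlap with a user’s reading history, the feed drifts toward what the user already consumes.

```python
from collections import Counter

def topic_profile(history):
    """Build a topic-frequency profile from a user's past reads."""
    profile = Counter()
    for article in history:
        profile.update(article["topics"])
    return profile

def rank_feed(candidates, history):
    """Order candidate articles by overlap with the user's profile.

    Higher overlap with past consumption ranks first -- the
    self-reinforcing loop behind "filter bubbles"."""
    profile = topic_profile(history)
    def score(article):
        return sum(profile[t] for t in article["topics"])
    return sorted(candidates, key=score, reverse=True)

# Hypothetical user who has only read climate-sceptic pieces:
history = [
    {"title": "Climate sceptic op-ed", "topics": ["climate-sceptic"]},
    {"title": "Another sceptic piece", "topics": ["climate-sceptic"]},
]
candidates = [
    {"title": "IPCC report summary", "topics": ["climate-science"]},
    {"title": "Sceptic blog post", "topics": ["climate-sceptic"]},
]
feed = rank_feed(candidates, history)
print(feed[0]["title"])  # prints "Sceptic blog post"
```

Because the score depends only on past behaviour, the contrary article never reaches the top of the feed, however relevant it is; real recommender systems are far more sophisticated, but the reinforcing dynamic is the same.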

The very preservation of knowledge cannot be taken for granted. Digital archives are vulnerable; 176 open access research journals have already vanished from the internet. As AI plays an increasingly important role in structuring and organising information, its judgments will influence what becomes public knowledge. That may erode future generations’ authority to decide what is true or worthy of attention.

Meanwhile, preserving and strengthening the human capacity for critical thinking can help insulate us against mis- and disinformation and keep us alert to the value of knowledge and truth. Media literacy – becoming mandatory in many US schools – can improve detection of disinformation. An experiment showed that adding “trust” and “distrust” buttons to social media could curb misinformation by incentivising people to share what they trust.