If this is the era of fake news, then it’s likely to be the calm before the storm. The true tempest will begin to howl once the age of deepfakes commences.

Should uncannily faked video content coincide with a new pandemic or intensified cyber warfare, we’ll be in line for a more monstrous kind of misinformation. The EU has already proposed regulation of artificial intelligence (AI) that would require organisations to disclose any deepfake usage and creation.

No doubt this has been spurred on by recent events. As a GlobalData report on misinformation states, the Covid-19 pandemic has been “fertile ground for fake news and the exploitation of public fear.” Social media platforms have struggled to contain the spread of hoaxes, conspiracy theories and quack cures, with some reviewing their ad policies to forbid the promotion of medical misinformation.

Despite these efforts, they have found themselves being grilled by politicians over their failure to prevent the spread of misinformation on their platforms. The debate grew particularly contentious following the deadly riot at the US Capitol on 6 January, after a motley mob of Donald Trump supporters, QAnon conspiracists and anti-maskers was egged on by the false claim that the US presidential election had been stolen.

However, most problems aren’t caused by malicious “maskholes”, but by people impulsively sharing information without thinking critically about it.

This poses some questions. Can audiences be educated to think more before posting? How can fake news be properly regulated? Verdict finds out with Henry Brown, director of data & analytics consulting at Ciklum; Jared Ficklin, chief creative technologist at argodesign; Andy Parsons, director of the Content Authenticity Initiative at Adobe; Andy Patel, researcher at the AI Center of Excellence at F-Secure; and Rachel Roumeliotis, vice president of data and AI at O’Reilly.

Through separate discussions with these experts, we investigate whether AI has any part to play in proceedings, and how much of the issue social media platforms can realistically solve.