Carole Cadwalladr leads DTX discussion on policing misinformation

The exposé of the Facebook and Cambridge Analytica scandal shook the world in 2018. But little has changed in the four years since.

Cybersecurity expert Lisa Forte and Carole Cadwalladr, the investigative journalist who broke the Facebook/Cambridge Analytica story in The Observer newspaper, spoke at the Digital Transformation Expo (DTX) in London in October. The discussion focused, among other things, on the challenges journalists face when confronted with misinformation.

Cadwalladr launched the Real Facebook Oversight Board in 2020 as “an emergency response to the US elections”. The initiative calls Facebook (Meta) the “world’s disinformation machine” and demands that Meta:

  • fix its algorithm and up-rank credible and trusted news;
  • fully invest in non-English language content moderation;
  • give access back to banned researchers to monitor content around global elections.

Challenge for newsrooms

With many stories published by global newsrooms gathered from local news outlets, fact-checking and source verification have become a minefield to navigate. Local newsrooms are targeted with misinformation presented as fact. This is then picked up by national news outlets, and finally by international outlets, before being spread on social media.

With a dearth of moderators, Meta, formerly Facebook, allows most information on its platform to spread unchecked and unverified, say campaigners. Estimates of the number of moderators working on the platform range from 15,000 to 30,000.

Influencer power

Cadwalladr pointed out that Generation Z, those born between the mid-1990s and late-2000s, have become somewhat complacent about misinformation. She also spoke about the power of super-influencers such as Elon Musk and the detrimental impact they can have. With 109m followers, Musk’s account is currently the third most followed on Twitter and wields considerable influence over platform users.

A 2020 poll by Common Sense Media of 13- to 18-year-olds showed that 77% get news and headlines from social media, while 39% “often” get news from personalities, influencers, and celebrities on social media, the most popular platform being YouTube.

Different countries’ approaches

Meta, Amazon, and Google all have their EMEA headquarters in Dublin. Members of the European Parliament complained in October that these tech giants lobbied the Irish parliament for favourable legislation on tech, while Ireland’s Data Protection Commission (DPC) denied in 2021 that it lobbied in the interests of Meta.

Germany is one country that has put in place anti-hate laws to combat misinformation. As a result, Cadwalladr said, one in six Meta moderators works on German-language content, despite this accounting for just a fraction of the site’s content.

While most governments are to some extent complicit, Australia and Canada have both reportedly taken steps to crack down on media monopolies.

Who guards the guardians?

A member of the audience asked a very pertinent question – who should police those who moderate the platform, and what rules should they have to abide by? It’s a tricky question, Cadwalladr admitted, and one that for now remains open-ended.

The speakers stressed that we need to think about how to safeguard privacy in Web3 and the Metaverse. The US midterm election in November will once again throw the issue of Meta and misinformation into the spotlight.

This is not a complete reproduction of what was said at this conference – it is an edited version based on the reporter’s understanding of what was relayed. This content has not been approved/endorsed by the speakers.