CovidCheck report highlights inconsistencies in platforms’ response to disinformation
A new report published by DCU FuJo and the Broadcasting Authority of Ireland examined how digital platforms implemented the EU’s monitoring programme for COVID-19 disinformation. The researchers found that the reports submitted by platforms were highly repetitive, often irrelevant, and generally failed to provide the data that was requested. The report argues that more robust procedures for reporting and monitoring online disinformation need to be developed if the EU Code of Practice on Disinformation is to become a more effective tool in fighting disinformation.

Facebook, Google, Microsoft, Mozilla, TikTok and Twitter are all signatories to the self-regulatory Code of Practice on Disinformation. CovidCheck is the third monitoring report commissioned by the BAI, and prepared by FuJo, on the implementation of the Code in Ireland. This report analyses the 47 monthly transparency reports submitted by the signatories to the Code between August 2020 and April 2021 in response to the European Commission’s June 2020 communication on tackling COVID-19 disinformation. This analysis is supplemented by Irish case studies focused on Facebook and TikTok, and a review of the signatories’ transparency regarding the use of AI and automation in fighting COVID-19 disinformation.

The research found that while the Code has proven a useful instrument in prompting signatories to respond to concerns about disinformation, there are shortcomings in relation to its implementation and scope. Researchers cited difficulties in assessing the timeliness, completeness and impact of the actions undertaken by the signatories. The report sets out nine recommendations for more effective reporting and monitoring of disinformation:

  • Reporting be standardised, as far as possible, to ensure necessary and relevant information is provided in a manner that facilitates monitoring.
  • Signatories provide clear definitions of relevant policies to combat disinformation and of common terms, and explain how these terms are operationalised on their services.
  • Relevant stakeholders introduce a framework to address disinformation in comments that is consistent with Article 10 of the European Convention on Human Rights and the principle of freedom of opinion.
  • Regarding platforms’ active users, clear parameters are defined for the reporting of granular data about specific action areas and in relation to EU Member States.
  • Meaningful KPIs (key performance indicators) are defined for the reporting of results and outcomes in relation to key areas including: content labels, content and account removals, factchecking and media literacy campaigns. In addition, signatories should report on their own efforts to measure the efficacy of these actions and provide data to independent researchers to verify that efficacy.
  • The commitment to an independent auditor be implemented under the revised Code and that the signatories provide adequate funding and resources to support this position.
  • In order to verify the implementation of actions, standardised procedures are agreed for future monitoring.
  • Signatories report on their use of automated systems to combat disinformation, including an explanation of what systems are used, what languages are covered, what kinds of disinformation they are trained to detect, and what risk assessments have been conducted on the AI systems used to tackle disinformation. It is also recommended that the European Commission articulate the need for risk assessments related to disinformation in the strengthened Code.
  • Signatories embrace the need for transparency and data-sharing with researchers, and expand and improve services that allow researchers to access data. It is also recommended that the European Commission create a clear regulatory framework for accessing data for research on disinformation and expand the scope of its current proposal to include more stakeholders, including members of civil society organisations.

The research for this report is part of a larger project implemented by the European Regulators Group for Audiovisual Media Services (ERGA) that is designed to assist the European Commission in monitoring the effectiveness of the Code. The BAI chaired the ERGA sub-group in 2019 and continues to play an active role in the group. CovidCheck follows two previous reports researched by FuJo: ElectCheck 2019 and CodeCheck 2020.

CovidCheck was written by Eileen Culloty, Kirsty Park, Charis Paevangelou, Trudy Feenane, Alex Conroy, and Jane Suiter.