What’s next for the EU Code of Practice on Disinformation?

13 May 2020

Two recent reports may shed some light on the future of the EU Code of Practice on Disinformation after a year of monitoring. So where does the future lie for the Code and for the platforms that are signatories to it?

The reports point to some likely next steps. First, the Code is a voluntary document underpinned by self-regulation; that time may be up, and co-regulation, at the very least, may be on the way. Second, the 13 signatories do not cover all stakeholders, and the Code may be expanded to include, for example, TikTok and WhatsApp. Third, there is a case for far greater provision of data to research institutions.

But first, some background. In October 2018, representatives of online platforms such as Facebook, Twitter and Google agreed to implement the EU Code of Practice on Disinformation. The Code is a self-regulatory set of standards consisting of 15 commitments organised under five pillars:

A. Scrutiny of ad placements
B. Transparency of political advertising and issue-based advertising
C. Integrity of services
D. Empowering consumers
E. Empowering the research community

While this was a positive first step and highlighted a willingness from industry to cooperate in tackling disinformation, the self-regulatory nature of the Code leaves its success dependent on the platforms voluntarily following through on their commitments. The European Regulators Group for Audiovisual Media Services (ERGA) has assisted the European Commission in assessing the effectiveness of the Code by monitoring its implementation across Europe.

FuJo has been involved in this work in Ireland, partnering with the Broadcasting Authority of Ireland (BAI), Ireland’s National Regulatory Authority (NRA), to produce the ElectCheck and CodeCheck reports. Both reports identified areas that required improvement. ElectCheck highlighted that while platforms made public repositories of political adverts available during the 2019 European elections, the data provided was limited, particularly on targeting and spending. The CodeCheck report assessed the implementation of pillars D and E in Ireland by Google, Facebook, Microsoft and Twitter and found that while progress had been made, there were significant inconsistencies between the platforms, with more action needed by all, particularly on the labelling of trustworthy content.

Last week, ERGA published a report assessing the implementation of the Code, based on work such as ElectCheck and CodeCheck completed by NRAs across Europe. As the report makes very clear, on the credit side, the Code of Practice is “a unique and innovative tool in the fight against online disinformation”. It represents a partnership between the platforms, EU institutions and NRAs that is based not simply on legal requirements but on a shared desire to combat the very real problems associated with disinformation.

Additionally, although the Covid-19 pandemic fell outside the assessment period, the report does note positive actions taken by various platforms to address disinformation related to Covid-19, again of each platform’s own volition.

However, as a second EC report from the Directorate-General for Communications Networks, Content and Technology noted, the Code simply does not have a high enough public profile to put sufficient pressure on platforms to change.

Some of these issues relate to the generic nature of the commitments, and these can be partially addressed through specific recommendations, such as drafting consistent terminology and definitions for terms like ‘issue-based advertising’, or setting specific reporting requirements, such as requiring platforms to provide datasets to NRAs every six months.

The other major barriers are the Code’s non-compulsory nature, the difficulty of verifying its implementation and the lack of enforcement measures. The major conclusion of the ERGA report is that for the Code to be truly effective, it must move from a self-regulatory model to something with more oversight capacity, such as co-regulation or a more conventional regulatory approach.

The ERGA report focuses on co-regulation as the next step, which would be a reasonable evolution from self-regulation: it retains the co-operative approach but adds legislative oversight in the form of sanctions and redress mechanisms, perhaps through the anticipated Digital Services Act.

Additionally, the scope of the Code will always be limited by the number of signatories under a self-regulatory or co-regulatory approach, a relevant concern when major platforms such as TikTok, WhatsApp and Messenger are not currently signatories.

Legislating around disinformation has always been contentious and fraught with difficulty, but there may be an eventual move to co-regulation and, later, to something more conventional. It is also vital to note, as the EC report does, that researchers’ access to data remains limited and many databases are not user friendly.
