Brussels, 16 June 2022: Big Tech has seven months to prove the new EU Code of Practice on Disinformation presented today is a “new dawn” in the fight against disinformation. A group including civil society actors, fact-checkers, source-raters and anti-disinformation companies joined the EU Code with their own commitments after being part of the effort to draft it, to ensure it has a chance to become an effective tool against disinformation.
This group cautions that a new Code will not necessarily lead to substantial changes in the way Big Tech companies act, so it will remain vigilant to ensure that online platforms deliver on their promises.
The framework to tackle disinformation set out in the Code is a global first and includes innovative measures such as the creation of a permanent, multi-stakeholder institution (the Permanent Taskforce) to monitor the Code, quantitatively evaluate the performance of each platform and improve the Code over time. It also includes detailed commitments covering the algorithms that recommend and accelerate the spread of disinformation; the promotion of fact-based information and fair funding for fact-checking organisations; and the creation and funding of an independent body to grant researchers access to platform data.
The group welcomes the Code, but warns that it is too soon to celebrate. The platforms now have seven months (until January 2023) to show that the new actions they are taking under the Code are enough to be considered ‘risk mitigation’ under the Digital Services Act – January 2023 is also the deadline for platforms to submit their first set of performance reports to the EU Commission. Only then, the group said, will an accurate assessment of how seriously platforms are fighting disinformation be possible.
If the major platforms fail to deliver, the group will call on the European Commission to make clear that participation in the Code of Practice alone is not sufficient to comply with DSA obligations. In that case, platforms could face fines of up to 6% of their turnover. To ensure close scrutiny of the platforms, the group – consisting of Avaaz, Demagog, Faktograf, Globsec, Maldita, Newsback, NewsGuard, Pagella Politica, Who Targets Me, and VOST Europe – has decided to join the Code with its own commitments.
They will also be part of the permanent taskforce that monitors the Code’s implementation as it evolves into a Code of Conduct under the DSA.
Some of the group’s main concerns about the Code:
- No real consequences for not signing up or failing to deliver until the DSA applies – Until the DSA becomes enforceable for Very Large Online Platforms (likely in spring 2023), the Code remains a self-regulatory tool. Although big platforms are expected to sign up to all commitments relevant to them, there are no consequences if they do not, or if they fail to deliver. In addition, some companies that are not considered Very Large Online Platforms under the DSA but play a strong role in disseminating disinformation are either absent or have declined to sign commitments that are highly relevant to their activities.
- A missed opportunity to enforce universal debunking of disinformation – despite the clear science on debunking, most major platforms have committed only to test, but not fully implement, retroactive warnings that inform users who have been exposed to disinformation. They agree to report the results back to the taskforce, but it must be ensured that such tests are conducted with a transparent methodology and full scrutiny from experts.
- Large platforms refuse to empower their users with trustworthiness indicators – almost all large platforms rejected the European Commission’s encouragement to immediately empower their users with information about the trustworthiness of the news and information sources in their social media news feeds and search results.
- Vague measurement metrics – several performance indicators (called Service Level Indicators in the Code) are formulated so generally that they do not explicitly specify which numbers and data the platforms must include in their reporting. The quality of the data will therefore only become clear when platforms report in seven months’ time.
- Kept in the dark on data access – the section on empowering researchers does not specify which datasets will be made available to researchers, academics and CSOs. This is particularly concerning given that some platforms are currently withdrawing access to some of their data-sharing initiatives.
- Only a timeline for an ‘IPCC for disinformation’ to measure the scale of the problem – the Code commits to measuring the scale of disinformation on each platform, just as we measure the amount of CO2 in the atmosphere. While the initial goal was to agree on how such measurements would be performed, the Code only sets a deadline for aligning on a common methodology within six months of its signature.
Testimonies
Jorge Gomes, European Coordinator, VOST Europe said: “The signing of the Code of Practice is not the end of this process; on the contrary, it’s rather the beginning, where it will be important to make sure the CoP has a true impact on fighting disinformation.”
Luca Nicotra, Campaign Director at Avaaz said: “This Code is a global first, in large part because civil society stepped in to push for greater ambition. But if platforms now don’t step up their actions, it’s not worth the paper it’s written on. This is why we need monitoring with teeth, from the EU Commission, that boldly flags platform failures. Otherwise, this Code could become just a cheap way to avoid the fines they could face under the Digital Services Act.”
Pawel Terpilowski, chief editor of Demagog said: “We consider the new Code as an important step forward in fighting disinformation but ultimately actions speak louder than words. Now is the time for actions. We have a long and difficult way ahead of us. A lot of work needs to be done. There is a crucial role of civil society organizations in making sure that big tech companies will implement those measures in a meaningful manner. We are looking forward to contributing to this process.”
Ana Brakus, executive director of Faktograf said: “The new Code promises to bring a major shift in the way very big online platforms approach and accept their role in shaping the realities of peoples in the EU. Still, we must admit that these commitments will shape the behaviors of actors even beyond the borders of the EU, especially in Southeastern Europe. The platforms in question have another big chance to accept and acknowledge their accountability for their actions in a democratic society. We hope they take this chance honestly and to the benefit of our societies, not just their bottom line.”
Dominika Hajdu, Policy Director, Centre for Democracy & Resilience, GLOBSEC said: “The implementation of the Code in the upcoming months will be crucial for the assessment of the Code’s success. We expect the Code to enhance cooperation between the platforms and the research community and deliver detailed reporting of actions taken per member state, which should help us assess the scope of disinformation and the efforts to counter it in countries with unique languages, including many in Central and Eastern Europe, where disinformation has had a disruptive effect on democracies.”
Carlos Hernández-Echevarría, Head of Public Policy and Institutional Development at fact-checker Maldita.es said: “We welcome the new Code, but this is just the beginning of the road. We will need to see that those commitments translate into real actions by the platforms, and that those actions lead to meaningful results. We are very much looking forward to working with other actors in making sure this is an effective tool and not just good words.”
Delphine Gatignol, Director, Newsback said: “At Newsback, we recognise that online platforms have their work cut out to address the problem of disinformation effectively. We believe the Code of Practice will help them adhere to their responsibilities by providing a framework and setting goals. We want these goals to be meaningful and achievable. As Newsback has developed technology to help in the fight against disinformation, we, together with this group of co-signatories, are well placed to advise on and monitor this.”
Gordon Crovitz, co-CEO, NewsGuard said: “Despite the Commission’s recommendation that platforms provide their users with access to “indicators of trustworthiness, focused on the integrity of the source,” to “support users in making informed choices,” the large platforms other than Microsoft refused to commit under the revised Code to empower their users with this critical tool. Meta, Google and other large platforms will continue to distribute and their algorithms will continue to promote disinformation sources in their social media feeds and search results, without providing independent and transparent source ratings, thus placing their advertising revenues ahead of reader safety. This refusal to take responsibility is especially shocking at a time when Facebook, Google and the other large platforms remain the key tools for spreading Russian disinformation about its invasion of Ukraine and for other propaganda efforts seeking to undermine democracies.”
Tommaso Canetta, deputy director of Pagella Politica said: “The new Code is promising and the related expectations are really high. This is why we think that the implementation phase will need to be carefully and closely monitored. If the intentions of all the signatories prove to be genuine, the potential consequences of this Code are, in our opinion, very positive.”
Sam Jeffers, Who Targets Me said: “We’re pleased that civil society organisations like Who Targets Me were able to bring their expertise, experience and perspective to the revised EU Code of Practice on Disinformation. This made the Code stronger. Over the coming months and years, we look forward to further collaboration to ensure the Code is fully implemented, accurately reported on and able to evolve to meet the challenges ahead.”