This article was originally published by Radio Free Europe/Radio Liberty and is reprinted with permission.
The European Commission (EC) will propose for the first time that the European Union impose sanctions on “foreign actors” from states such as Russia or China who spread disinformation, as it pushes for a tougher oversight mechanism for online platforms, according to a draft document seen by RFE/RL.
The European Democracy Action Plan, which the EC is set to present on December 2, says that the 27-member bloc “needs to use more systematically the full range of tools in its toolbox for countering foreign interference and influence operations and further develop them, including by imposing costs on the perpetrators.”
“Possible ways of doing so range from publicly identifying commonly used techniques (so as to render them operationally unusable) to imposing sanctions following repeated offenses,” it says.
It is the first time that the EC — the EU’s executive body — has suggested in an official document the imposition of sanctions for the spread of disinformation.
Warning that information can be “weaponized by foreign actors,” the document goes on to say that “certain third countries (in particular Russia and China) have engaged in targeted influence operations and disinformation campaigns around COVID-19 in the EU, its neighborhood, and globally, seeking to undermine democratic debate, exacerbate social polarization, and improve their own image.”
The action plan notes that the East StratCom Task Force, a division of the European External Action Service (EEAS) that monitors Russian disinformation, has so far identified more than 500 examples of pro-Kremlin disinformation on COVID-19 this year and over 10,000 examples of pro-Kremlin disinformation since it started monitoring in 2015. The EEAS is the EU’s diplomatic corps.
The document also suggests much tougher EU rules on online platforms that “can be used by malicious operators for disseminating and amplifying false and misleading content and have been criticized for lack of transparency in the use of algorithms to distribute content online and for targeting users on the basis of the vast amount of personal data generated from online activity.”
In 2018, the EC put together a code of practice on disinformation that platforms such as Facebook, Google, and Twitter joined voluntarily, committing to report on steps to make ad placements more transparent and to act against fake accounts and bots.
However, citing what it said were disappointing results, the EC now argues that “a more robust approach based on clear commitments and subject to appropriate oversight mechanisms is necessary to fight disinformation more effectively.”
The document says the upcoming Digital Services Act (DSA), due to be unveiled by the EC later this year, “will propose rules to ensure greater accountability on how platforms moderate content, on advertising, and on algorithmic processes.”
“Very large platforms will be obliged to assess the risks their systems pose — not only as regards illegal content and products but also systemic risks to the protection of public interests and fundamental rights, public health and security,” it says.
The DSA aims to update the European Union’s legal framework for online business.