Article 14.03.2022 3 min. read

European officials have urged Big Tech to ban Kremlin-related accounts in an effort to tackle propaganda online, as the Internet – and social media in particular – has become an important front of Russia's invasion of Ukraine. But such "digital sanctions" are just a Band-Aid on a bullet wound. We therefore call, yet again, for robust platform regulation in the Digital Services Act instead. The current crisis only confirms how badly overdue systemic solutions are.

Disinformation campaigns as a weapon

Disinformation campaigns related to the war in Ukraine have been spreading extensively in Poland over the last few days. Poland, which is at the forefront of hosting refugees fleeing the war, has become particularly vulnerable to (likely Russian-led) propaganda and manipulation that amplify fear, extremism and polarisation in order to weaken the grassroots efforts and extraordinary mobilisation of Polish society in support of people fleeing Ukraine.

Disinformation seems to be particularly widespread on social media, especially when amplified by public officials with a wide reach (e.g. MPs). Unfortunately, there have also been instances of false or misleading information distributed by traditional media, such as Polish public service TV (which, however doubtful its journalistic quality since the political takeover in 2016, remains the only source of news in many Polish households). At the same time, independent media have themselves become a target of disinformation spread via social media that seeks to undermine trust in their coverage of the war in Ukraine.

Political reaction so far: missing the point

Political reactions in the region to war-related online disinformation have so far focused on urging big platforms to ban Kremlin-related accounts (see e.g. the letter from the prime ministers of Poland and the Baltic states addressed to big tech companies). The call to impose such "digital sanctions" is understandable: with information warfare forming an important part of the ongoing hybrid conflict, EU governments have the right to expect US tech companies to stand with them on the same side of the barricade. In fact, however, what they are calling for is just a Band-Aid on a bullet wound: it will not bring the desired effect in the long run and it will not stop the propaganda.

Civil society response: time to #fixthealgorithms

That is why, at this particularly difficult moment, we also demand a more systemic and thoughtful response from our governments, one that addresses the core problem behind the algorithmic amplification of disinformation on Big Tech's online platforms: business models that rely on encouraging clicks and engagement, irrespective of the content's informative value or the authenticity of its sources. Thanks to the documents disclosed by Frances Haugen, we have evidence that platforms such as Facebook are aware of the harmful effects of their algorithms, but still allow them in the name of profit.

The solution is not ad-hoc "sanctions" but robust platform regulation, such as the Digital Services Act, ensuring at least:

- risk assessments for recommender systems' algorithms,
- an obligation for very large online platforms to provide at least one recommender system not based on profiling,
- a prohibition on the use of so-called inferred data, in particular data revealing sensitive attributes (such as race, religion or health condition), for advertising purposes – to prevent online manipulation and the exploitation of users' vulnerabilities.
The current crisis only confirms how badly we need strong regulation in this area, and how overdue it already is. While the trilogue negotiations on the DSA are under way, the opportunity to fix the algorithms and address the real core of the problem is still there.

Dorota Głowacka, Anna Obem
Fundacja Panoptykon

Panoptykon: Dealing with Disinformation. A Handbook for Journalists
Panoptykon: Big Tech platforms are hurting us. 50 organisations urge the EU to #fixalgorithms