21.09.2021

The list of negative consequences of how dominant online platforms shape our experience online is neither short nor trivial. Exploiting users’ vulnerabilities, triggering psychological trauma, depriving people of job opportunities, pushing disturbing content at others: these are just some examples. While members of the European Parliament debate their position on the Digital Services Act, Panoptykon Foundation, together with 49 civil society organisations from all over Europe, urges them to ensure protection from the harms caused by platforms’ algorithms.

The Ugly Face of Data-Hungry Algorithms

Ad delivery algorithms and recommender systems are responsible for what we see once we visit Facebook or YouTube. Their code may be complicated – it is artificial intelligence, after all – but their job isn’t. The goal is to maximise the platforms’ profits from surveillance-based advertising. And that translates to keeping users on the platform, so that they watch more ads while leaving more and more traces to be collected by data-hungry algorithms.

But you can’t make an omelette without breaking eggs. In this case, the eggs are users’ self-image (affected by the algorithm’s choice of photos on Instagram), the quality of public debate (recommender systems notoriously promote divisive, sensationalist content), and access to job offers (data-driven ad delivery algorithms, which select viewers from larger sets of eligible targets, have been shown to discriminate against people based on gender, race, or age).

Platforms know more about their users than they tell them. Every bit of a user’s online activity, on and off the platform, is used to make predictions about them in order to determine the content they will – or will not – see online. Advertisers may not intend to discriminate against anyone, but the algorithmic fixation on campaign targets can have that effect. The pile of evidence on the harmful consequences of algorithms used by large online platforms is growing – although investigating them is difficult due to pervasive opacity.

Civil Society Calls for Improvements in the Digital Services Act

The debate on the draft DSA proposal presented by the European Commission largely focuses on issues related to user content moderation. Although important, these issues are less inconvenient for platforms, because they do not challenge their surveillance-based business model or affect their attention-maximising algorithms. But human rights defenders are not going to let it go: 49 civil society organisations – including European Digital Rights, the European Partnership for Democracy, Amnesty International, and the Electronic Frontier Foundation – have joined Panoptykon in calling on the members of the European Parliament’s Internal Market and Consumer Protection Committee to empower users and ensure effective oversight of algorithms in their amendments to the Digital Services Act.

Protection by default is an essential part of the solution. Users should not be forced to abdicate control of their data as a condition of access to a service. By default, they should be able to use the platform without having to share their personal data for advertising or recommendation purposes. The DSA should also prohibit the use of deceptive interfaces and consent screens designed to impair users’ free choice.

But for users’ choices to be genuinely informed and free, the algorithms need to be more transparent. You can’t make an informed choice unless you know how the algorithms used by a platform work.
Thus, disclosure of all the key information about the algorithms is our baseline, as it will give users better insight into how the content they see is selected. In addition, access to data for academic researchers, journalists and civil society organisations is crucial to scrutinise how algorithms work and to audit their effects. In the past, it was journalists, independent researchers and civil society organisations who shed light on the harmful consequences of platform algorithms. But access to data has depended on the companies’ goodwill – which has its limits, especially when the findings could generate unwelcome publicity (as with those published by AlgorithmWatch).

Last but not least, the organisations demand that users be able to react when they find content recommended by the platform objectionable. Users should be able to modify the recommender system so that it works for them. But they should also be allowed to break away from the platform’s centralised system and choose an independent recommendation service – commercial or not – that better aligns with their interests. The signatories of the letter argue that it is this ecosystem innovation that has the potential to truly empower users, as well as to exert pressure on big platforms to make real improvements to their own systems.

The great data protection reform (the GDPR) has failed to curb the negative effects of big tech’s data-driven optimisation algorithms. Instead of giving people real influence over how their data is gathered and used by companies, ‘consent’ is hidden in terms of use and deceptive interfaces that nudge users towards choices they would not otherwise have made. The Digital Services Act is a unique opportunity to fix this. Are MEPs going to seize it? Civil society organisations are doing what they can to persuade them to use the solutions at their disposal to enable a digital world that is both innovative and beneficial to society.

Anna Obem, Karolina Iwańska

Full text of the open letter [PDF]