23.11.2021
Fundacja Panoptykon

The Facebook Files provided yet another confirmation that the company's extremely profitable recommender systems come at a high price, paid by vulnerable individuals and by our societies. Algorithms optimised for engagement amplify toxic content, such as hate speech or disinformation, and target people based on their vulnerabilities. The way out of this vicious circle, which generates high profits at the expense of human welfare, leads through radical structural change: unbundling content hosting from content curation on large platforms, so that other providers can offer alternative recommender systems and users have a real choice of which one to pick.

Currently, major social media platforms offer hosting and content curation as a bundle, and users have no choice but to take it all. To enable the development of alternative recommender systems, including algorithms that cater to users' needs instead of exploiting their vulnerabilities and maximising engagement at all costs, we need to separate hosting from content curation. For this diversified environment to flourish, third-party recommender systems must be able to operate on social media platforms, meaning they need to be interoperable with them, and users must be free to select the recommender system that best fits their needs. In this scenario, very large platforms like Facebook would no longer hold immense power over our information diet, and control would shift back to the people. (A minimal sketch of what such an interoperability boundary could look like appears at the end of this post.)

This webinar offers a deep dive into the topic. The panelists will explain what legal and technological changes are necessary to achieve a landscape of alternative recommender systems within large platforms. Experts in EU policy, software engineering, security and the protection of fundamental rights will answer questions about the practical functioning of such interoperable, open environments.

Webinar co-organised by Article 19, Access Now, European Partnership for Democracy and Panoptykon Foundation.

Moderator: Katarzyna Szymielewicz (Panoptykon)

Speakers:
- Cory Doctorow (Electronic Frontier Foundation)
- Dr. Ian Brown (expert on interoperability and data protection issues, FGV Brazil)
- Maria Luisa Stasi (Article 19)
- Marc Faddoul (YouChoose.ai, provider of an alternative recommender system)
- MEP Kim van Sparrentak (European Greens)

Watch the recording on YouTube.
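
To make the unbundling idea above more concrete, here is a minimal, purely illustrative sketch in TypeScript. It does not describe any real platform's API: every name in it (Post, HostingProvider, Recommender, fetchCandidatePosts, rankFeed, buildFeed) is a hypothetical placeholder. The point is only the interface boundary: hosting serves an unranked pool of content, while an interchangeable recommender, chosen by the user, decides what the feed looks like.

```typescript
// Illustrative sketch of "hosting unbundled from curation".
// All names are hypothetical; no real platform exposes this API.

// What the hosting side stores and serves. It knows nothing about ranking.
interface Post {
  id: string;
  authorId: string;
  text: string;
  postedAt: Date;
}

interface HostingProvider {
  // Returns the raw, unranked pool of posts visible to a given user.
  fetchCandidatePosts(userId: string): Promise<Post[]>;
}

// The curation side: any third party could implement this and be plugged in.
interface Recommender {
  name: string;
  // Orders (and optionally filters) the candidate pool for the user.
  rankFeed(userId: string, candidates: Post[]): Promise<Post[]>;
}

// Example third-party recommender: strictly reverse-chronological,
// with no engagement optimisation at all.
class ChronologicalRecommender implements Recommender {
  name = "chronological";
  async rankFeed(_userId: string, candidates: Post[]): Promise<Post[]> {
    return [...candidates].sort(
      (a, b) => b.postedAt.getTime() - a.postedAt.getTime()
    );
  }
}

// The recommender is a user-level setting, not a platform-level decision.
async function buildFeed(
  host: HostingProvider,
  recommender: Recommender,
  userId: string
): Promise<Post[]> {
  const candidates = await host.fetchCandidatePosts(userId);
  return recommender.rankFeed(userId, candidates);
}

// Tiny in-memory host, for demonstration only.
const demoHost: HostingProvider = {
  async fetchCandidatePosts(_userId: string): Promise<Post[]> {
    return [
      { id: "1", authorId: "a", text: "older post", postedAt: new Date("2021-11-01") },
      { id: "2", authorId: "b", text: "newer post", postedAt: new Date("2021-11-20") },
    ];
  },
};

buildFeed(demoHost, new ChronologicalRecommender(), "user-1")
  .then((feed) => console.log(feed.map((p) => p.text))); // ["newer post", "older post"]
```

Under this split, swapping ChronologicalRecommender for any other implementation of Recommender changes the user's feed without touching the hosting side; that separation is what would let third parties compete on curation.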