Article Webinar: Alternative recommender systems in the DSA [recording] The Facebook Files provided yet another confirmation that the company's extremely profitable recommender systems come at a high price, paid by vulnerable individuals and by our societies. Algorithms optimised for engagement amplify toxic content, such as hate speech or disinformation, and target humans based on their vulnerabilities. 23.11.2021
other Safe by Default – Panoptykon Foundation and People vs BigTech’s briefing on moving away from engagement-based rankings towards safe, rights-respecting, and human-centric recommender systems. 05.03.2024
other Joint Submission on the Commission’s Guidelines for Providers of VLOPs and VLOSEs on the Mitigation of Systemic Risks for Electoral Processes. Part 1 explains how recommender systems contribute to systemic risks. Part 2 responds to the Commission’s proposals to moderate the virality of content that threatens the integrity of the electoral process. 07.03.2024
Article Win against Facebook. Giant not allowed to censor content at will. By blocking the accounts and groups of Społeczna Inicjatywa Narkopolityki (SIN, the Civil Society Drug Policy Initiative), Meta infringed on the organization’s personal rights. On Wednesday, a Polish court issued a watershed decision in a case supported by the Panoptykon Foundation, confirming that Internet platforms cannot block users at will. The court also confirmed that banned users have the right to sue in their own country. 14.03.2024
other For Algorithmic Pluralism! Following the conclusion of the États Généraux de l’information and on the occasion of the Numérique en commun(s) national event, a collective of 50 public figures, associations, and French and international companies... 25.09.2024