Report Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads A case study conducted by the Panoptykon Foundation and showcased by the Financial Times demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers, but does not affect Facebook’s own profiling and ad-delivery practices. While much has been written about the disinformation and risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention. 28.09.2021
other Civil society letter to WCIT-12 and ITU on Internet regulation 17.05.2021
other Safe by Default – Panoptykon Foundation and People vs BigTech’s Briefing Moving away from engagement-based rankings towards safe, rights-respecting, and human-centric recommender systems. 05.03.2024
other Joint Submission on the Commission’s Guidelines for Providers of VLOPs and VLOSEs on the Mitigation of Systemic Risks for Electoral Processes Part 1 introduces how recommender systems contribute to systemic risks. Part 2 responds to the Commission’s proposals to moderate the virality of content that threatens the integrity of the electoral process. 07.03.2024
Article Wojciech Wiewiórowski will remain DPC for a second term The current Polish Data Protection Commissioner (DPC) will remain in his post for a second term after the Polish Parliament confirmed his nomination on 25 July 2014. The decision did not come as a surprise: Wojciech Wiewiórowski was the only candidate for the post and has an excellent background for the role. Just as during the previous nomination process four years ago, EDRi member Panoptykon monitored the process to ensure its transparency to the public. 30.07.2014