Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads

28.09.2021

A case study conducted by Panoptykon Foundation and showcased by the Financial Times demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits the targeting options available to advertisers, but does not affect Facebook’s own profiling and ad-delivery practices. While much has been written about the disinformation and risks to democracy generated by social media’s data-hungry algorithms, the threat these algorithms pose to people’s mental health has not yet received enough attention.