Report: Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads
A case study examined by Panoptykon Foundation and showcased by the Financial Times demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers, but does not affect Facebook’s own profiling and ad-delivery practices. While much has been written about the disinformation and risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention.
28.09.2021
Article: GDPR and online advertising – Panoptykon consults IAB Poland’s code of conduct
We took part in the public consultations on the draft code of conduct, which is meant to help the online advertising sector apply the GDPR. The code was prepared by the Polish office of the Interactive Advertising Bureau.
23.10.2018
Other: Open letter to EU member states from consumer groups, NGOs and industry representatives in support of the ePrivacy Regulation
03.12.2018
Other: Panoptykon Foundation’s submission to the consultation on the ‘White Paper on Artificial Intelligence’
16.06.2020
Other: Submission in the consultations of the European Data Protection Board Guidelines 8/2020 on the targeting of social media users
15.10.2020