Report Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads A case study conducted by Panoptykon Foundation and showcased by the Financial Times demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits the targeting options available to advertisers, but does not affect Facebook’s own profiling and ad-delivery practices. While much has been written about the disinformation and the risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention. 28.09.2021 Text
Report New report: To track or not to track? Towards privacy-friendly and sustainable online advertising Our new report, “To Track or Not to Track? Towards Privacy-friendly and Sustainable Online Advertising”, shows that it is possible to reform the ad tech industry without bankrupting online publishers. But to make that happen, EU policymakers must create a regulatory push by enforcing the GDPR and adopting new rules that would incentivise the uptake of alternatives. 25.11.2020 Text
Report Black-Boxed Politics: Opacity is a Choice in AI Systems By Agata Foryciarz, Daniel Leufer and others 17.01.2020 Text
Report Digital propaganda or 'normal' political polarization? Case study of political debate on Polish Twitter We are right to be worried about the polarization of public debate, the rise of populism and digital propaganda. It also goes without saying that social media have a growing impact on our politics and society. 17.04.2018 Text
Report "Profiling the Unemployed in Poland" – report In October 2015 we published the report “Profiling the Unemployed in Poland: Social and Political Implications of Algorithmic Decision Making”. 09.03.2016 Text