Report: Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads
A case study by the Panoptykon Foundation, showcased by the Financial Times, demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers, but does not affect Facebook’s own profiling and ad-delivery practices. While much has been written about the disinformation and risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention.
28.09.2021
Report: Access of public authorities to the data of Internet service users. Seven issues and several hypotheses
The report looks at what happens at the interface between Internet service providers and public authorities in Poland: who sends requests for users’ data, how many and for what purpose, what legal procedures are followed, and what safeguards apply. During our research we analysed legal provisions and collected data from both major Internet service providers and public authorities. On that basis we were able to identify several systemic problems that should be solved in order to ensure an adequate standard of protection for individuals.
02.05.2014
Report: Data Retention in Poland: The Issue and the Fight
This paper aims to give a brief overview of the following issues: (i) the Polish data retention regime and its drawbacks; (ii) the use of data retention in practice and the available data on the subject; (iii) the campaign run by the Panoptykon Foundation over the last two years; and (iv) the political shifts that have occurred in Poland.
05.08.2012