Article Who is more real: me or my digital profile?, re:publica 2015 [VIDEO] Sharing information is less and less a matter of free choice. Society demands high visibility: those who don't expose themselves become suspicious or excluded. But sharing is just the beginning. The real purpose behind it is profiling. Be it our insurance or health care scheme, unemployment benefits or school curriculum, more and more services depend not so much on who we really are as on the quality of our digital profiles. Who designs these algorithms? What business and political stakes lie behind them? Katarzyna Szymielewicz comments on the contents of our digital profiles and their implications. 08.05.2015
Report Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads A case study examined by Panoptykon Foundation and showcased by the Financial Times demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers but does not affect Facebook’s own profiling and ad delivery practices. While much has been written about the disinformation and risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention. 28.09.2021
Article Limits to harmful surveillance in online advertising? Joint statement ahead of the vote in the European Parliament next week “We don’t have to manipulate our customers or exploit their vulnerabilities to scale up” – European entrepreneurs and social organizations appeal to MEPs to put an end to invasive and privacy-hostile practices related to surveillance-based advertising, and thus open the market to ethical, innovative online ads that respect users’ rights and choices. On the opposite bench, the Big Tech lobby fights to preserve the status quo, despite the well-documented social and individual harms caused by the current ads ecosystem. 13.01.2022
Other Safe by Default – Panoptykon Foundation and People vs BigTech’s Briefing Moving away from engagement-based rankings towards safe, rights-respecting, and human-centric recommender systems. 05.03.2024
Article Polish attempt at a “transparency report” In our first attempt at a “transparency report”, we looked at what happens at the interface of Internet service providers and public authorities in Poland. Who sends requests for users' data? How many, and for what purpose? What legal procedures are followed, and what safeguards apply? Our pilot study includes an analysis of legal provisions and data collected from both major Internet service providers and public authorities. The report explains the systemic problems identified in our research that should be solved in order to ensure an adequate standard of protection for individuals. 07.05.2014