Discrimination in a datafied world
Data-driven technologies are not neutral. The decision to collect, analyse and process a specific kind of information is structured and motivated by social, economic and political factors. These data operations may not only violate the right to privacy but also lead to discrimination against and oppression of socially marginalised communities. Discriminatory data processes and algorithms pose a massive challenge to the modern human rights movement, one that requires non-standard solutions. The report "Between Anti-discrimination and Data" tries to shed light on this problem from the perspective of European civil society organisations. 10.07.2018
Panoptykon files complaints against Google and IAB Europe
On International Data Protection Day, 28 January 2019, the Panoptykon Foundation filed complaints against Google and IAB Europe under the General Data Protection Regulation (GDPR) with the Polish Data Protection Authority (DPA). The complaints concern the functioning of the online behavioural advertising (OBA) ecosystem. 28.01.2019
Polish law on "protecting the freedoms of social media users" will do exactly the opposite
The Polish government's proposal for a new law on "protecting free speech of social media users" introduces data retention, a new and questionable definition of "unlawful content", and an oversight body (the Free Speech Council) that is likely to be politically compromised. In this context, "Surveillance and Censorship Act" would be a more accurate name. 10.02.2021
No control over surveillance by Polish intelligence agencies. ECHR demands explanations from the government
The European Court of Human Rights has demanded that the Polish government provide explanations in a case concerning surveillance by intelligence agencies. 18.12.2019
Who is more real: me or my digital profile?, re:publica 2015 [VIDEO]
Sharing information is less and less a free choice. Society demands high visibility: those who do not expose themselves become suspect or excluded. But sharing is just the beginning; the real purpose behind it is profiling. Whether it is our insurance or health care scheme, unemployment benefit or school curriculum, more and more services depend not so much on who we really are as on the quality of our digital profile. Who designs these algorithms? What business and political stakes lie behind them? Katarzyna Szymielewicz comments on the contents of our digital profiles and their implications. 08.05.2015