Article Discrimination in a datafied world Data-driven technologies are not neutral. A decision to collect, analyse and process specific kinds of information is structured and motivated by social, economic and political factors. These data operations may not only violate the right to privacy but also lead to discrimination against and oppression of socially marginalised communities. Discriminatory data processes and algorithms are a major challenge for the modern human rights movement, one that requires non-standard solutions. The report “Between Anti-discrimination and Data” sheds light on this problem from the perspective of European civil society organisations. 10.07.2018
Article Digital sanctions won’t solve the problem of war propaganda online. Robust platform regulations will European officials urged Big Tech to ban Kremlin-related accounts in an effort to tackle propaganda online, as the Internet, and particularly social media, became an important front of the Russian invasion of Ukraine. But such “digital sanctions” are just a Band-Aid on a bullet wound. We therefore call, yet again, for robust platform regulations in the Digital Services Act instead. The current crisis only confirms how badly overdue systemic solutions are. 14.03.2022
Article Activists v. Poland. European Court of Human Rights hearing on uncontrolled surveillance On 27 September a hearing was held at the European Court of Human Rights, following an application against Poland lodged by activists from Poland’s Panoptykon Foundation and the Helsinki Foundation for Human Rights, joined by a human rights attorney. The group alleges that the state violated their right to privacy by allowing the intelligence agencies to operate beyond scrutiny. Their case has been supported by the United Nations Special Rapporteur, the Polish Ombudsman and the European Criminal Bar Association, who also attended the Strasbourg hearing. 04.11.2022
Report Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads A case study examined by the Panoptykon Foundation and showcased by the Financial Times demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers, but does not affect Facebook’s own profiling and ad delivery practices. While much has been written about the disinformation and the risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention. 28.09.2021
Article Moglen supports Snowden’s nomination for Sakharov Prize [VIDEO] Eben Moglen is a renowned lawyer and publicist who, in cooperation with Richard Stallman, created the GNU Free Documentation License. The Director-Counsel of the Software Freedom Law Center and founder of the FreedomBox Foundation comments, at the request of the Panoptykon Foundation, on Edward Snowden’s nomination for the Sakharov Prize. 08.10.2013