Report
Algorithms of trauma: new case study shows that Facebook doesn’t give users real control over disturbing surveillance ads
A case study by the Panoptykon Foundation, showcased by the Financial Times, demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers, but does not affect Facebook’s own profiling and ad-delivery practices. While much has been written about the disinformation and the risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention.
28.09.2021
Article
Digital sanctions won’t solve the problem of war propaganda online. Robust platform regulations will
European officials urged Big Tech to ban Kremlin-related accounts in an effort to tackle propaganda online, as the Internet – and social media in particular – became an important front of Russia’s invasion of Ukraine. But such “digital sanctions” are just a Band-Aid on a bullet wound. We therefore call, yet again, for robust platform regulations in the Digital Services Act instead. The current crisis only confirms how badly overdue systemic solutions are.
14.03.2022
Article
Discrimination in a datafied world
Data-driven technologies are not neutral. A decision to collect, analyse and process specific kinds of information is structured and motivated by social, economic and political factors. Such data operations may not only violate the right to privacy but also lead to discrimination against, and oppression of, socially marginalised communities. Discriminatory data processes and algorithms are a massive challenge for the modern human rights movement, one that requires non-standard solutions. The report “Between Anti-discrimination and Data” tries to shed light on this problem from the perspective of European civil society organisations.
10.07.2018
Article
First court decision in SIN vs Facebook: the internet giant must not restrict the organisation’s activities in its services
The District Court in Warsaw, in its interim measures ruling, has temporarily prohibited Facebook from removing fanpages, profiles and groups run by SIN (a Polish NGO) on Facebook and Instagram, as well as from blocking individual posts. This means that – at least until the case is decided – SIN’s activists may carry on their drug education without fear of suddenly losing the ability to communicate with their audience. The court has furthermore obliged Facebook to preserve the profiles, fanpages and groups deleted in 2018 and 2019 so that – if SIN eventually wins the case – they can be restored together with all of the published content, comments by other users, as well as followers and people who liked the fanpages. This is not the only good news: the court has also confirmed that Polish users can enforce their rights against the tech giant in Poland. The court’s decision is not final: once it has been delivered, Facebook Ireland will have the right to appeal it to the Court of Appeal.
02.07.2019
Article
SIN v Facebook: tech giant sued over private censorship in landmark case in Poland
On 7 May 2019 Spoleczna Inicjatywa Narkopolityki (Civil Society Drug Policy Initiative, “SIN”), supported by the Panoptykon Foundation, filed a lawsuit against Facebook in strategic litigation aimed at fighting private censorship on the Internet. Online platforms act as ‘gatekeepers’ of online expression, gaining tremendous power over the information circulating on the Internet – power which they wield without adequate accountability or responsibility. Moderation is necessary to fight illegal and harmful content, but perfectly legal and socially valuable materials unfortunately often fall prey to it. We hope that our lawsuit against Facebook will help change this.
07.05.2019