Report
09.03.2016

In October 2015 we published the report "Profiling the Unemployed in Poland: Social and Political Implications of Algorithmic Decision Making". The report focused on the lack of transparency and the discriminatory character of the profiling of assistance for the unemployed in Poland, introduced by the Ministry of Labor and Social Policy in May 2015. By dividing people into three categories based on their "readiness" to work, their place of residence, disabilities and other data, the system's default profiling criteria lead to discrimination.

In the report we formulated recommendations on how to amend the system so that it respects fundamental rights. First, more transparency should be provided, to ensure that people seeking help from labor offices are assisted according to fair and transparent criteria. The rules of profiling should be laid down in generally applicable law, not just in internal regulations, and any terms based on stereotypes, and therefore leading to social exclusion, should be reformulated. Second, the unemployed should be granted the possibility to appeal against the result of the categorization.

The report was based on statistics from 100 labor offices and interviews with both public officials and the unemployed.

Fundacja Panoptykon