Report, 09.03.2016

In October 2015 we published a report, "Profiling the Unemployed in Poland: Social and Political Implications of Algorithmic Decision Making". The report focused on the lack of transparency and the discriminatory character of the profiling of assistance for the unemployed in Poland, introduced by the Ministry of Labor and Social Policy in May 2015. By dividing people into three categories based on their "readiness" to work, their place of residence, disabilities and other data, the system's default profiling criteria lead to discrimination.

In our report we formulated recommendations on how to amend the system so that it respects fundamental rights. First, more transparency should be provided, to ensure that people seeking help from labor offices are assisted according to fair and transparent criteria. The rules should be grounded in generally applicable law, not just internal regulations, and any criteria based on stereotypes – and therefore leading to social exclusion – should be reformulated. Second, the unemployed should be granted the possibility to appeal against the result of their categorisation.

The report was based on statistics from 100 labor offices and on interviews with both public officials and the unemployed.

Fundacja Panoptykon