19.08.2015

Panoptykon Foundation has received a grant from the Open Society Foundations, the Ford Foundation, and the Media Democracy Fund to investigate the implications of algorithmically driven categorisation and resource distribution for Poland's 1.8 million unemployed citizens.

More and more decisions that determine our lives are made by algorithms drawing on extensive data about citizens. Although societies are increasingly aware of the specter of constant surveillance as an invasion of privacy, the implications of large-scale data collection, retention, and analysis, whether by corporations or by governments, have not been properly researched. To fill that gap, the Open Society Foundations, the Ford Foundation, and the Media Democracy Fund launched a call for proposals to investigate the implications of algorithmic decision-making for open society issues. The funders chose 12 projects from 10 countries whose results will help us better understand these problems; one of them is Panoptykon's.

More information about the call for proposals and the projects that won funding is available in Preserving Open Society in a World Run by Algorithms and on the Open Society Foundations webpage.

Fundacja Panoptykon