Article
19.08.2015

The Panoptykon Foundation has received a grant from the Open Society Foundations, the Ford Foundation, and the Media Democracy Fund to investigate the implications of algorithmically driven categorisation and resource distribution for Poland's 1.8 million unemployed citizens.

More and more decisions that shape our lives are made by algorithms drawing on extensive data about citizens. Although societies are increasingly aware that constant surveillance amounts to an invasion of privacy, the implications of large-scale data collection, retention, and analysis, whether by corporations or by governments, have not been properly researched. To fill that gap, the Open Society Foundations, the Ford Foundation, and the Media Democracy Fund launched a call for proposals to investigate the implications of algorithmic decision-making for open-society issues. The funders chose 12 projects from 10 countries whose results will help us better understand these problems; one of them is Panoptykon's.

More information about the call for proposals and the funded projects is available in Preserving Open Society in a World Run by Algorithms and on the Open Society Foundations webpage.

Author: Fundacja Panoptykon