Article 19.08.2015 1 min. read

Panoptykon Foundation has received a grant from the Open Society Foundations, the Ford Foundation, and the Media Democracy Fund to investigate the implications of algorithmically driven categorization and resource distribution for Poland's 1.8 million unemployed citizens.

More and more decisions that determine our lives are made by algorithms drawing on extensive data about citizens. Although societies are increasingly aware of the specter of constant surveillance as an invasion of privacy, the implications of large-scale data collection, retention, and analysis, whether by corporations or governments, are not properly researched. To address that gap, the Open Society Foundations, the Ford Foundation, and the Media Democracy Fund launched a call for proposals to investigate the implications of algorithmic decision-making for open society issues. The funders selected 12 projects from 10 countries whose results will help to better understand these problems; Panoptykon's project is one of them.

More information about the call for proposals and the projects that won funding can be found in Preserving Open Society in a World Run by Algorithms and on the Open Society Foundations webpage.

Author: Fundacja Panoptykon