Article 05.04.2015

EDRi, Panoptykon Foundation and Access have expressed their concern about the proposed Directive on an EU Passenger Name Record. In its current form, the proposal poses a risk of discrimination, for example on religious grounds. Moreover, it would impose significant costs on Member States, all without evidence that such measures are effective in preventing serious crime.

Passenger Name Records (PNR) are data provided by passengers and collected by air carriers for commercial purposes. They can contain several pieces of information, such as travel dates, itinerary and contact details. All PNR data is stored in airlines’ databases.

“Many of these types of data can be used and aggregated to build profiles. For instance, meal preference can provide information about religious affiliation, hotel reservations can indicate passengers’ personal relationships, etc.”, explained Diego Naranjo from EDRi.

Opinion by EDRi, Panoptykon Foundation and Access regarding the proposal for a Directive on an EU Passenger Name Record [PDF, 245,32 KB]

Fundacja Panoptykon