European officials urged Big Tech to ban Kremlin-related accounts in an effort to tackle propaganda online, as the Internet, and particularly social media, became an important front in Russia's invasion of Ukraine. But such "digital sanctions" are just a Band-Aid on a bullet wound. We therefore call, yet again, for robust platform regulation in the Digital Services Act instead. The current crisis only confirms how badly overdue systemic solutions are.
Following a number of complaints filed in 2018 and 2019, including by Panoptykon and Bits of Freedom, and coordinated by the Irish Council for Civil Liberties, the Belgian Data Protection Authority has found that the consent system developed and managed by the adtech industry body IAB Europe, and used by many websites in the EU, is illegal under the GDPR.
Following Panoptykon’s General Data Protection Regulation (GDPR) complaint against Interia.pl, one of the biggest Polish news websites, the Polish Data Protection Authority has confirmed that online publishers should give users access to the advertising profiles generated about them for the purposes of delivering behavioural ads.
“We don’t have to manipulate our customers or exploit their vulnerabilities to scale up”: European entrepreneurs and social organisations appeal to MEPs to put an end to the invasive, privacy-hostile practices of surveillance-based advertising and thus open the market to ethical and innovative online ads that respect users’ rights and choices. On the opposite bench, the Big Tech lobby fights to preserve the status quo, despite the well-documented individual and social harms caused by the current ads ecosystem.
The Facebook Files provided yet another confirmation that the company's extremely profitable recommender systems come at a high price, paid by vulnerable individuals and our societies. Algorithms optimised for engagement amplify toxic content, such as hate speech or disinformation, and target humans based on their vulnerabilities.
A case study examined by Panoptykon Foundation and showcased by the Financial Times demonstrates how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. The experiment shows that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers, but does not affect Facebook’s own profiling and ad delivery practices. While much has been written about the disinformation and risks to democracy generated by social media’s data-hungry algorithms, the threat to people’s mental health has not yet received enough attention.
The list of negative consequences of how dominant online platforms shape our experience online is neither short nor trivial: exploiting users’ vulnerabilities, triggering psychological trauma, depriving people of job opportunities and pushing disturbing content at them are just some examples. While members of the European Parliament debate their position on the Digital Services Act, Panoptykon Foundation, together with 49 civil society organisations from all over Europe, urges them to ensure protection from the harms caused by platforms’ algorithms.
Algorithmic decision-making can carry more weight than you might expect. While algorithms do innocuous or helpful things like changing the traffic signals when you approach an intersection, they also decide what content to show in your social media feed. There are also algorithms that assist real people in deciding whether you can get a mortgage, get into a particular university or qualify for insurance. How does it work? Katarzyna Szymielewicz explains the three layers of a digital profile in the video.
A progressive report on the Digital Services Act (DSA) adopted by the Committee on Civil Liberties, Justice and Home Affairs (LIBE) in the European Parliament in July is the first major improvement to the draft law presented by the European Commission in December. MEPs expressed support for default protection from tracking and profiling for the purposes of advertising and of recommending or ranking content. Now the ball is in the court of the leading Committee on Internal Market and Consumer Protection (IMCO), which received 1313 pages of amendments to be voted on in November. Panoptykon Foundation explores whether the Parliament will succeed in adopting a position that contests the power of dominant online platforms, which shape the digital public sphere in line with their commercial interests, at the expense of individuals and societies.
The District Court in Warsaw (Appellate Division) upheld its 2019 interim measures ruling, in which it temporarily prohibited Facebook from removing fan pages run by the Polish NGO “SIN” on Facebook and Instagram, and from blocking individual posts. This means that, until the case is decided, SIN’s activists may carry out their drug-related education on the platform without fear of suddenly losing the ability to communicate with their audience. The decision is now final. What does it mean on a broader scale?