Article, 31.08.2023

“Our algorithm will suck you in.” “Scroll more, sleep less.” These are slogans for Vlop!, a “new, revolutionary social platform” advertised on a mysterious, Black Mirror-like LED truck cruising the Polish capital. In reality, the platform does not exist: it was made up for a social campaign highlighting the actual harmful effects of the Big Tech companies operating today.

The truck is big and black, with huge LED screens, like the one from the first season of Charlie Brooker's Black Mirror. It is circling Warsaw's main streets and neighbourhoods, broadcasting an ad for a “new, revolutionary social media app” – Vlop! – along with disturbing, provocative slogans such as “You will want more and more and more” and “We know you better than you know yourself”.

In reality, the Vlop! application does not exist. It was fabricated for a campaign by Panoptykon, a Polish civil society organisation protecting human rights in the context of modern technologies and surveillance. The name “Vlop!” is a reference to Very Large Online Platforms (VLOPs) – a term used by the EU's new legislation, the Digital Services Act (DSA), for the biggest platforms such as Facebook, TikTok, YouTube and X (previously Twitter).

Panoptykon's campaign launches after the DSA became fully effective for VLOPs on 25 August. To comply with the new regulation, the cybergiants must, among other things, make changes to the addictive algorithms they use to personalise users' feeds. They must now explain how those algorithms work and offer at least one recommender system that is not based on tracking users' personal data. Panoptykon, along with other European digital rights NGOs, is trying to raise awareness of the new law and monitor its implementation on the platforms.
The organisation focuses in particular on countering the harms related to the addictive nature of newsfeed algorithms and their influence on our mental wellbeing. Vlop! – the fake “new, revolutionary platform” created as part of Panoptykon's “Better newsfeed is possible” campaign – is meant to draw our attention to problems with the applications we already happily use every day. The algorithms of Facebook, TikTok and the like are built with one aim only: to maximise our engagement and time spent on the platform, which often means amplifying clickbait or harmful content. For some users, especially young people, compulsive scrolling and over-exposure to toxic content may exacerbate depression, anxiety, eating disorders and other mental health issues. What the fake Vlop! advertises in Panoptykon's campaign has in reality been present in our phones for a long time.

Along with the campaign, Panoptykon has launched the website www.vlop.me, which explains the mechanisms and risks behind the big platforms' algorithms and advocates for change and social pressure on the cybergiants.

It won't be easy to fix recommender systems. Where to start? Panoptykon, the Irish Council for Civil Liberties and the People vs Big Tech coalition investigated their most harmful features and call for change in the report “Fixing Recommender Systems. From identification of risk factors to meaningful transparency and mitigation”.

The creative concept of Vlop! and all the campaign assets were prepared by the Warsaw-based creative group NOWY®ŁAD.

Fundacja Panoptykon