Vlop! IS HERE ALREADY! YOU USE IT EVERY DAY
Thousands of notifications and likes, engaging videos and content that won’t let you put down your phone … You know it, don’t you?
Vlop!, “a new, revolutionary social media app”, does not, in fact, exist. We made it up for our campaign (see the video below). But VLOPs do actually exist, and you use them every day. VLOP is short for “Very Large Online Platform” and refers to apps like Facebook, YouTube, Instagram, X, or TikTok.
Watch the original ad [IN POLISH]
In our campaign, we highlighted the actual mechanisms used by VLOPs in order to draw attention to the problems they cause. These problems stem from VLOPs’ business model and their algorithms, which are optimized to hold users’ attention for as long as possible.
THE ALGORITHM WILL SUCK YOU IN
VLOPs aren’t looking for great ideas but for profits. The algorithms designed to deliver your personalized feed, in turn, feed on your data and your vulnerabilities. They strive to hold your attention with emotive content which is not only toxic but also addictive and, for some people – especially the young – may cause anxiety, depression, addiction, or eating disorders.
“YOU WILL ASK FOR MORE”
When you score a high number of ‘likes’ on your social media posts, the same area of your brain is activated as when you eat chocolate or win money. Algorithms use the same mechanisms as casinos – they make us scroll endlessly and provoke FOMO (fear of missing out), eventually leading to addiction – that’s why it’s so difficult to put down your phone.
“WE KNOW WHAT YOU REALLY LIKE”
Big tech companies use the vast digital footprint we leave behind when using the internet. They know what you click, what you like, what catches your eye, what you discuss and with whom, and how often you use a particular app. Even if you don’t disclose certain information – for example, your sexual orientation or health history – the algorithms will infer it from your activity online. Then they use this information to feed you content that captures your attention even more effectively. You don’t want to read about accidents involving children, but such posts catch your eye anyway? You will only be served more and more posts on that theme.
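For the technically curious: none of the platforms publish their ranking code, but the logic described above can be boiled down to a toy sketch. Everything in it – the topic labels, the weights, the function names – is invented for illustration; it only shows how a feed optimized for engagement keeps amplifying whatever holds your gaze, wanted or not.

```python
# Toy illustration only: no platform publishes its ranking code, and every
# name, weight and data structure here is invented for this example.
from collections import defaultdict

def update_profile(profile, post_topic, dwell_seconds, clicked):
    """Accumulate an 'interest' score per topic from observed behaviour.
    The user never states these interests; they are inferred from what
    holds their attention, including content they dislike."""
    profile[post_topic] += dwell_seconds + (5.0 if clicked else 0.0)
    return profile

def rank_feed(posts, profile):
    """Order candidate posts by predicted engagement, not by what the
    user says they want to see."""
    return sorted(posts, key=lambda p: profile[p["topic"]], reverse=True)

profile = defaultdict(float)
# The user lingers on distressing posts without ever 'liking' them...
update_profile(profile, "child_accidents", dwell_seconds=40, clicked=False)
update_profile(profile, "gardening", dwell_seconds=5, clicked=True)

candidates = [
    {"id": 1, "topic": "gardening"},
    {"id": 2, "topic": "child_accidents"},
    {"id": 3, "topic": "child_accidents"},
]
# ...so the engagement-optimised feed keeps serving more of the same theme.
print(rank_feed(candidates, profile))
```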
“UNCENSORED PATOSTREAMING*”
VLOPs deliberately display the most engaging, emotional and provocative content. Plenty of controversial material – patostreams, extreme videos, fake news – slips through the platforms’ moderation net and finds its way to your feed.
*Patostreaming is the real-time video broadcasting of pathological behaviours such as bullying or violence.
“ONLY PEOPLE WHO THINK LIKE YOU”
Platforms deliberately stuff your feed full of content that confirms your views, usually published or ‘liked’ by your ‘friends’ – basically, people like you. That’s what we ‘like’, don’t we? At the same time, you only get content from outside your bubble when your ‘friends’ have already laughed at it or criticized it. This increases the chance that you yourself click on the ‘angry face’. But it also causes different groups in society to drift further apart and makes it ever more difficult to reach mutual understanding.
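Again only a sketch – none of the labels below come from any real platform – but it shows how a feed built around “people like you” works: posts that match your inferred views rank high, and content from outside your bubble only surfaces once your friends have already reacted to it.

```python
# Toy filter-bubble sketch; all field names and thresholds are invented.
def bubble_feed(posts, my_views, friend_reactions):
    """Keep posts that match the user's inferred views; let outside
    content through only if friends have already reacted to it."""
    feed = []
    for post in posts:
        if post["stance"] in my_views:
            feed.append(post)                    # confirms what you already think
        elif friend_reactions.get(post["id"], 0) >= 3:
            feed.append(post)                    # outside the bubble, but pre-ridiculed
    return feed

my_views = {"pro_cycling_lanes"}
friend_reactions = {42: 5}  # five friends already hit the 'angry face'
posts = [
    {"id": 7,  "stance": "pro_cycling_lanes"},
    {"id": 42, "stance": "anti_cycling_lanes"},
    {"id": 99, "stance": "anti_cycling_lanes"},  # nobody reacted: you never see it
]
print([p["id"] for p in bubble_feed(posts, my_views, friend_reactions)])  # -> [7, 42]
```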
A BETTER NEWSFEED IS POSSIBLE!
On 25 August 2023, new rules started to apply to the largest platforms. The Digital Services Act (DSA), dubbed the “new constitution of the internet”, means new obligations for VLOPs. The big tech companies can no longer pretend nothing is wrong.
Among other obligations, platforms have to amend their recommender systems. Since 25 August, they not only have to explain how their feeds work, but must also offer their users at least one alternative way of curating content that is not based on profiling (meaning: it does not use personal data to recommend content). The platforms must also regularly analyze the risks related to their algorithms and show how they mitigate them.
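What could such a non-profiling option look like? The DSA does not prescribe any particular implementation; a reverse-chronological feed is simply the most familiar example. The sketch below is purely illustrative – every name in it is made up – and only shows that ranking content without personal data is technically trivial.

```python
# Sketch of one possible non-profiling option: newest posts first,
# the same for every user, with no personal data involved.
from datetime import datetime

posts = [
    {"id": 1, "topic": "gardening",  "posted": datetime(2023, 8, 25, 9, 0)},
    {"id": 2, "topic": "outrage",    "posted": datetime(2023, 8, 25, 8, 0)},
    {"id": 3, "topic": "local news", "posted": datetime(2023, 8, 25, 10, 0)},
]

def chronological_feed(posts):
    """Order posts by publication time only – no clicks, no profile."""
    return sorted(posts, key=lambda p: p["posted"], reverse=True)

print([p["id"] for p in chronological_feed(posts)])  # -> [3, 1, 2]
```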