[Image: Vlop! logo]

[Image: VLOP campaign key image]

Vlop! IS ALREADY HERE! YOU USE IT EVERY DAY

Thousands of notifications and likes, engaging videos, content that won’t let you put your phone down… Sounds familiar, doesn’t it?

Vlop!, “a new, revolutionary social media app”, does not, in fact, exist. We made it up for our campaign (see the video below). But VLOPs do actually exist, and you use them every day. VLOP is short for “Very Large Online Platform” and refers to apps like Facebook, YouTube, Instagram, X, or TikTok.

Watch the original ad [IN POLISH]


In our campaign, we highlighted the actual mechanisms VLOPs use in order to draw attention to the problems they cause. These problems stem from VLOPs’ business model and algorithms, which are optimized to hold users’ attention for as long as possible.

THE ALGORITHM WILL SUCK YOU IN

VLOPs aren’t looking for great ideas but for profits. The algorithms designed to deliver your personalized feed feed on your data and your vulnerabilities. They strive to hold your attention with emotive content, which is not only toxic but also addictive and, for some people – especially the young – may cause anxiety, depression, addiction, or eating disorders.
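
A minimal sketch of what such engagement-optimized ranking boils down to (the field names and weights below are our illustrative assumptions, not any platform’s actual code; real systems use far more complex machine-learned models with the same goal):

def user_affinity(user, item):
    """Hypothetical: overlap between the user's inferred interests and the post's topics (both sets)."""
    return len(user["inferred_interests"] & item["topics"]) / max(len(item["topics"]), 1)

def predicted_engagement(item, user):
    """Score a post by how likely it is to keep this particular user watching."""
    return (3.0 * item["emotional_intensity"]   # outrage and fear rank high
            + 2.0 * user_affinity(user, item)   # matches the user's profile
            + 1.5 * item["viral_velocity"])     # already spreading fast

def rank_feed(items, user):
    # The feed is simply the candidate posts sorted by predicted "stickiness".
    return sorted(items, key=lambda i: predicted_engagement(i, user), reverse=True)

Note that user wellbeing appears nowhere in the objective: whatever maximizes the score gets shown.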


“YOU WILL ASK FOR MORE”

When you score a high number of ‘likes’ on your social media posts, the same area of your brain is activated as when you eat chocolate or win money. Algorithms use the same mechanisms as casinos: they make us scroll endlessly, provoke FOMO (fear of missing out), and can eventually lead to addiction. That’s why it’s so difficult to put your phone down.


“WE KNOW WHAT YOU REALLY LIKE”

Big tech companies exploit the vast digital footprint we leave behind when using the internet. They know what you click, what you like, what catches your eye, what you discuss and with whom, and how often you use a particular app. Even if you don’t disclose certain information – for example, your sexual orientation or health history – the algorithms will infer it from your online activity. They will then use this information to feed you content that keeps your attention glued to the screen even more effectively. You don’t want to read about accidents involving children, but such posts catch your eye anyway? You will only be served more and more posts on that theme.
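
A toy illustration of that feedback loop (the field names and the dwell-time threshold are our assumptions): the profile is built from what you watch, not what you say you want, so even topics you never ‘liked’ get reinforced if they merely hold your gaze.

from collections import Counter

def update_profile(profile: Counter, viewed_items):
    """Behavioural profiling sketch: reinforce whatever the user lingered on."""
    for item in viewed_items:
        if item["dwell_seconds"] > 3:      # you lingered, so it "worked"
            for topic in item["topics"]:
                profile[topic] += 1        # reinforced, liked or not
    return profile

profile = Counter()
update_profile(profile, [
    {"topics": ["child accidents"], "dwell_seconds": 8},  # upsetting, but you stared
    {"topics": ["gardening"], "dwell_seconds": 1},        # scrolled past
])
print(profile.most_common())  # [('child accidents', 1)] -> more of the same next time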


“UNCENSORED PATOSTREAMING*”

VLOPs deliberately display the most engaging, emotional, and provocative content. Many controversial materials – like patostreams, extreme videos, and fake news – slip through the platforms’ moderation net and find their way into your feed.

*Patostreaming is the real-time video broadcasting of pathological behaviours such as bullying or violence.


“ONLY PEOPLE WHO THINK LIKE YOU”

Platforms deliberately stuff your feed full of content that confirms your views, usually published or ‘liked’ by your ‘friends’ – basically, people like you. That’s what we ‘like’, isn’t it? At the same time, you only get content from outside your bubble when your ‘friends’ have already laughed at it or criticized it, which increases the chance that you yourself will click the ‘angry face’. But it also causes different groups in society to drift further apart and makes mutual understanding more and more difficult.
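
A toy sketch of that bubble logic (all fields and weights are hypothetical): posts agreeing with your views score high by default, while out-of-bubble posts only surface once your ‘friends’ have already reacted to them.

def bubble_score(item, user):
    """Hypothetical filter-bubble ranker: agreement is rewarded, dissent needs friends' outrage."""
    agrees = item["stance"] in user["views"]
    friend_reactions = item["friend_reactions"]   # laughs, angry faces, etc.
    if agrees:
        return 2.0 + 0.5 * friend_reactions       # comfortable, confirming content
    return 0.8 * friend_reactions                 # dissent only via friends' reactions

def rank(items, user):
    return sorted(items, key=lambda i: bubble_score(i, user), reverse=True)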

[Image: VLOP campaign image 2]

A BETTER NEWSFEED IS POSSIBLE!

On 25 August 2023, a new law started to apply to the largest platforms: the Digital Services Act (DSA), also called the “new constitution of the internet”. It brings new obligations for VLOPs. The big tech companies can no longer pretend nothing is wrong.

Among other obligations, platforms have to amend their recommender systems. Since 25 August, they not only have to explain how their feeds work, but must also offer their users at least one alternative content-curation option that is not based on profiling (meaning it does not use personal data to recommend content). The platforms must also regularly analyze the risks related to their algorithms and demonstrate how they mitigate those risks.
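
As an illustration, a non-profiling alternative can be as simple as a reverse-chronological feed of the accounts you follow (a sketch under our own assumed data model; the DSA mandates the option, not this particular design):

def chronological_feed(items, followed_accounts):
    """No profiling, no tracking: just the newest posts from accounts the user chose to follow."""
    visible = [item for item in items if item["author"] in followed_accounts]
    return sorted(visible, key=lambda item: item["posted_at"], reverse=True)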

Read the report: Fixing Recommender Systems – From the identification of risk factors to meaningful transparency and mitigation.

WE NEED PRESSURE TO DRIVE CHANGE

Panoptykon and other civil society organizations advocate for transparency of the algorithms used by VLOPs and, moreover, for measures that mitigate their toxic effects. We want platforms’ recommender systems to be safe. We want people to have a real choice over their feeds – and we want platforms to respect that choice.

Whether or not platforms take their DSA obligations seriously largely depends on pressure from the people.

We monitor the changes introduced by platforms like Facebook or TikTok, and we test new ideas for better recommender systems. We present our findings in reports to the European Commission (which, according to the DSA, is in charge of ensuring platforms follow the new rules).

WANNA KNOW MORE?

Listen to Algorithms of Trauma 2: How Facebook Feeds on Your Fears to learn how the platform exploits its knowledge of your anxieties and fears.

Want to escape the algorithmic trap of commercialised social media? Check out the tools offered by the decentralised internet, e.g. Mastodon or Pixelfed.