Human biomass on Facebook
Does Facebook identify and manipulate your feelings? Is it able to recognize your personality type, habits, interests, political views, level of income? Does it use all this information to reach you with personalized ads or sponsored content? You bet! Mark Zuckerberg's empire is based on analysing the digital dandruff users leave behind on the Internet, transforming it into valuable consumer profiles and selling it to advertisers. In this factory you are not even a product. You are just human biomass, which gains value only after being shaped and worked on by an algorithm. How does it work exactly? Take a look.
As a Facebook user, you may feel loved and valued most of the time. When your birthday is around the corner, Facebook shows you a special video telling you what a wonderful and special person you are. It keeps reminding you of the good times you had with your friends and makes it easy to contact everybody. Moreover, you get a fresh portion of (disputably) relevant information selected especially for you – and (allegedly) for free – whenever you want! You don't need to spend hours on the Internet crawling through hundreds of websites and search results. Facebook is a great pal and it really cares about you, doesn't it?
No. Rather than a friend, Facebook is an impenetrable black box with special powers. The first is typical of any black box: it registers almost everything. You may think it is interested only in your activity within the social network, but that is not so. It tracks everything you do, even when you are not logged in. The Facebook black box is stuffed with data and hard-working algorithms that determine your personal experience of the Internet. The algorithms decide what appears on your screen and what is hidden from you. They are responsible for spreading the news and making some pieces of information more visible than others. They are responsible for a new form of labour and exploitation. The black box is, in fact, a factory. You, the user, are not a client. You are just raw material, human biomass converted into a sellable digital profile on the Internet stock market – in Facebook Ad Manager.
Why should you worry about a well-crafted ad or a stream of emotionally engaging news? Let's see. Do you have any say in what criteria are used to select content for you? Do you understand the logic behind presenting a particular piece of information in your browser? No. Users are not meant to understand the rules of the game – and they don't. Then how can we stand up against being the object of an obscure or dishonest deal? Facebook's neural networks are watching, listening and tracking patterns in your behaviour. With data from over two billion users interacting with each other daily, the algorithms learn very quickly. They know, certainly before you do, which of the posts in your News Feed you will click and how it will resonate across your network of Facebook friends.
People for sale
The first wake-up call about Facebook's shabby game came in 2014, when the company admitted to having manipulated the emotional state of over 700,000 unwitting users. The experiment was designed to analyse the behaviour of users exposed to negative and positive information. However, the case was forgotten amid the revelations about (alleged) Russian interference in the 2016 US presidential election and the Brexit campaign. It is impossible to judge whether microtargeted ad campaigns determined the final result of either vote. Nevertheless, it is no secret that Facebook employees brag in front of politicians about the company's effectiveness in political marketing, both in the U.S. and in Europe. "Facebook deploys a political advertising sales team, specialized by political party, and charged with convincing deep-pocketed politicians that they do have the kind of influence needed to alter the outcome of elections", claimed Antonio Garcia-Martinez, a former Facebook sales team employee. Martinez, who worked as a product manager for some years, openly argues that Facebook uses information about its users' emotional dispositions for marketing purposes. As he explains in an article published in the Guardian, it is the only reason why the platform is interested in knowing so much about us.
Emotions hit the roof in May 2017, when a leaked document from Facebook's Australian office proved the company's capacity to identify users' emotional states. It described unofficial research, conducted for internal sales purposes on a group of 14-year-old users, to demonstrate Facebook's effectiveness. How many brands would pass up the opportunity to directly target a teenager who is feeling "stressed", "defeated", "overwhelmed", "anxious", "nervous", "stupid", "silly" or "useless" – or who wants to lose some weight? Seems an easy target, right? And not only for a sneaker shop.
The secrets of the algorithms
Advertisers want Facebook to deliver the highest click-through rate, and choosing the right algorithm to accomplish this mission is crucial. However, this is not the only "task" Facebook's neural networks are occupied with. They are constantly learning how to recognize photos of living creatures and inanimate objects, how to parse different accents and analyse the human voice. Thanks to the neural networks, Facebook can better understand its users' conversations, thoughts and hidden desires. It is able to infer information about them and their habits (where they live and work, how they move around the city, where, how often and why they travel), their consumption style (what, when and why they buy) and their level of income (how much they pay for a certain product or service).
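To make the click-through-rate mechanics concrete, here is a deliberately simplified sketch – not Facebook's actual system, whose variables and models are secret – of how a learned model can rank feed items by predicted click probability. All the feature names and weights below are hypothetical, invented for illustration:

```python
import math

def predict_click_probability(weights, features):
    # Toy logistic model: sigmoid of the weighted sum of a post's features.
    score = sum(weights.get(name, 0.0) * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical weights, standing in for what a network would learn
# from billions of past clicks.
weights = {"matches_interests": 2.0, "posted_by_close_friend": 1.5, "is_ad": 0.5}

posts = [
    {"id": "news_article",   "features": {"matches_interests": 0.2}},
    {"id": "friend_photo",   "features": {"matches_interests": 0.9,
                                          "posted_by_close_friend": 1.0}},
    {"id": "sponsored_post", "features": {"matches_interests": 0.8, "is_ad": 1.0}},
]

# The "News Feed": posts sorted by how likely you are to click them.
ranked = sorted(posts,
                key=lambda p: predict_click_probability(weights, p["features"]),
                reverse=True)
print([p["id"] for p in ranked])  # → ['friend_photo', 'sponsored_post', 'news_article']
```

The point of the sketch is the feedback loop: every click you make adjusts the weights, so the model keeps getting better at predicting – and steering – what you will engage with next.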
Facebook's algorithm is claimed to take hundreds of (unknown) variables into consideration. The platform's interface gives users only limited control over some features (we can, for example, indicate our favourite websites or friends), but that is just the tip of the iceberg. Fortunately, we can always sneak a look at Facebook's patent applications in different countries and at its business communications (such as the documents leaked from the Australian office).
This is what Vladan Joler from Share Lab did when we started working on "Monologue of the Algorithm: how Facebook turns users data into its profit". Thanks to his amazing work, Facebook is no longer a black box – we can all see what happens inside the algorithmic factory.