Report | 17.04.2018

We are right to be worried about the polarization of public debate, the rise of populism and digital propaganda. It also goes without saying that social media have a growing impact on our politics and society. However, one should be cautious not to confuse observations with explanations. By focusing on fake news, bots and algorithms, it is easy to miss the real agents behind the screens: humans. Not only those who create content and tech tools, but also average users. Can we, as individuals, control and influence the quality or diversity of the information we receive online? Is it our responsibility to “consume responsibly”? Are we forced to live inside information bubbles, or can we do something about it?

Seeking answers to those questions, Panoptykon Foundation invited researchers from the University of Pennsylvania – Dr Emad Khazraee and Pawel Popiel – to design a case study of political debate on Polish Twitter, the social media venue most populated by political influencers and journalists. We wondered who was creating trends, who spoke to whom and what type of conversation it was. Did influencers with different opinions confront each other? Was the debate manipulated or otherwise influenced by false amplifiers?

In September and October 2017, when Polish streets and social media venues were fuming with civic unrest (women protesting against a ban on abortion, young doctors fighting for public healthcare reforms, citizens defending the independent judiciary), we collected and analysed nearly one million tweets. Here is just a snippet of what we found out:

- Polish Twitter does not encourage the confrontation of opinions. In both of the political bubbles that we identified, influencers talked about the same topics but hardly ever talked to each other.
- It is prominent individuals and well-established organisations (politicians, journalists, mainstream media) that shape political discourse on Twitter, not bots.
- Bots (false amplifiers) can be used to boost influence, but on their own they are not in a position to change or create trends.
- Using network analysis alone (without deeper, qualitative analysis), it is nearly impossible to differentiate false amplifiers from professionally managed accounts.

DOWNLOAD THE FULL REPORT [PDF]

Author: Fundacja Panoptykon