Article · 26.02.2026 · 12 min read

What needs to be done to make social media a safer space for people of all ages? Together with MEPs from three major political groups and officials from the European Commission, civil society experts led by Panoptykon debated possible solutions during a hearing held in the European Parliament on 24 February. Independent data paints a different picture from the claims made by VLOPs, yet the platforms dodged having to answer difficult questions: YouTube, TikTok, and Meta declined the invitation to the hearing.

The Digital Services Act – the EU regulation adopted in 2022 – was meant as a remedy for the multiple risks social media platforms pose to children, but also to adults. For the first time, the EU forced the biggest platforms to assess systemic risks, limit harmful design, and protect children. On paper, it was a real breakthrough. In practice, everybody, including the Commission, knew that it would take years of investigations, public pressure, and bold political moves to force VLOPs to comply.

How far are we in this fight in 2026?

At the invitation of Kamila Gasiuk-Pihowicz (EPP) and Irena Joveva (Renew), Panoptykon co-hosted a hearing in the European Parliament – Protecting children online: Europe at a crossroads. At the hearing we confronted the world described in VLOPs’ official DSA reports with the world documented by independent researchers and civil society organisations, and experienced by young people themselves.

These two worlds do not match.

We invited representatives of Meta, TikTok and YouTube to join the discussion and answer questions directly, yet none of them showed up.

As the white paper Artificial Intelligence in Education: State of Knowledge, Competency Gaps and Systemic Recommendations for Poland – shared with us by our colleagues just yesterday – makes clear, children need “the ability to understand the impact of AI systems on democratic processes, the phenomenon of post-truth and deepfakes”, and this capacity, the paper argues, is “the very foundation of digital citizenship.” Platforms that exploit rather than cultivate that capacity are failing our children – and failing the law. It is deeply concerning that companies whose representatives and lobbyists are frequently present in this house and across Brussels decided not to take part in today’s discussion with the Commission, Members of Parliament and civil society. Closed-door meetings are not enough. We are asking very basic questions – questions that come directly from the platforms’ own reports. Civil society is not demanding confidential business information. We want clear data and measurable indicators. We need to know whether the measures platforms describe on paper are actually working in real life.

MEP Kamila Gasiuk-Pihowicz, EPP

Photo: MEP Kamila Gasiuk-Pihowicz, EPP (TZOVARAS Stavros)

What platforms claim – and what the research shows

Across their DSA risk assessments, platforms insist they are providing “the strongest safeguards” for minors, especially by:

- reducing addictive design,
- introducing prompts such as “go to bed” and time limits (in the case of minors, even “by default”),
- moderating harmful content effectively,
- preventing algorithmic amplification of illegal, harmful and age-inappropriate content,
- verifying users’ age.

We would be among the first to applaud these policies. But independent research paints a very different picture.
During the hearing, researchers and civil society experts focused on three areas where the gap between the claims made by VLOPs and reality is most striking.

1. Addictive design: platforms are still designed to keep children hooked

VLOPs highlight that they all offer screen-time management tools and nudges to take a break. Yet research consistently shows that infinite scroll, autoplay, and hyper-personalised feeds still work exactly as intended – to maximise time spent on the platform. Their design choices are optimised for attention extraction rather than well-being, with young people describing their experience as being “stuck” in endless sessions.

According to a recent survey by Reset Tech in partnership with YouGov [1], 93% of European youngsters experience that feeling at least once a week, and 63.5% every single day. One in five of these teens reported losing sleep every night.

The most addictive platforms, according to the same survey, are TikTok, Instagram and YouTube: 75% of teen TikTok users, 69% of Instagram users and 68% of YouTube users reported staying on these platforms longer than they had intended.

The users know exactly what is trapping them: over 80% point directly to endless streams of content and hyper-personalised feeds as the cause. Shifting the burden to “user choice” is a proven failure. Our survey shows that most teens do not use time-management tools, often because they simply do not know they exist. For instance, 41% of teenagers surveyed didn’t know YouTube had these tools.

Michiel Van Hulten, Reset Tech

2. Harmful content: algorithms still recommend what hurts

Platforms claim that their measures to limit minors’ access to age-inappropriate content (such as parental controls, minimum age requirements, signals for estimating the age of users, etc.) are effective – even if not perfect, they “result in a significantly lower residual risk profile”, to quote from YouTube’s risk assessment [2]. Yet, according to experiments by Amnesty International, Reset Tech, and others, recommender systems actively push harmful material: minors are still exposed to pro-eating-disorder, self-harm, and depressive content.

In a recent study conducted in Poland, 39% of 7th-grade students stated that they were receiving suicide-related content on social media without seeking it. Another study [3] suggests that “minors face disproportionately higher levels of harmful videos, spanning 7-15% versus 4-8% for adults”, which suggests that algorithmic systems push harmful content to children’s accounts regardless of the potential harms.

In addition, the effectiveness of reporting mechanisms is questionable. According to research on content moderation under the DSA conducted by HateAid [4], “after exhausting all available legal remedies, only 57% of reported illegal content was removed during the project period”.

3. Age verification: a leaky barrier that shifts responsibility from platforms to their users

Platforms proudly present age-control tools as a solution. In reality, their gates are wide open to minors: over 58% of Polish children aged 7–12 have a social media account [5]. Parental control tools do not solve the problem either – they just shift the responsibility from platforms onto children and their parents, as if the harms experienced by children could be prevented with more (self-)discipline.
The problem will persist even if the minimum digital age is raised. As youth activist Leandra Voss put it at the hearing: “Age restrictions shift the blame away from the harmful design of the platforms. Thus they simply delay when this harm is done to people.”

Photo: MEP Christel Schaldemose, S&D (left), Michiel Van Hulten, Reset Tech (right) (TZOVARAS Stavros)

This is not a bug. It is their business model.

The business model of big tech giants is unacceptable towards our kids. (…) They come online in their very formative years. They spend more time online than with their teachers and parents. And yet we have not been able to fix the situation. The aim of this legislation [the DSA] is crystal clear: we wanted to protect our kids. Unfortunately, it is a daily struggle with online platforms, because they are not doing it voluntarily. We have to push them over and over again (…). In the Parliament we made a report on minors, where we call for a harmonised EU age limit. In an ideal world it would not be necessary. (…) But in this world we have now, the tools, the social media, are simply not safe enough.

MEP Christel Schaldemose, S&D

A rare consensus: three political groups, one diagnosis

The alignment among MEPs from the EPP, S&D, and Renew – three major political groups – was symptomatic of the urgency of the task that European institutions are facing: the business model of platforms is incompatible with children’s safety. The EU has the legal and political tools to demand fundamental changes, not superficial corrections. 2026 is the time to use them!

Platforms profit from attention. Their incentives are to keep users scrolling, not to protect users’ wellbeing. This cross-party consensus matters: it signals that the political centre of gravity in Europe is shifting.

Infinite scroll is not just a feature but a design for one more dopamine hit. It has always been so, and it will remain so, because this is the business model (…). Allowing them to do as they wish to maximise engagement is not an option. Clearly the regulators are the ones who have to draw the lines. We need to stop asking if the kids are disciplined enough. We need to start asking if the product is safe enough. We should be drawing the lines for the dealers, not the innocent users. Safety by design isn’t a limitation of freedom, it’s a legal necessity for a functioning society.

MEP Irena Joveva, Renew

Photo, from the left: MEP Irena Joveva (Renew), MEP Kamila Gasiuk-Pihowicz (EPP) and MEP Christel Schaldemose (S&D) (TZOVARAS Stavros)

The Commission was in the room and acknowledged both the problem and its readiness to act on it

European Commission officials responsible for reviewing platforms’ DSA reports and drafting the upcoming Digital Fairness Act participated in the debate and made several crucial commitments.

1. Platforms must ensure that their reports are “comprehensible”

Commission officials recognised what civil society has been saying for months [6]: the current risk assessments provided by platforms are vague, with unclear methodologies and claims that are often not backed by data. Platforms’ claims must be matched by evidence; without it, these reports are just empty statements.

It must be clear how the services work, how the risks come about and how the assessment is arrived at. This is also the expectation we absolutely have towards regulated entities.

Eike Graef, European Commission – DG Connect
2. Researchers and CSOs are invited to share their evidence

The Commission announced the launch of the DSA Whistleblower Tool, which enables researchers, civil society organisations, and independent experts to submit evidence of harmful practices and to access data relevant to DSA enforcement.

3. The Commission is ready to go further than the DSA

Officials confirmed that the upcoming Digital Fairness Act may introduce: stricter rules on addictive design, obligations to switch off certain manipulative features, ‘user empowerment’ requirements for recommender systems, and specific, higher protections for children as consumers.

In the meantime, the question of a minimum digital age is being debated by a high-level expert panel convened by President von der Leyen. If political pressure on the issue across Europe continues, new Europe-wide legislation will follow. The technical side of age verification is outside the scope of the Digital Fairness Act; however, age assurance may be introduced as one of the measures for protecting minors in this upcoming update of horizontal consumer law.

We share your pain when it comes to the lack of data. At this stage of our impact assessment [for the DFA] it is very important for us to have reliable data, good metrics. Again, I invite you to provide us with any research you have. I am also saying this to colleagues from industry, if they are listening somewhere.

Maria-Myrto Kanellopoulou, European Commission – DG Just

Photo: Maria-Myrto Kanellopoulou, European Commission – DG Just (TZOVARAS Stavros)

Why this hearing mattered

The stakes are too high to rely on self-reporting by companies whose profits depend on keeping children online. That was the main reason we organised the hearing and invited all stakeholders to participate in the exchange: we wanted platforms to hear our questions, explain how their mitigation measures are meant to work, and share relevant metrics that could back up their claims. By choosing not to attend, they showed a lack of interest in public consultations with civil society and decision-makers alike.

Across participants, we shared the concern that, two years into DSA enforcement, platforms’ declarations of intent are not enough to address harm, and that risk assessments should be carried out transparently. It was agreed that mitigating online risks for children requires confronting VLOPs’ business model, which is at the heart of the problem.

Adoption of the DSA in 2022 was a crucial first step. Uncompromising enforcement is what needs to happen now. As CSOs, we will continue to support the Commission by providing independent evidence. We will also make sure that the upcoming DFA explicitly prohibits the pervasive dark patterns (still) used by VLOPs and (at least) the most obvious forms of addictive design.
Author: Katarzyna Szymielewicz
Cooperation: Maria Wróblewska

Links and documents

Brief for the Hearing at the European Parliament on DSA Enforcement and the Protection of Minors (Feb 2026), PDF, 351.21 KB

[1] Reset Tech x YouGov, Stuck Sleepless in the App: How Platforms Designed for Extended Use Impact Minors.
[2] Google, Report of Systemic Risk Assessments, 2025.
[3] Eltaher F., Gajula R.K., Pechuan L.M., Crotty P., Protecting Young Users on Social Media: Evaluating the Effectiveness of Content Moderation and Legal Safeguards on Video Sharing Platforms.
[4] HateAid, Rights Without Reach.
[5] Instytut Cyfrowego Obywatelstwa, Internet dzieci.
[6] Mozilla Foundation, Request for Basic Data.