SIN vs Facebook

Online platforms such as Facebook, YouTube, and Twitter increasingly control what you can see and say online. These measures are supposed to target harmful content, such as hate speech or incitement to violence. Unfortunately, there have also been numerous instances in which legal and valuable content was removed, including historical photos, war photography, publications documenting police brutality and other human rights violations, coverage of social protests, works of art, and satire.

Such unjustified and excessive removal of content by online platforms is often referred to as ‘private censorship’. It is particularly dangerous today, when the online environment is dominated by a handful of platforms with global reach, which have become key channels of communication as well as important sources of news.

Private censorship is further exacerbated by the lack of transparent rules on content moderation or effective appeal procedures, which makes it difficult to challenge a platform’s decision.

This needs to change! In the case SIN vs Facebook we are fighting to defend the rights of users whose freedom of expression was unduly restricted by arbitrary banning by the tech giant. In March 2024 we won before the court of first instance. But the case is not over yet – Meta appealed, and the case went to the Appeal Court.

SIN vs Facebook case

The Civil Society Drug Policy Initiative (Społeczna Inicjatywa Narkopolityki, or “SIN”) is a Polish NGO which has for many years conducted educational activities concerning the harmful consequences of drug use as well as provided assistance to people who abuse such substances, including harm reduction activities.

In 2018, without any warning or clear explanation, Facebook removed fan pages and groups run by SIN. The platform had characterized them as “in violation of Community Standards”.

In January 2019, one of SIN’s accounts on Instagram, a platform owned by Facebook, was also removed in similar circumstances.

On 7 May 2019, SIN – supported by the Panoptykon Foundation – filed a lawsuit against Facebook, demanding restoration of access to the removed pages and accounts, as well as a public apology.

Support us

We continue our fight against private censorship and are preparing for the costs of the next stage of the legal action.

Case background

Since 2011 SIN has run a Facebook fan page, which it used to warn against the dangers of substance abuse. Then, in 2018, Facebook’s moderators (or its algorithms) raised objections to SIN’s activity: first one of the SIN-moderated groups, and then the entire fan page, followed by 16,000 users, were found “in violation of Community Standards” and removed. SIN attempted to challenge these removals through the mechanism provided by Facebook, but to no avail. To this day, members of SIN do not know which particular content Facebook deemed a violation of its Community Standards, or why.

SIN’s target audience is young people, who are particularly at risk of experimenting with drugs and are active on social media. Facebook was the organisation’s key communication channel, which SIN used to promote its activities and mission, to contact its volunteers, and to raise funds.

Through Facebook, people using drugs could seek SIN’s help, while Instagram allowed SIN to reach younger users. The removal of these pages, groups, and accounts has made it considerably more difficult for the organisation to carry out its educational activities and other statutory tasks, as well as reduced the reach of the published information and the possibility to communicate with a larger audience.

Case timeline
7.05.2019. Lawsuit filed

SIN, supported by the Panoptykon Foundation, filed a lawsuit against Facebook, demanding restoration of access to the removed pages and accounts, as well as a public apology.

11.06.2019. Interim measures ruling

The District Court in Warsaw, in its interim measures ruling delivered on 11 June 2019, temporarily (for the duration of the proceedings) prohibited Facebook from removing fan pages, profiles, and groups run by SIN on Facebook and Instagram, as well as from blocking individual posts. The court furthermore obliged Facebook to store the profiles, fan pages, and groups deleted in 2018 and 2019 so that – if SIN eventually wins the case – they can be restored together with the entire published content, comments by other users, followers, and people who liked the fan pages. The court also confirmed its jurisdiction and that Polish law applies to the case. The ruling is not final – Facebook has the right to appeal it.

More information about the ruling

6.08.2019. Statement of claim returned by Facebook

Facebook’s attorney argued that Facebook was entitled to refuse to accept the documents because “none of the Facebook Ireland employees on the litigation team are Polish speakers”. The court summoned SIN to make an advance payment for the translation of the court documents. On 6 August 2019 SIN sent the court its position as claimant, pointing out that Facebook directs its services to Polish users in Polish, which proves that the company can defend its rights without an English translation of the statement of claim. SIN requested the court to determine whether Facebook’s refusal to accept the court documents was justified.

More information (in Polish) about the language of the case

12.03.2020. The translator urged to speed up their work

Although the court had not yet settled the dispute regarding the language of the proceedings, at the beginning of January the lawsuit (including the interim measures ruling) was sent to a translator. The translation of the 28-page document has been going on for over two months, so SIN asked the court to urge the translator to speed up their work. The lack of progress in the case so far is a success for Facebook’s strategy of using procedural questions as a pretext to prolong the proceedings. Unfortunately, given the current epidemiological situation, this strategy has a good chance of succeeding.

July 2020. SIN obliged to pay for the English translation of the case files

The court obliged SIN to pay 8,841.56 PLN (approx. 2,000 EUR) for the English translation of the lawsuit and the interim measures ruling. It also decided to deliver the translated documents to Facebook via the court in Ireland.

15.09.2020; 19.10.2020. Facebook’s response to SIN’s lawsuit and its appeal against the interim measures ruling

Facebook filed a response to SIN’s lawsuit and an appeal against the 2019 interim measures ruling.

Facebook questioned both the infringement of SIN’s personal rights and the jurisdiction of the Polish court. The company asked the appellate court to dismiss SIN’s lawsuit and to overturn the interim measures ruling.

16-20.11.2020. Crowdfunding campaign for the translation bill

In less than 5 days we raised the amount needed to cover the cost of translating the lawsuit and the interim measures ruling, as ordered by the court. We raised 9,305.80 PLN (the exact cost of the translation being 8,841.56 PLN). A big thank you to everyone who contributed to our campaign!

Still, we believe that – in a dispute with a global corporation like Facebook – everyone should have the right to argue their case in their own language. Thus, we lodged an appeal against the court decision to translate the documents. If our appeal is accepted, we will use the funds to cover further case-related costs.

26.11.2020. SIN response to Facebook’s appeal

SIN filed a response to Facebook’s appeal of the interim measures ruling. SIN addressed Facebook’s arguments, and asked the court to quash the company’s appeal and to uphold the first instance decision from 2019.

14.05.2021. Facebook’s appeal quashed. The original interim measures ruling upheld

The Appellate Division of the District Court in Warsaw quashed Facebook’s appeal against the 2019 interim measures ruling (which was favourable to SIN). The decision is final and binding for Facebook! It means that for the duration of the proceedings:

  1. Facebook is prohibited from removing current fan pages, profiles, and groups run by SIN on Facebook and Instagram, as well as from blocking individual posts;
  2. SIN’s profiles, fan pages, and groups removed in 2018 and 2019 are to be securely backed up, so that – if SIN eventually wins the case – they can be restored together with the entire published content, comments by other users, as well as followers, and people who liked the fan pages;
  3. the Court has also confirmed that it has jurisdiction to hear the case, and that Polish law applies to it.

Read more about the Court’s decision

16.03.2022. The court decision obliging SIN to pay for the English translation of the case files upheld

SIN is to cover the cost of the English translation of the lawsuit and the interim measures ruling (8,841.56 PLN, approx. 2,000 EUR) after the court dismissed SIN’s appeal on the matter. The decision is final. However, if Facebook (now Meta) eventually loses the case, it will be obliged to reimburse the fee. Meanwhile, we paid it thanks to the individual donations we received for this purpose in 2020 (big thanks to everyone who contributed!).

25.01.2023. Amicus curiae brief by German Society for Civil Rights, and Meta’s demand to reject it

The Society for Civil Rights e.V. (Gesellschaft für Freiheitsrechte e.V. or “GFF”), in its brief filed with the Warsaw District Court, summarised the recent case law of German courts, including the Federal Tribunal, in cases similar to SIN vs Facebook, i.e. concerning arbitrary censorship in social media. According to the German Federal Tribunal, online platforms cannot arbitrarily block content or remove social media accounts. It confirmed that users should be entitled to a justification of the platform’s content moderation decisions and should have an effective right to appeal. Moreover, if the platform intends to remove an entire account or page, it should in principle hear the user’s counterarguments before the removal.

Meta requested that the Polish court refuse to accept the amicus curiae brief.

More about the brief (in Polish)

7.02.2023. The first court hearing before the Warsaw District Court

The court questioned the president of SIN, who testified about the organisation’s activities and the negative effects caused, in particular, by the removal of SIN’s main Facebook page. The court also admitted evidence from the examination of two former board members of the organisation who managed SIN’s social media accounts at the time of the removals. They will be questioned at the next hearing about the specific circumstances of the removal of SIN’s Facebook and Instagram accounts and groups.

More about the first hearing

23.06.2023. The second court hearing before the Warsaw District Court

The court questioned a former SIN board member who was responsible for managing the organisation’s social media accounts at the time of the removals. The final witness in the trial, SIN’s former president of the board, will be questioned at the third (and, most likely, last) hearing, scheduled for 13 February 2024 at 10:00. We expect the court to deliver its judgment soon afterwards.

More about the second hearing (in Polish)

13.02.2024. The third court hearing before the Warsaw District Court

At the hearing, the court heard the last witness: the president of SIN at the time its Facebook account was blocked. The court then closed the proceedings and announced that it would deliver its judgment in a month.

13.03.2024. Judgment of the Warsaw District Court

We won before the first instance court! The Warsaw District Court ruled that, by arbitrarily blocking the accounts and groups of SIN, Meta infringed the organization’s personal rights. The court obliged Meta to restore the blocked content and publicly apologize to SIN.

More about the judgment

July 2024. Justification of the judgment delivered to parties

Meta appealed the decision on jurisdiction and the judgment on the merits to the Court of Appeals.

Justification of the judgment of the court of first instance – analysis (in Polish)

Frequently asked questions about the SIN vs Facebook case

The story of SIN is a good example of the threats posed by private censorship online. In the SIN vs Facebook case we are fighting for the court to recognise that the non-transparent and arbitrary actions of the social network led to an unjustified infringement of the organisation’s rights. We hope that as a result of the case:

  • online platforms will improve the transparency of their decision-making process in cases of removal of content: in particular so that a banned user knows specifically which content was found inadmissible, exactly which clauses of the Community Standards it violated, and why;
  • any bans imposed by the platforms will be proportionate to the infringement: while platforms should remove any harmful content, one problematic post should not, in principle, result in the removal of the entire page or account;
  • blocked users will find it easier to effectively challenge any removal decision which they believe to be wrong or unjustified. We want the online platforms to justify their decisions, and to create an internal appeal mechanism which would permit an effective challenge of their decisions and respect banned users’ right to be heard.

More information about our long-term goals in the What needs to change to curb the risks of private censorship? section.

A favourable judgment would serve as a helpful precedent for other persons who believe that they have been unfairly banned by an online platform, and would make it easier for them to assert their rights in court.

The lawsuit will hopefully help change the practices of online platforms (more information in the What needs to change to curb the risks of private censorship? section). Using lawsuits to fight for systemic change is often referred to as “strategic litigation”. Examples of NGOs successfully using strategic litigation to strengthen human rights protection in the context of new technologies include the following cases:

  • Digital Rights Ireland convinced the Court of Justice of the European Union to annul the EU Data Retention Directive (covering e.g. phone records). In many EU countries the judgment sparked legal changes which boosted data protection and the right to privacy in the context of data storage;
  • Max Schrems, the founder of NOYB – the European Center for Digital Rights, filed a complaint against Facebook. It led the Court of Justice of the European Union to annul the European Commission’s decision on the Safe Harbour scheme. The scheme allowed American companies to transfer personal data of EU citizens onto their servers in the US without providing the necessary data protection safeguards. In fact, as demonstrated by the information released by Edward Snowden, that data could easily be accessed by American security agencies;
  • in a settlement with the American Civil Liberties Union, Facebook promised to no longer allow advertisers of loans, job offers and real estate to target their ads in a discriminatory manner (e.g. on the basis of age or race).

The SIN vs Facebook litigation complements Panoptykon’s other advocacy, education, and research projects aimed at promoting the responsible and transparent operation of online platforms (including their content moderation practices). For example:

  • we advocated for regulations curbing private censorship to be included in the Digital Services Act;
  • we joined NGOs from across the world in signing a letter to Mark Zuckerberg concerning due process in content moderation;
  • we drafted an opinion on the proposal of the EU’s Regulation on preventing the dissemination of terrorist content online (submitted to the Polish Ministry of Digital Agenda);
  • we filed complaints against Google and IAB, concerning data protection in the context of digital advertising.

Facebook’s arbitrary banning of SIN’s fan pages, accounts, and groups infringed SIN’s personal rights (articles 23-24 of the Polish Civil Code).

In the case of legal entities (e.g. companies, organisations, etc.), personal rights are “non-pecuniary assets which allow [an entity] to carry out its statutory operations” (judgment of the Polish Supreme Court of 14 November 1986, case no. II CR 295/86). To achieve its objectives, a public interest organisation such as SIN needs to reach the broadest audience possible. In practice this is not feasible without Facebook.

In our lawsuit we claim that the following personal rights of SIN were infringed:

  • freedom of speech: as the ban imposed by Facebook prevented SIN from freely expressing its opinions, disseminating information, and communicating with its audience;
  • reputation and recognition: as banning SIN on Facebook suggests that the organisation’s activity was harmful and thus undermines SIN’s trustworthiness.

We claim that Facebook’s actions were unlawful: the company infringed the principles of freedom of speech, due process, and the right to an effective remedy.

These principles should be respected not only by states, but also by private entities including, as is the case here, global tech giants. Online platforms’ actions cannot undermine users’ fundamental rights. Internal regulations which allow for arbitrary censorship of content are void, and a user’s consent (expressed, for example, by accepting the terms and conditions) is ineffective. This is especially the case where users are faced with monopolies and have no real choice but to agree to the Terms & Conditions that online platforms impose on them.

The case in its first instance was reviewed by the Warsaw District Court (Poland). Meta filed an appeal before the Appeal Court in Warsaw.

SIN (Civil Society Drug Policy Initiative) is a Polish NGO created in 2011. It focuses on drug abuse education and harm reduction, i.e. efforts aimed at reducing the negative consequences of substance abuse. SIN works primarily in clubs and at music festivals as well as online, where it provides assistance and raises awareness of the risks of particularly dangerous drugs.

Harm reduction focuses on protecting the health and lives of drug users. According to this approach, while the use of drugs should be discouraged, those who cannot be dissuaded from using them should be encouraged to do so in a manner least harmful to themselves and those around them.

Over the years, harm reduction has helped save countless lives around the globe. The approach is recommended by the United Nations, the European Union (EU drugs strategy 2013-2020), the National Bureau for Drug Prevention, the Red Cross, Médecins du Monde, and hundreds of other institutions and organizations that work in the field of drug abuse prevention.

More information on SIN’s website

First of all, we wanted to support a public interest organisation that uses Facebook to advance its goals.

Secondly, we took into consideration the importance of Facebook for SIN’s statutory work (given, for example, its target audience, which is predominantly young people). Removing the fan page considerably interfered with SIN’s everyday work. Research confirms that drug users are often “immune” to institutional, top-down education (from experts, in schools); direct communication and engagement is a more effective way of reaching this group. This also shows that Facebook is crucial to NGOs conducting public health campaigns, especially when it comes to connecting with young people, who tend to be more difficult to reach via traditional media.

SIN is the plaintiff (the suing party) in these proceedings, and Facebook is the defendant (the sued party). Panoptykon provides legal support to SIN. We have joined forces with Wardyński and Partners, a law firm, who have agreed to represent SIN in court pro bono.

The case SIN vs Facebook is also a part of our wider campaign aimed at protecting human rights in the online platform environment (check the Why do you believe that suing Facebook is the best way forward? section).

Facebook has over 3 billion users globally (in about 100 countries), and 20 million in Poland alone. When we launched the case, it was by far the most popular social network in Poland, used by roughly 80% of internet users and having a greater reach than two of the largest online information services in Poland, Onet and Gazeta.pl, combined. 

Facebook controls the online world not only because of its financial power and market dominance, but also thanks to the special role it has come to play in our society as a crucial channel of communication, a vital forum of public debate, and a leading source of information. Consequently, the way it decides to moderate content shapes how we see the world and other people.

It is time for Facebook to finally accept that with great power comes great responsibility. We firmly believe that a company with such an incredible influence over our freedom of expression has to respect it, and we have the right to hold it accountable should it fail.

Of course the issue of private censorship is not limited to Facebook. It affects other online platforms as well, in particular those that, together with Facebook, control an important part of the information circulated online (for example YouTube, X, TikTok, and Google Search).

A 2016 study analysed the Terms and Conditions of 50 internet platforms and concluded that as many as 88% of them have the right to ban an account without a warning or any means of challenging such decisions. While our lawsuit concerns Facebook, we hope that it will lead to systemic changes to also regulate other platforms (check the What needs to change to curb the risks of private censorship? section).

Quitting Facebook may work for individuals, but from an NGO’s perspective it is not really an effective strategy. Why?

  • First, we are well aware that even if every person reached by our request actually deleted their Facebook account, given the scale of the company’s operations this would have no practical effect on it. We wanted to do something that would have a real impact.
  • Secondly, we believe that Facebook can be used for the common good by organisations, journalists, artists, etc. We hope that in the future the market for social networks will be more competitive, allowing everyone to choose the platform with the strongest ethics and best standards of service. Since this is not the case today, we want to help those for whom Facebook is an important tool in their research, charitable, or professional activities, and who have no real alternative to it because of its dominant position.
  • Thirdly, the problem does not only concern Facebook, but also other internet platforms. So even if you are not a Facebook user, you are not safe from private censorship online.

When we filed our lawsuit in May 2019, we knew that Facebook would not make our case easy. However, we were surprised at how hard the company tried to avoid confronting our core arguments, delaying the proceedings instead. For instance, Facebook refused to accept our lawsuit because “there are no Polish-speaking employees in its litigation team”. You read that right: a company with almost 20 million Polish users claims that it “does not understand Polish”.

As a result, the court commissioned the official translation of the case documents, and ordered SIN to pay the costs: 8,841.56 PLN* (2,000 EUR). We launched a (successful) crowdfunding campaign to raise this amount to be able to continue our struggle in court. Thank you all who supported the campaign!

* Still, we believe that in a dispute with global corporations such as Facebook everyone has the right to argue their case in their own language. Thus, we lodged an appeal against the court decision to translate the documents. If our appeal is accepted, we will use the collected funds to cover further case-related costs.

Frequently asked questions about private censorship

There have been plenty:

  • The Pulitzer-winning photo of a naked girl fleeing a napalm bombing during the Vietnam War, known as the Napalm girl. The 1973 picture by Nick Ut was republished by a Norwegian newspaper – and promptly taken down by Facebook for allegedly promoting child nudity. The same happened to an archive photograph of Jewish children who had been stripped and starved by the Nazis during the Second World War, posted by the Anne Frank Centre in Amsterdam. Ironically, the Centre’s post commented on dwindling Holocaust awareness in the US.
  • Pictures of the 2017 Marsz Niepodległości (the Independence March – an annual demonstration to mark the Polish Independence Day) taken by Chris Niedenthal. The photos depicted participants of the March: some of them were masked young men wearing symbols of extreme nationalist organizations and holding red burning flares. It seems that Facebook may have concluded that the pictures promoted totalitarian symbols.
  • Works of art such as The Origin of the World by Gustave Courbet or The Descent from the Cross by Peter Paul Rubens, both of which feature female nudes and other naked figures. Facebook found the art problematic because it allegedly “promoted nudity”.
  • Pictures of people whose appearance is “unusual”: a picture of plus-size model Tess Holliday, promoting an event organized by an Australian feminist organization, and a picture of a severely burnt Swedish man, Lasse Gustavson, a former firefighter who lost his hair, eyebrows, and ears in the line of duty. In the case of Holliday’s photo, Facebook apparently concluded that it violated its rules on the promotion of a healthy lifestyle. The reasons for removing the picture of Gustavson are unknown.
  • The fan page of VIZ, a popular British satirical magazine that has existed since 1979. The magazine, famous for its provocative humour, creates parodies of British comic books and tabloids. It is not clear exactly which post led Facebook to ban the page. In Poland, YouTube removed one of the episodes of the satirical show Przy kawie o Sprawie (loosely translated as “Discussions over coffee”) titled “Is it OK to hit men?”. The episode concerned violence and discrimination against women. It was blocked because it allegedly incited violence.

Many of these take-down cases have caused a public outcry, leading Facebook to admit that it was wrong and restore the removed content. Unfortunately, not every author of a removed post can count on the support of public opinion. We hope our lawsuit will change this, and that every user will gain a real possibility to successfully challenge private censorship.

Arbitrary and non-transparent content moderation by online platforms such as Facebook limits our freedom of speech, including the right to information. Even if we find a decision to remove content unfair, wrong, or harmful, we have no tools to question it.

This may have a number of negative consequences:

  • Quality of information

    A private company decides what a user can see or share. Such a company is driven by its own profits, not by public interest. Therefore, it may tend to promote content that is profitable (e.g. emotive posts which help boost advertisement revenue), rather than content of high quality and informative value (content that is not “click-bait” may be at a greater risk of being banned).

  • Informal government pressure

    Governments can restrict access to politically inconvenient content. Instead of following the required procedures (e.g. obtaining a court order), they can “choose the easy way” and use the self-regulation mechanisms of the online platforms to achieve the same result without the procedural restrictions. Such incidents have already been reported.

  • Pluralism in public debate

    Non-transparent rules of content moderation mean that online platforms, when deciding to remove certain posts or pages, can be influenced by their private beliefs and political opinions, e.g. to favour the right or the left of the political spectrum.

  • Discrimination

    Moderation criteria, which are introduced and enforced by online platforms in an arbitrary manner, may result in discrimination against certain communities (e.g. LGBT people or members of religious groups). Minorities are particularly exposed to so-called abusive flagging campaigns – coordinated efforts by large groups of their opponents to get content generated by minorities banned. These communities already find it difficult to make their voices heard in the public debate. Banning them on Facebook only worsens their marginalization.

  • Technical errors

    Tools used by online platforms to moderate content (filters, moderators) are not infallible. Errors can result from misinterpreting the context of a publication (as in the case of the “Napalm girl” photo – check the Who else has been a victim of private censorship online? section for more examples). Facebook and YouTube are known to have removed publications documenting war crimes in Syria and violence against the Rohingya in Myanmar. In doing so, the platforms obstructed the work of prosecutors and human rights organizations who could have used such materials as evidence against the perpetrators in court proceedings.

We want to put an end to dominant online platforms arbitrarily dictating the limits of freedom of expression without any accountability for their decisions. What needs to change to achieve this?

  • Online platforms should comply with the freedom of expression standards developed by, for example, the European Court of Human Rights on the basis of the European Convention on Human Rights.

    What does this mean?

    • If a particular statement is acceptable in public debate in accordance with the freedom of expression standards, it should be permitted also on online platforms.
    • Even if a particular post or video exceeds the limits of free speech, any “sanctions” imposed by the internet platforms should always be foreseeable and proportionate (e.g. posting one abusive post cannot result in removal of the whole page or account).
    • Freedom of expression is not an absolute right and therefore it can be restricted to prevent abuse. This is why online platforms have the duty to prevent harmful content (e.g. hate speech). Removal of such content is justified and cannot be considered an attack on freedom of speech (more about this topic in the Isn’t the fight against excessive blocking tantamount to allowing more hate speech and other harmful content online? section).
  • Online platforms should create internal mechanisms (“due process”) which will ensure that their decisions are made in a transparent and non-arbitrary manner.

    What does this mean? A user whose content was removed should receive an explanation stating the reasons for the decision, including:

    • Identification of the content that was deemed unacceptable;
    • A reasoned statement identifying the clause of the Community Standards that was breached;
    • Information on how the particular piece of content was identified as potentially abusive and how it was assessed (e.g. whether it was notified by another user, or whether it was “caught” by an automatic filter; whether the decision to remove the content was made by a human moderator or by an algorithm);
    • Information concerning what the user can do if he or she disagrees with the platform’s assessment.

    This would allow each user not only to dispute the objections raised by a platform, but also to avoid similar situations in the future.

  • In order to be able to effectively challenge a decision, each user should have the opportunity to present arguments in their “defence”. The user’s appeal should be considered by persons who were not involved in the making of the original decision and it should be carried out within a clearly pre-determined time frame.
  • Users should have the possibility to have the final decisions of platforms verified by an independent external body, such as a court of law.

    What does this mean?

    If a user believes that the removal of their content by a platform was wrong and that they had no real opportunity to defend themselves, they should be able to turn to a court, which can analyse the matter and order a revision of the decision (i.e. restoration of the removed content or account). This has been recommended by, among others, the Council of Europe. It is a court, not a private company, that has the authority to competently assess whether a particular statement exceeds the limits of free speech. This is why the courts should have the final say in these matters.

In 2023 a new regulation came into force: the Digital Services Act (DSA) obliged internet platforms to justify their moderation decisions and to create effective internal appeal procedures. Has Meta complied? That will be verified by the European Commission, which, under the DSA, supervises platforms’ compliance with the new rules.

We should not forget, though, that even the best internal procedures are only part of the solution to private censorship. In the SIN vs Facebook case we advocate for every user’s right to seek resolution from an external body such as a court of law (see more in the What needs to change to curb the risks of private censorship? section). The DSA also provides for a brand new type of institution to be created for this purpose.

The judgment of the District Court in Warsaw in our case has already proved that Meta can be sued in Poland and that disputes over private censorship on internet platforms may be resolved in court.

No! An Internet that respects users’ right to freedom of expression does not equal allowing hate speech, incitement to violence, or other harmful content. Without a doubt, online platforms are an important link in the fight against this type of breach, and they should remove such content. Private censorship, however, is not the solution to this problem.

This has been clearly spelled out by the authors of the joint declaration of the UN Special Rapporteurs on freedom of expression and on violence against women, who stress the importance of combating cybercrime while at the same time avoiding excessive removal of legal content.

They note that private censorship will not only fail to help us effectively combat hateful content but, quite the opposite, will result in greater discrimination (check the What are the risks of private censorship online? section). This is because, like hate crime, it could target communities that are particularly at risk of discrimination.

The solutions that we are fighting for will not undermine the fight against harmful content on online platforms. Rather, they will ensure the necessary protection to users whose freedom of expression was wrongfully restricted (check the What needs to change to curb the risks of private censorship? section).

Towards the end of 2018, the Polish government announced that it had signed a Memorandum of Understanding with Facebook. In accordance with the MoU, Polish users gained an additional right to challenge the portal’s decisions to remove content through a designated “point of contact”. The “point of contact” was closed on 25 August 2023.

While operating, it still failed to resolve the problem of private censorship at a systemic level. Why? The procedure was afflicted with the very flaws that lie at the root of the litigation in SIN vs Facebook. In particular, the portal retained arbitrary power to decide what should be taken down, and it continued to exercise this power based on unclear criteria and without sufficient safeguards protecting users from abuse. Moreover, the MoU did not impose any reporting duties on Facebook (e.g. to publish regular reports on how the point of contact operated), nor did it establish any independent, external control over the portal’s decisions.

We explain what reforms are necessary to effectively boost users’ rights in the What needs to change to curb the risks of private censorship? section.

Actors and allies
SIN logo

The Civil Society Drug Policy Initiative (Społeczna Inicjatywa Narkopolityki, “SIN”) is a Polish NGO specialising in educational activities concerning the harmful consequences of drug use and providing assistance to people who abuse such substances, including harm reduction activities. In 2018, without any warning or clear explanation, Facebook removed its fan pages and groups. In January 2019, one of SIN’s accounts on Instagram, a subsidiary of Facebook, was also removed in similar circumstances.

On 7 May 2019 SIN, supported by the Panoptykon Foundation, filed a lawsuit against Facebook, demanding restoration of access to the removed pages and accounts, as well as a public apology.

Panoptykon logo

Panoptykon Foundation is a Polish civil society organisation working to protect freedom and human rights in the context of new technologies. We diagnose threats resulting from surveillance practices, intervene in the cases of abuse, develop alternative legislative solutions, stimulate critical reflection, and encourage action for change.

Panoptykon provides legal support to SIN and runs communication activities around the case.

Wardyński & Partners logo

Wardyński & Partners, founded in 1988, is one of the largest independent law firms in Poland, committed to promoting civil society and the rule of law. The law firm participates in non-profit projects and pro bono initiatives. Its lawyers are active members of Polish and international legal organisations.

Wardyński & Partners, at the request of Panoptykon, represents SIN in the proceedings pro bono. The legal team involved in the case includes Łukasz Lasek and Piotr Golędzinowski (both representing SIN in court). We also thank Agnieszka Lisiecka and Bartosz Troczyński for their engagement in the case.

Digital Freedom Fund logo

The Digital Freedom Fund (DFF) supports strategic litigation to advance digital rights in Europe. DFF provides financial support and seeks to catalyse collaboration between digital rights activists to enable people to exercise their human rights in digital and networked spaces.

DFF has provided funding to Panoptykon to support the case SIN vs Facebook.

Gesellschaft für Freiheitsrechte

German NGO Gesellschaft für Freiheitsrechte (“GFF”) filed an amicus curiae brief with the Warsaw court to support the SIN vs Facebook case. In its brief, GFF summarised the case law of German courts, including the Federal Court of Justice, in cases regarding arbitrary censorship in social media.
