On 7 May 2019 Spoleczna Inicjatywa Narkopolityki (Civil Society Drug Policy Initiative, “SIN”), supported by the Panoptykon Foundation, filed a strategic lawsuit against Facebook aimed at fighting private censorship on the Internet. Online platforms act as ‘gatekeepers’ to online expression, giving them tremendous power over the information circulated on the Internet – power which they wield without adequate accountability. Moderation is necessary to fight illegal and harmful content, but unfortunately perfectly legal and socially valuable materials often fall prey to it. We hope that our lawsuit against Facebook will help change this.
SIN is a Polish civil society organization which for the past 10 years has been providing drug education, cautioning against the harmful effects of psychoactive substances and helping drug users. SIN works mainly in the interest of young people, who tend not to listen to experts or teachers but who are very active on social media. The group focuses on harm reduction, a drug-prevention strategy recommended by many institutions, including the United Nations and the European Union.
In 2018, without any warning or clear explanation, Facebook removed a fan page and a group run by SIN. The platform characterized their activity as ‘in violation of Community Standards’. In January 2019, one of SIN’s accounts on Instagram, a subsidiary of Facebook, was removed in similar circumstances. When Facebook banned SIN from its social networks, it essentially decided that information about harm reduction and drug education does not deserve the protection of freedom of expression. Thus, it lumped these two topics of crucial societal importance together with unquestionably reprehensible content such as calls for violence, hate speech and Nazi symbols. “By doing so Facebook made it harder for us to help the people who need it the most. It also undermined our reputation by suggesting our actions were illegal,” says SIN’s Jerzy Afanasjew. “We have set up a new fan page and we are trying to rebuild trust and reach, but this takes time. It also does not help that we don’t know what exactly Facebook’s content moderators objected to in our communication in the past. Because of that, we have no way of being sure that our efforts won’t be nullified in the future with just one click.”
Today a few dominant global corporations act as the main channels of communication and access to information. Facebook, Google and Twitter present content selected by algorithms on the basis of users’ online activity. However, they also moderate the content published online, thus deciding what users will not get to see.
Panoptykon fights for content removal to be based on clear and easily accessible rules, and for users to have the right to effectively contest such decisions. This means, for example, that users must be informed why their content was blocked, be able to present arguments in their defense, and have their appeal considered by people who did not take part in the original decision. In addition, final decisions of platforms should be subject to independent scrutiny by the courts.
Panoptykon hopes that the court case will incentivize online platforms to abandon their current opaque and arbitrary methods of blocking and to introduce solutions that will better protect our freedom of speech. We expect the case to set standards that will influence the policies not only of Facebook, but of other platforms as well.