The case of a Euro-Atlantic NGO being sanctioned for Nazi sympathies illustrates the systematic abuse of Meta’s content moderation.

Imagine scrolling through your Facebook feed, expecting updates from family and friends, only to find yourself mired in disinformation: conspiracy theories about marginalized groups, warmongering, and reverential posts about the country’s leader paired with insulting memes about another’s. And nothing to dispute the “reality” of the narratives you see. In Bulgaria, this is not hard to imagine.

The country’s persistent vulnerability to Kremlin disinformation has, over the years, made this experience all too common amid dwindling media freedom and a longstanding oligarchic grip on the media. Yet Bulgarians increasingly turn to social media platforms as their primary source of news, “armed” with only limited critical-thinking skills, as PISA’s 2022 results revealed.

Facebook reigns supreme, dominating over 95% of the social media landscape. However, Meta’s opaque algorithms and biased moderation only serve to amplify sensationalist content, bolstering the influence of pro-Russian narratives. Without a concerted, collaborative effort to formulate and disseminate counter-narratives, this susceptibility has remained unmitigated. The platform has evolved into fertile ground for manipulation by foreign authoritarian regimes, creating an information battleground where the routine suppression of dissenting viewpoints is also prevalent.

The Atlantic Council of Bulgaria’s Facebook page often features images critical of Russia’s war against Ukraine. This grab from the page shows notifications from Meta about violations of its community standards and use of hate speech.

The Russian Playbook

Troll farms and bots have been employed to magnify the reach and credibility of these disinformation efforts. “Mushroom websites,” notable for generating a plethora of identical articles, play a significant role in amplifying various pro-Kremlin narratives. At the Center for the Study of Democracy, my organization, we have uncovered highly effective examples of such “impostors” – online outlets designed so similarly that they are almost indistinguishable from one another. It is a clever practice. For example, websites affiliated with the Share4Pay network pay users to disseminate pre-made clickbait – very often disinformation – via social media channels.

Even before the Russian invasion of Ukraine, the Kremlin troll army was going strong on Facebook. Once the full-scale war started, they revved up a few gears without an adequate response from the gatekeepers.

According to Atanas Tchobanov, writing for the Bureau for Investigative Reporting and Data (BIRD), the job of moderating Bulgarian Facebook has been in the hands of TELUS International Bulgaria since 2019. The company is the local division of Canada-based TELUS International, which holds a contract from Meta. Tchobanov points to a distressing pattern:

“There is a visible silencing of journalists, media and public figures who are critical of Russian aggression in Ukraine and at the same time a reinforcement of pro-Russian propaganda in a social network that in Bulgaria has the status of mainstream media. A phenomenon that cannot be explained solely by the standards of the Facebook community.”

Exploiting the weaknesses in Facebook content moderation practices appears to have become yet another tool in the Kremlin’s extensive arsenal of hybrid warfare tactics.

As RFE/RL reported in 2023, public figures in Bulgaria with high engagement on the platform and many followers have found themselves blocked, while more offensive posts by individuals or smaller groups remain untouched. This discrepancy is particularly noticeable when such posts have expressed anti-Russian or pro-Ukrainian views.

Dampening Discourse

As an analyst and researcher deeply invested in countering disinformation and nurturing critical thinking among my fellow Bulgarians, I can firmly attest to the routine silencing of journalists and public figures critical of Russia’s aggression, particularly on Facebook. Now content moderation rules are being weaponized, not to alleviate the situation, but to further suppress such viewpoints.

This not only complicates the already challenging task of doing my job but also makes it nearly impossible to ensure a balanced discourse on social media platforms. Despite these obstacles, my team and I remain committed to our mission and through extensive research and leveraging social media intelligence tools, we have uncovered a troubling case study. While not unique, this case vividly illustrates the systematic abuse of Facebook’s content moderation, particularly targeting content from one prominent defender in Bulgaria of Western principles and institutions.

The Atlantic Council of Bulgaria (ACBG) is a non-governmental organization committed to promoting Bulgaria’s Euro-Atlantic integration and the values of democracy, freedom, and justice. It conducts its activities fully in accordance with the country’s laws and international standards.

In April 2022, a post on ACBG’s Facebook page detailed a Russian strike that killed children in the eastern Ukrainian city of Kharkiv. Although the post contained nothing inappropriate, Facebook moderators removed it, citing “hate speech” and “adult content” violations. The ACBG page then faced ramped-up sanctions and increased scrutiny for allegedly spreading disinformation by reposting a story about a downed Russian helicopter; the accompanying photo showed what looked like a washing machine beside the crashed aircraft.

Facebook apparently scrutinized the page based on reports suggesting the article implied that a military helicopter was transporting stolen household appliances. From May 2022 onward, the page administrators noted a substantial drop in traffic: although ACBG maintained its follower count, new posts garnered only a fraction of the typical reactions and views, suggesting Facebook had decided to restrict the page’s reach. Traffic returned to normal only after an ACBG post voiced frustration over the decreased visibility.

However, traffic plummeted again within hours, as soon as ACBG announced its commitment to uphold pro-Ukraine viewpoints. The restrictions prompted by the removed posts persisted beyond the three-month ban period, contrary to Facebook’s policy. In August 2022, Meta shut the page down over accumulated sanctions that should already have expired. It later disappeared entirely, leaving no unique identifier and only a few posts visible in the Google cache. ACBG created a new Facebook page on 31 August 2022.

That didn’t stop the abuse. A year later, in August 2023, trolls appear to have bombarded Facebook’s moderators with complaints, using tactics matching Kremlin censorship practices that we have studied previously. This time the trigger was a post discussing the removal of the Soviet Army monument in Sofia – a poignant reminder of a bygone era that had stirred tensions among the capital’s citizens for over three decades. The post featured images of a young man in traditional Russian military dress protesting against the monument’s removal. However, ACBG also pointed to recent images from the man’s profile on Meta-owned Instagram showing him wearing a Nazi uniform. The intention behind displaying such images was to portray the individual as a hired actor without allegiance to any cause.

However, although the image complied with Facebook guidelines and contained no swastikas, the photo was flagged under the pretext that it violated Meta’s rules on dangerous individuals and organizations. ACBG moderators subsequently observed that engagement with the page’s posts declined without any clear explanation. In response, an ACBG board member wrote an open letter to Meta CEO Mark Zuckerberg.

Shadow Bans

Such actions illustrate a powerful strategy for limiting a page or account on Facebook. Instead of overt closure or censorship, which might draw attention, the preferred approach is to reduce visibility: likes and overall engagement drop sharply, while culpability becomes difficult to prove and a serious complaint difficult to lodge. In this instance, the censorship tactics also produced a warning of potential deletion, alleging the dissemination of Nazi propaganda. Ultimately, a page intent on exposing individuals profiting from Nazi propaganda ended up penalized for allegedly promoting it.

Then, on 14 November 2023, Facebook removed another post from the ACBG page, this time involving a petition supporting the closure of the Russian Cultural Center in Sofia. Facebook cited hate speech as the reason, despite the absence of any hate speech in the post. The content merely included a link to a petition along with a photo of people holding signs saying “Russian terrorist out of Ukraine” in front of the Cultural Center. Consequently, the page received a warning, signaling imminent deletion upon further violations.

This case study provides insight into the flexible and adaptable tactics used in pro-Russian hybrid warfare in Bulgaria. While we don’t yet have a smoking gun, our assumption is that pro-Kremlin trolls bombard moderators with requests to take down content they deem pro-Ukrainian or anti-Russian, resulting in the moderators being overly aggressive when banning posts.

Our ongoing efforts to combat disinformation and propaganda are essential because this issue extends beyond politics; it’s about safeguarding free speech from foreign malign influences and uncovering as well as dismantling information operations conducted by foreign authoritarian actors and their local proxies. While these operations may begin on Facebook, their implications extend far into institutional and political structures.

Svetoslav Malinov works as an analyst at the Center for the Study of Democracy (CSD) in Sofia, focusing on media and state capture, malign foreign influence, and disinformation.