In Bulgaria, Russian Trolls Are Winning the Information War

Pro-Russia groups are gaming Facebook’s review process, and moderators are stuck in the middle.
Supporters of the nationalist, Russophile Vazrazhdane party protest in Sofia, Bulgaria, against the country’s membership in NATO and the sending of weapons to Ukraine. Photograph: Georgi Paleykov/Getty Images

On December 13, 2022, a group of Bulgarian activists from the nonprofit United Bulgaria For One Cause (BOEC) tried to enter the offices of Telus International, a global outsourcing company that handles content moderation for Meta, in Bulgaria’s capital, Sofia. Streaming live on Facebook, they came armed with printouts of posts and accounts they said had been removed from the platform, which they stuck to the office doors.

“We used stickers to symbolically close the doors of Telus, symbolic like they closed our accounts,” Orlin Ezekiev, a member of BOEC, says.

BOEC accuses Telus International of blocking posts that criticize Russia and support Ukraine. Their protest came weeks after a local outlet, Bird.bg, published allegations—which Telus International denies—that the outsourcing company was working with pro-Russian oligarchs to silence pro-Ukrainian sentiment on the platform. The website also posted the names and images of Telus International employees on its Facebook page.

Criticism of Telus International and Meta in Bulgaria grew so intense that the outsourcing company’s chief corporate officer, Marilyn Tyfting, was called to testify before the Bulgarian parliament on January 26. “I would also like to confirm that Telus International does not set content review policies. Instead we apply the policies of our clients and comply with applicable laws,” she said in a prepared statement. On February 1, Meta published a blog post responding to claims of pro-Russian bias in its content moderation, calling the accusations “false” and saying “there is no evidence to support them.”

However, experts who monitor Russian attempts to manipulate the information space in Europe say that the truth is more complex. Russian propagandists and supporters of the Kremlin have become adept at abusing Meta’s moderation practices—which are less robust in non-English languages—by reporting content en masse to trigger reviews that could ultimately lead to its removal. The lack of transparency over what gets removed and why has created a sense of betrayal and frustration, which has in turn led pro-Ukraine activists to target the largely powerless moderators responsible for enacting Meta’s policies. 

“Facebook is one of the main tools for promoting and silencing others at the same time,” says Ruslan Trad, a Sofia-based fellow at the Digital Forensic Research Lab. “Mass reporting is a very successful strategy.”

Trad, whose own Facebook account was once suspended after being spuriously reported for hosting extremist content, says that pro-Russian groups often organize on Telegram, choosing which accounts or posts to mass-report in the hope of getting them removed from Facebook. Some of these groups, according to Trad, operate from Russia, while others may be paid trolls inside Bulgaria, where labor is relatively cheap.

According to Todor Galev, director of research at the Center for the Study of Democracy, a European public policy think tank, the Atlantic Council’s Bulgarian Facebook page has been banned several times after being mass reported. He says the accounts of prominent pro-NATO and pro-EU journalists and media outlets have also been targeted.

“We suspect that Facebook relies mostly on algorithms for small markets like Bulgaria,” says Galev. “Because human moderation is very limited. There are only a few people working [on moderation] for Bulgaria.”

A former Meta employee who worked on its content moderation systems and policy, and who spoke to WIRED on the condition of anonymity, says that mass reporting can at least get certain pieces of content or accounts flagged for review, and that the more frequently a certain type of content is flagged, the more likely the algorithm is to flag similar content in the future. However, for languages like Bulgarian, where there is less material to train the algorithm and AI may therefore be less accurate, the former employee says it is more likely that a human moderator makes the final call about whether or not to remove a piece of content.
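Neither Meta nor the former employee described the pipeline’s internals, but the dynamic they outline (a surge of reports pushing content into automated review, with low-confidence languages falling through to human moderators) can be illustrated with a brief, entirely hypothetical sketch. The class names, thresholds, and per-language confidence scores below are invented for illustration and make no claim about Meta’s real system.

```python
# A purely hypothetical sketch of report-driven moderation triage.
# All names, thresholds, and confidence scores are invented for
# illustration; this is NOT Meta's actual system.
from dataclasses import dataclass

REVIEW_THRESHOLD = 10         # reports needed before content enters review
AUTO_ACTION_CONFIDENCE = 0.9  # classifier confidence needed to decide without a human


@dataclass
class Post:
    post_id: str
    language: str
    reports: int = 0


class TriageQueue:
    """Routes mass-reported posts to an automated classifier or a human reviewer."""

    def __init__(self, classifier_confidence: dict):
        # Per-language classifier confidence: a low-resource language
        # like Bulgarian ("bg") scores lower here, pushing more of its
        # decisions to a small pool of human moderators.
        self.confidence = classifier_confidence
        self.human_queue = []

    def report(self, post: Post) -> str:
        post.reports += 1
        if post.reports < REVIEW_THRESHOLD:
            return "no_action"  # reports alone do not remove content
        if self.confidence.get(post.language, 0.5) >= AUTO_ACTION_CONFIDENCE:
            return "auto_review"  # classifier is trusted to decide
        self.human_queue.append(post)  # classifier unsure: a human decides
        return "human_review"


queue = TriageQueue({"en": 0.95, "bg": 0.60})
post = Post("p1", language="bg")
for _ in range(REVIEW_THRESHOLD):
    outcome = queue.report(post)
print(outcome)                 # -> "human_review"
print(len(queue.human_queue))  # -> 1
```

Under these invented numbers, ten coordinated reports are enough to push a Bulgarian-language post in front of a time-pressed human reviewer, while an English post with the same report count would be handled automatically; the point of the sketch is only that thresholds plus uneven language coverage can turn mass reporting into leverage.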

Meta spokesperson Ben Walters told WIRED that Meta does not remove content based on the number of reports. “If a piece of content does not violate our Community Standards, no matter how high the number of reports is, it won’t lead to content removal,” he says. 

Some moderation issues could be the result of human error. “There are going to be error rates, there are going to be things that get taken down that Meta did not mean to take down. This happens,” the former employee says. Such errors are even more likely in non-English languages. Content moderators are often given only seconds to review each post before deciding whether it stays online, and that pace is one of the metrics by which their job performance is measured.

There is also a real possibility of bias among human moderators. “The majority of the population actually supports Russia even after the war in Ukraine,” says Galev, who argues that it is not unreasonable to think some moderators hold these views too, particularly in a country with limited independent media.

“There’s a lack of transparency around who is deciding, who is making the decision,” says Ivan Radev, a board member of the Association of European Journalists Bulgaria, a nonprofit that put out a statement condemning Bird.bg’s posting of employee information. This opacity can breed confusion. “This sentiment is feeding dissatisfaction in Bulgaria.”

The imbalance between the ability of coordinated campaigns to get content flagged and that of individuals or small civil society organizations, whose reports go to human moderators, has helped create the impression in Bulgaria that Meta prioritizes pro-Russian content over pro-Ukrainian content.

Just over half of Bulgaria’s 6.87 million people use Facebook, which is the dominant social platform in the country. Bulgaria has long been a target of Russian trolls and pro-Russian propaganda, particularly since the beginning of the war in Ukraine. Both sympathetic local media and Russian disinformation operations have pushed a pro-Russia narrative, blaming the conflict on NATO.

Ezekiev, the BOEC member, told WIRED that he was never given an explanation for why his content was removed or how the choice was made. “If you raise your voice against propaganda and say something about the war in Ukraine, your account can be suspended,” he says. Meta’s own lack of transparency about its moderation processes, says Ezekiev, makes the entire situation murkier.

It is this frustration that drove BOEC to protest at Telus International’s Sofia office, and that led to employees—themselves largely powerless—being doxed and harassed, though there is no evidence that any of the company’s moderators deviated from Meta’s own instructions.

In February, Bulgarian media reported that Telus International would be closing its operations in the country and moving the work to Germany. “As part of a consolidation of operations, the work Telus International does for Meta in Sofia will be moving to another of our sites,” says Telus International spokesperson Michelle O’Brodovich. “Telus International continues to work successfully with Meta, ensuring the highest level of professional standards.” The company did not address whether the scrutiny of its work in Bulgaria contributed to this decision.