
Facebook content moderation inconsistencies in the context of the war in Ukraine

Since the start of the war in Ukraine, social activists, opinion leaders, journalists and ordinary civic-minded people have noticed a series of Facebook account bans for alleged violations of the community guidelines. In many cases, such restrictions affected not only the collection of aid but also the dissemination of information, both of which are vital in the context of a war. The question therefore arises as to how flexible Facebook's content moderation policy has been in the context of the war in Ukraine.


The report was written by:


Debunk.org Analyst Mykolas Kralikas and Senior Analyst Laima Venclauskienė.


Vilnius University Faculty of Communication researchers assoc. prof. dr. Vincas Grigas, assoc. prof. dr. Daiva Siudikienė, prof. dr. Andrius Šuminas, and assist. prof. dr. Justina Zamauskė.


RELEVANCE OF THE PROBLEM


Since February 24, 2022, Lithuanian public figures and organisations have raised at least €53.8 million in military, humanitarian, and other aid for the people of Ukraine. Facebook has become an integral part of the fundraising process, allowing foundations and activists to respond more quickly to the needs of beneficiaries and to share information on how much has been raised and where the money is going. However, there have been many complaints from Lithuanian users that their posts collecting aid for Ukraine, expressing support, or voicing anger at the actions of the aggressor state have been unjustifiably removed and their accounts restricted.


Last December, Debunk.org, together with the Lithuanian Government, launched an initiative inviting Facebook users in Lithuania to provide examples of their accounts or pages being restricted. The data collected was processed by Debunk.org analysts together with researchers from Vilnius University's Faculty of Communication.


The impact of content moderation gaps is particularly evident when larger accounts are banned. According to Meta, there were 2.91 billion active Facebook users worldwide in 2022 (Statista, 2023), while in Lithuania, with a population of 2.81 million, the social network is used by as many as 2.1 million people. So, while blocking an account or a page with 200,000 followers may seem insignificant on a global scale, in Lithuania it means that the account was followed by roughly one out of every ten Facebook users in the country.
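For illustration, the difference in scale behind this claim can be made explicit with a quick calculation; this is only a sketch using the figures cited above, and the variable names are ours:

```python
# Illustrative arithmetic only, using the figures cited above.
followers_of_blocked_page = 200_000
facebook_users_lithuania = 2_100_000
facebook_users_worldwide = 2_910_000_000

print(f"Share of Lithuanian users: {followers_of_blocked_page / facebook_users_lithuania:.1%}")   # ~9.5%, about 1 in 10
print(f"Share of users worldwide:  {followers_of_blocked_page / facebook_users_worldwide:.4%}")   # ~0.0069%
```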


A review of the examples of account restrictions or blocking sent by users suggests that Facebook has penalised these pages or accounts for using certain keywords. However, the content moderation rules are clearly applied inconsistently: accounts, groups, or pages that use the same words in posts spreading disinformation or pro-Russian sentiment are not always detected and penalised.


Fake accounts that promote Russian propaganda narratives are also not removed, even though war propaganda is prohibited by law in Lithuania. The collected data points to problems with content moderation and raises questions about how Facebook's mechanisms work in practice: whether these problems stem from policy loopholes, from the policies' unsuitability for a war context, or from the ability of certain users to exploit these loopholes with malicious intent.


During the COVID-19 pandemic, Facebook faced heavy criticism for the lack of content control. At the time, as massive amounts of false information and hatred were aimed at doctors and researchers, stricter rules seemed like an appropriate and necessary tool, and in the short term the introduction of more control was likely perceived as a positive development. However, in the context of the destructive war in Ukraine, these restrictions have manifested themselves in unforeseen ways and have disrupted the normal flow of information.


The aim of this report is to present case studies and highlight the systemic problems with Facebook content moderation, which can disrupt the ability of various communities to express support for Ukraine. The report is based on the analysis of data collected by Debunk.org. The research aims to identify the causes and nature of the restrictions on Lithuanian Facebook users' accounts and to assess how the Community Standards function in a war context.


The research methodology includes an analysis of reports provided by users and a critical analysis of Facebook policies and community standards.


DATA REVIEW


Dates of the restrictions


Study period: 24/02/2022 - 31/12/2022. We examined a total of 131 restriction cases for this review; the earliest dates to 26/02/2022 and the latest to 30/12/2022. The distribution of restrictions by month is shown in the graph below.



The restrictions most frequently reported by respondents were those imposed in December (38 cases). Possible reasons for this are: a) the survey form was published on December 14, 2022, and people tended to submit their most recent bans; b) Lithuanian journalist and activist Andrius Tapinas also asked for examples of restrictions on his Facebook wall on December 14. Among the cases submitted, some users stated that their account was blocked after they shared comments and screenshots about restrictions under Tapinas's or Debunk.org's posts about the initiatives to gather such cases.


The higher number of restriction cases in the second half of 2022 may also reflect the fact that respondents provided their most recent cases and screenshots. In the first half of 2022, April stood out, with twice as many restrictions as any other month in that period since the start of the war. Three of the eight April cases were related to the dissemination of information (sharing photographs) about the massacre in Bucha. Two of these were classified as "graphic violence" and one as "nudity or sexual activity", and they resulted in restrictions of up to three days.



In early April 2022, Meta was criticised for blocking the #bucha and #buchamassacre hashtags, which it attributed to automated systems that search for violent images on Facebook and Instagram. This echoes the criticism that human rights groups have directed at Meta's approach to removing violent content during conflicts: its practice of wiping data from its servers after 90 days destroys important evidence of war crimes.


Duration of the restrictions


The analysis covers 131 posts that brought restrictions and/or warnings upon their authors. Below is a table of the frequency and duration of these restrictions. In addition to the cases listed in this table, Debunk.org received 12 cases where the duration of the restriction could not be determined, 4 cases where the blocked posts were reinstated (bans lifted), 2 cases where access to advertising was restricted, and one case where the account was suspended and the page was disabled.


We determined the duration of the restrictions from screenshots sent to us or from the respondents' own answers. In some of the respondents' answers, only part of the restriction period was indicated rather than the full duration. In such instances, we rounded the reported duration to the closest duration specified in Meta's restriction policy (0, 1, 3, 7, 30 days) - e.g., we converted 29 days to 30 days.
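A minimal sketch of this rounding step, assuming a simple nearest-value rule (the analysts' actual processing may have differed):

```python
# Hypothetical illustration: map a partially reported duration to the nearest
# duration listed in Meta's restriction policy (0, 1, 3, 7 or 30 days).
POLICY_DURATIONS_DAYS = [0, 1, 3, 7, 30]

def normalise_duration(reported_days: float) -> int:
    """Return the policy duration closest to the respondent-reported value."""
    return min(POLICY_DURATIONS_DAYS, key=lambda d: abs(d - reported_days))

print(normalise_duration(29))  # -> 30
print(normalise_duration(6))   # -> 7
```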



From this combined data, we note that the sequence of 0, 1, 3, 7 and 30 days is in line with Meta's explanation of its own policy [1], according to which the duration of the restrictions depends on whether the Facebook user has a history of misconduct. Meta's explanation also states that, depending on the severity of the violation, other stronger or longer restrictions may be imposed.
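Meta's published escalation ladder (see footnote [1] at the end of the report) can be summarised as a simple lookup. The sketch below is only an illustration of the policy text, not Meta's actual enforcement code, which is not public:

```python
# Illustrative only: the strike ladder described in Meta's policy (footnote [1]).
def restriction_days(violation_count: int) -> str:
    """Return the content-creation restriction for a given number of violations."""
    if violation_count <= 1:
        return "warning only"
    ladder = {2: "1 day", 3: "3 days", 4: "7 days"}
    return ladder.get(violation_count, "30 days")  # 5 or more violations

for n in range(1, 7):
    print(n, "violation(s):", restriction_days(n))
```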


It is worth noting that the 60-day restriction was also somewhat prominent. From the screenshots we could discern that a 60-day ban entailed restrictions such as "You could not start or join any calls" in one case and "Posts will be moved down the feed" in the other. However, we do not know whether in these two cases other restrictions, such as limitations on group activities, were also imposed for the same or a different duration.


We also observed at least 2 cases where, in addition to the usual 30-day restriction, an additional 60-day restriction was imposed - "Posts will be moved down the feed".


From the data we obtained, we were not able to determine to what extent the duration of the restrictions depends on whether the user has a history of previous offences, as we generally do not see the full history of restrictions received by the respondent. This means that, for example, we usually cannot determine whether a 30-day restriction received by a respondent is due to the severity of the breaches of the Facebook Community Standards in a particular case, or whether the respondent has been sanctioned in the past for other posts that may not have been related to Ukraine.


Despite these shortcomings, we can see that, according to the available sample data, the 30-day ban is the most frequent (57 out of 131 cases). Several of these concern content related to the Ukrainian Azov Regiment, as it was only at the end of January 2023 (almost a year after the start of Russia's invasion of Ukraine) that Meta stopped considering the Azov Regiment a "dangerous organisation". This means that members of the Azov Regiment can now have accounts on Meta platforms, and content about the Regiment posted by other users will no longer be removed, as happened in the example here:



One of the most notable cases of indefinite restrictions is also worth mentioning. One account and one page ("Praeities Žvalgas") were completely closed, and two accounts lost their right to advertise on Facebook. At the time of writing, the new account of "Praeities Žvalgas" is not suspended, and Šarūnas Jasiukevičius, who goes by the pseudonym "Praeities Žvalgas", continues to raise funds for Ukraine, which is devastated by the war and fighting the aggressor.


'Lately, I've been unblocked on Facebook for about four or five days, and then I'm blocked again for a month,' Mr Jasiukevičius noted. In the case of this page, Facebook claimed that the account did not meet the social network's community standards on weapons, animals and other regulated goods.


Finally, out of all the cases analysed, Facebook reversed its decision on appeal only 4 times. However, this may also be because respondents simply did not submit cases in which blocked posts had been restored.


Reasons for the restrictions


The reason for blocking given by Meta. 'Hate speech' was the most common reason for restrictions of Facebook accounts, cited in 76 of the 131 cases examined (in three further cases the reason was expanded to 'Hate speech and insults', and in two to 'Hate speech and degradation'). The second most common reason for imposing restrictions was 'Not in line with community standards' (31 cases).



We identified several groups of hypothetical keywords and/or phrases whose use may have triggered or significantly contributed to the restrictions being imposed. The table presented here shows the most frequently recurring elements in order, with forms of the word kacapas at the top.



These are followed by derivatives of the word maskolis; in third place is the designation of Russia and Russians as Ruskies, Ruzzkies, Ruzzia, etc.; and expressions related to the word orc rank fourth.


According to the Lithuanian Language Institute, the word maskolius (another variant: maskolis; plural form: maskoliai) is a non-standard word in standard Lithuanian, yet it has a long tradition of use. The word can carry a negative connotation, but it is neither a swear word nor a slur. It has a completely legitimate functional purpose because it draws a line between ethnic Russians as a nation and the aggressive belligerents, who are portrayed in a negative light. The same can be said about Maskolija, the name for Russia as an aggressor state, which derives from the word maskolius.


In those content units where we found hypothetical keywords and/or groups of meaningful compounds, about one fifth of the cases (17.4%) contained two or more such elements.
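A minimal sketch of how such co-occurrence can be measured, assuming simple substring matching; the keyword stems below are abbreviated examples, not the full list used in the analysis:

```python
# Hypothetical illustration: count how many keyword groups appear in each
# restricted content item and compute the share with two or more groups.
KEYWORD_GROUPS = {
    "kacapas": ["kacap"],
    "maskolis": ["maskol"],
    "ruskies/ruzzia": ["rusk", "ruzz"],
    "orc": ["orc", "ork"],
}

def groups_present(text: str) -> set:
    lowered = text.lower()
    return {name for name, stems in KEYWORD_GROUPS.items()
            if any(stem in lowered for stem in stems)}

# 'items' stands in for the texts of the restricted posts and comments.
items = ["example post mentioning orcs and maskoliai", "a neutral sentence"]
with_two_or_more = sum(1 for text in items if len(groups_present(text)) >= 2)
print(f"{with_two_or_more / len(items):.1%} of items contain two or more keyword groups")
```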


As we can see, among the most common keywords and phrases identified from the data available to us, there is also khokhol, a derogatory term for the Ukrainian people. In these cases, the term was used ironically, to retell clichés repeated by Kremlin propaganda while ridiculing them at the same time. In one case, for example, a meme entitled "Metamorphosis" was shared, in which a fierce supporter of the Kremlin's war in May 2022, who rejoices in the defeats of the khokhols, is transformed into a critic of Putin after he declares mobilisation.














Typology of restricted content


Types of restricted content. Out of the 131 cases provided by the respondents, we found that 99 comments and 28 posts did not meet Facebook's content standards. In another 4 cases, it could not be determined which type of content was blocked.



Format of the restricted content. 99 entries were textual, 20 were visual and 9 combined textual and visual information (3 cases could not be identified). The visual cases can be divided into 14 memes, 5 photos and 1 entry using only emoticons.


Language of the restricted content. 82 content items were in Lithuanian; 20 in Russian; 8 in English; 6 in Lithuanian and Russian; 5 in Ukrainian; and 1 in Lithuanian and Ukrainian. A further 9 cases were visual with no written text, or no screenshots of the entries themselves were provided.


Content of the restricted posts


Nature of the content. 99 entries were marked as discussion. 28 items were determined to be dissemination of information, 1 as fundraising and 3 could not be identified (only the fact of blocking is visible and there is a lack of information on the specific posts that were restricted).

Time frame of the content. According to the available data, 124 of the posts dealt with current affairs (current events and reflections on them), while 4 posts linked history and the present, looking for parallels between the Russian invasion of Ukraine and historical events (the aggression of Nazi Germany, the Soviet occupations, and the massacre at Srebrenica), as well as reviewing the historical model of propaganda in Russia (dubbed the "culture of lies").











The genre of the content. We classified 20 of the available samples as satire/humour. Looking at these examples, it can be argued that Facebook's content moderation scheme is less effective when it comes to satirical cartoons, memes and jokes.


This post reads (in Russian): Have I understood correctly that Ukraine has attacked Russia in Ukraine, so Russia has declared martial law in Ukraine in order to protect Ukraine from Ukraine in Ukraine, which is not Ukraine but Russia? The meme says: The West is trying to ruin Russia. But we have risen from our knees. The Yankees are afraid of us. By the way, a Kalashnikov can be used to shoot through train tracks. They will get what is coming to them, Maidan scum. For 23 years we have fed those ....


The user received a 7-day restriction.






The post reads in Lithuanian: All I want for Christmas, earlier if possible. Author: Nikita Titov. The author of this message was subjected to a 7-day restriction.







This meme reads, in Ukrainian: I don't need to be freed, I am free. The user was restricted for 30 days.























The meme was restricted for "not meeting community standards". The author's appeal was unsuccessful.





























Sharing a GIF from Facebook's own gallery does not protect a user from a penalty either. A user who shared an animation saying "Thank God I'm not a muscovite. Glory to Ukraine, glory to the heroes" received a 7-day ban.
















Facebook's algorithms, which are designed to detect breaches of its standards (if this example did indeed result from such an algorithm being applied), do not seem to recognise allegories in which, for example, Vladimir Putin is portrayed as the personification of the aggressor state; such an image is instead treated as a straightforward call for violence. In March 2022, Meta temporarily lifted its restrictions on certain posts calling for the assassination of Russian President Vladimir Putin and his Belarusian counterpart Alexander Lukashenko (unless they included other targets or two indicators of credibility, such as location or method) in twelve countries, including Ukraine, Poland and Lithuania.












Inconsistencies of the restrictions


A review of the survey data revealed a list of keywords for which accounts were restricted. The question then arose as to whether accounts that spread pro-Kremlin content are also blocked for the same words. A review of the Facebook pages and groups monitored by Debunk.org suggests that the penalties are far from evenly distributed. It was also observed that such posts not only mention restricted keywords such as Azov, orcs, moskol, but also disseminate false information that supports the Russian state propaganda line and conspiracy theories.


The post claims that the bombing of a maternity home in Mariupol was staged. The well-known propaganda outlet Lenta.ru is also quoted.


'<...> In an interview before the "shelling", the son of one of the employees of the maternity home said that Azov had expelled all the employees and patients. <...> He does not know whether it was fighters of the Ukrainian Armed Forces or of the nationalist battalion Azov.'


The post is still available today.





'These are the Nazis of the Azov Squad, who have been shelling the innocent inhabitants of the Luhansk and Donetsk Republics since 2014....'


The post is still available today.










'Zelensky sends his western Ukrainian orcs to war against Russia <...>.'


In this case, the orcs are Ukrainian soldiers, and the post is still available.







The post spreads the Kremlin's propaganda narrative about the Ukrainian President being a drug addict. 'The one we are talking about now (i.e. Zelensky), he is, if you like, the most important fighter against the orcs (Russian military) at the moment...'


The post is still available.







'<...> the muscovite fascist manure they have been throwing around during the "containment of the pandemic" would be erased and forgotten. <...>'

The post is still available today.




'Look at the zeal with which the forced cov$.id fascism has been introduced in Muscovy <...>.'


The post spreads a conspiracy theory about "globalists". It is still available today.








The inconsistency of the restrictions can also be seen when comparing accounts of similar size and standing. For example, the account of Vilnius Mayor Remigijus Šimašius was restricted for 24 hours for a post explaining the historical origins of the term Muscovy. However, the account of Remigijus Žemaitis, a Member of the Seimas, was not penalised for one of his posts using the word orcs.


CONCLUSIONS AND INSIGHTS


In its Community Standards document, which defines the do's and don'ts on Meta's social platforms, the company states that its aim is to create a place for people to express themselves. Meta declares that it wants people to be able to openly discuss issues that are important to them, even if parts of the community disagree or find such issues unacceptable. However, the data collected on the restriction or blocking of profiles on Facebook casts doubt on Meta's claims that it manages to ensure proper content moderation in the context of war.


Since the resumption of active hostilities in Ukraine on February 24, 2022, various cases of restrictions or blocking of Facebook user profiles have been recorded which prompt questions about Meta's stated values and rules. In some cases, the profiles of Facebook users who spread information about war crimes in Ukraine and/or collected support for Ukrainians have been restricted or blocked.


Analysis of the collected and summarised data points to certain differences. The cases received from Facebook users reveal certain patterns, but the link between specific actions on the social media platform and the penalties imposed for them is not clear, as we usually do not have a complete history of the warnings and restrictions received by a Facebook user. However, a user who creates a piece of content that Facebook identifies as violating certain rules may receive different restrictions on the use of the platform.


Based on the currently available data, the apparent differences in the duration and extent of the bans for the same or slightly different breaches of Facebook's content rules can only suggest that they are mostly due to users having already received penalties for previous actions which, according to Meta, breached its policies. This implies that Meta's usual system of applying restrictions also operates in the context of war, with longer penalties imposed for repeat offences.


It is questionable, however, how proportionate and adequate this system is when it blocks Facebook users with pro-Ukrainian stances. For example, repeated use of keywords (maskol*, kacap*, etc.) results in accounts being suspended for long periods. Yet we can assume that the "incitement to hatred" that Meta attributes to these words causes less harm to the social network's users than the blocking of these accounts and the restriction of the dissemination of pro-Ukrainian information.


Facebook's content policies on issues such as violence, bullying, guns, drugs, gambling, or alcohol are quite clearly defined in the Meta Transparency Center. Each of these topics is restricted not only by the general rules of the social media site, but also by the local laws of each country. For example, in accordance with Lithuanian alcohol advertising laws, alcohol advertising is prohibited on Facebook and Instagram. However, restrictions on the dissemination of content about particularly sensitive topics such as war or war crimes are not clearly defined, and may also change according to the laws of each country.


Given the different scope of the bans and their uneven application to identical offences, it can be assumed that the blocking of Facebook accounts is a complex process. Restrictions can be applied according to different components, such as the content of a comment or post, the frequency of comments or posts, engagement with war topics (discussing, supporting or sharing content created by other users), the size of the user's audience (impact on the community), and actions taken by third parties, such as blocking users or reporting objectionable content. The uneven impact of these variables on Facebook's algorithm for limiting user-generated content results in users of the platform receiving different penalties for the same offences. Given that Facebook's algorithm has never been made public and is kept as a commercial secret by Meta, it can be assumed that, in the context of war, it can also be directed towards controlling the flow of content in order to maintain a balance between war and other topics that provide a financial return for the platform's managers.


In times of war, social media platforms are often used for nefarious purposes: spreading misinformation, lies or mistrust among the public. For this reason, moderation and control of user-generated content on social media platforms is essential, as is a well-defined and consistent penalty and blocking system. However, Meta should ensure that these moderation measures do not victimise users who are simply seeking to express their opinions or engage in civic initiatives to raise support for the victims of the ongoing war.



[1] Meta's restriction policy:
- 1 violation: warning and no further restrictions.
- 2 violations: the right to create content, such as posting, commenting, using Facebook Live or creating a page, is restricted for one day.
- 3 violations: 3-day content creation ban.
- 4 violations: 7-day content creation ban.
- 5 or more violations: 30-day content creation ban.

 

The report was prepared and first published on the DebunkEU.org web page.


