
Posts related to Kaliningrad sanctions targeted by social media bots and trolls

Within weeks of sanctions being imposed on the transit of goods to Kaliningrad, Facebook posts published by the Lithuanian broadcaster LRT experienced a surge in interactions with content about issues related to the Russian semi-exclave. Analysis revealed that most interactions originated from suspected troll or bot accounts which engaged in inauthentic behaviour intended to artificially amplify certain news content while pushing harmful narratives in the comments sections. Further investigation uncovered links between the suspicious accounts and an organised operation to buy access to established social media profiles.


Summary


  • Between June 1 and August 31, 2022, the ‘LRT English’ Facebook page published posts about Kaliningrad, some of which received a very high interaction rate, on one occasion reaching up to 111 times more reactions than the average.

  • The distribution of interactions on the ‘LRT English’ Facebook posts indicated unnatural behaviour, consistent with artificial amplification tactics.

  • Automated analysis of the data revealed that accounts interacting with the LRT English posts originated from 91 countries, with 57% of interactions coming from Africa and Asia.

  • Manual analysis of 848 accounts revealed that more than half of them (56.9%) were suspicious, with most suspicious accounts originating from Nigeria, Kenya and the Philippines.

  • Research revealed two users promoting an opportunity to be paid as part of a social media ‘account renting’ scheme who also interacted with LRT English posts, suggesting a modus operandi for the campaign. One of the users was Russia-linked.

  • Analysis of 8,936 comments revealed significant clusters, including groups promoting pro-Russia narratives and others promoting pro-Ukraine narratives, which were engaged with by audiences around the world.

 

In view of the increasingly discussed problem of inauthentic behaviour and the widespread activity of bots and trolls on social media, DebunkEU.org investigated a selection of artificially amplified posts published on the ‘LRT English’ Facebook page of the Lithuanian national broadcaster, dedicated to the sanctions imposed on the transit of certain goods to Kaliningrad and related issues. In our analysis, we leveraged technologies developed and provided by Graphika, a social network analysis firm working to map and understand online communities.


In June 2022, Lithuania began implementing the EU sanctions package on certain Russian goods - including building materials - in response to the Russian invasion of Ukraine. The ban entered into force on June 18 at midnight. The EU sanctions package led to a Kremlin campaign against Lithuania. The situation triggered a wave of comments from politicians in the Russian Federation. The Secretary of the Security Council of the Russian Federation, Nikolai Patrushev, called the ban a hostile action, claiming that Russia's response "will have a serious negative impact on the population of Lithuania.” Russia’s foreign ministry warned that Vilnius must stop the “openly hostile” activity: “If cargo transit between the Kaliningrad region and the rest of the Russian Federation via Lithuania is not fully restored in the near future, then Russia reserves the right to take actions to protect its national interests,” while the Kremlin’s spokesperson Dmitry Peskov stated that "the situation is more than serious." Such threats were a continuation of the tense relations between Russia and Lithuania: in the weeks before the sanctions were implemented, a deputy of the Russian Duma had proposed revoking the recognition of Lithuania's independence.


The situation became a pretext for the dissemination of false or misleading information, e.g., that the transit ban was Lithuania's unilateral decision, that it would lead to famine in the Kaliningrad territory (although the transport of consumer goods was not subject to sanctions), and that it would result in the outbreak of World War III.


The issue of transit sanctions was extensively discussed in Lithuanian media - Lithuanian National Radio and Television (LRT) wrote about the situation, posting links to its articles on social media, including its own Facebook page ‘LRT English’.


LRT articles about Kaliningrad sanctions go viral



Although LRT is one of the largest media outlets in Lithuania, posts published on the ‘LRT English’ page typically elicited a moderate number of reactions. From September 2021 to early June 2022, there was an average of 42 reactions per post. However, in the second half of June, some posts about the Kaliningrad transit situation skyrocketed in terms of the number of reactions, triggering thousands of likes and comments.


Number of interactions on ‘LRT English’ Facebook page posts in 2022


Between June 1 and August 31, 2022, posts published by 'LRT English' containing the keyword "Kaliningrad" generated 45,700 reactions. However, only a selection of 14 posts, focusing mainly on the sanctions imposed on Kaliningrad, showed a significant increase in the number of reactions, generating up to 111 times more reactions than the average. It is noteworthy that the posts which gained the highest number of responses appeared at consistent intervals of 10-14 days.


Screenshots of four 'LRT English' Facebook posts with tens of thousands of interactions


Some posts about Kaliningrad gathered hundreds or thousands of reactions, comments and shares. What's more, the number of interactions on some of the Lithuanian broadcaster's posts was comparable to, or even much higher than, the number of interactions on posts by major global media outlets, such as Reuters, BBC News, and CNN.


Number of interactions on Facebook posts with the keyword "Kaliningrad" published between June 1 and August 31, 2022


On July 7, the Facebook page 'LRT English' announced that it had recorded a suspicious increase in activity on posts about Russia, Kaliningrad and some other topics. The inauthentic behaviour, journalists noted, came "often from what appears to be fake accounts and includes hate speech and disinformation." As a result, the page's administrators decided to restrict the ability to comment on certain posts. However, the restriction did not significantly reduce the number of reactions on subsequent posts related to the Kaliningrad transit situation.


Screenshots of two 'LRT English' Facebook posts on July 11, 2022, with a high number of reactions



Analysed data



In this analysis, we limited the scope to articles connected to the sanctions imposed on Kaliningrad transit (13 articles) or to banned Russian goods in general (1 article), published in the period from June 1 to August 31, 2022. In total, 14 posts were analysed. These posts, published by the ‘LRT English’ Facebook page, were characterised by the highest total number of interactions, ranging from 982 to 8,389 (reactions, comments and shares combined). It should be stressed, however, that only 6 posts triggered an increased number of comments. The reason was that at some point the ‘LRT English’ Facebook page restricted the ability to comment on posts about the sanctions imposed on Kaliningrad, as its administrators stated that these content pieces had an inauthentically high interaction rate.


Data gathered for 14 problematic Facebook posts with the keyword "Kaliningrad" published on ‘LRT English’ Facebook page between June 1 and August 31, 2022


Automated analysis of accounts' country of origin



An analysis of accounts which interacted with 'LRT English' Facebook posts highlighted a wide variety of countries of origin. In some cases, manual review showed mismatches between the username and the account URL, generic profile images, a large number of locked accounts, and discrepancies with the area from which the account appeared to be managed. This was mainly the case with fake accounts whose creators wanted to hide their identities. Nevertheless, in our opinion, in most cases the usernames corresponded to the accounts' geographic-cultural background.


The Namsor application was used to automatically check the country of origin of the analysed accounts. For the purposes of the study, we narrowed down the number of accounts to those for which the program indicated at least 80% accuracy. This criterion allowed us to select 2,712 unique accounts, which left 3,514 interactions. Based on the data that we observed, the analysed posts about Kaliningrad gained responses from 91 countries. The results are presented on the map below.


World map highlighting the countries of origin of analysed Facebook accounts


The highest activity on posts about Kaliningrad came from Poland, Lithuania, Nigeria, Kenya and Bangladesh, in that order. While in the case of Lithuania and its immediate neighbour, Poland, the increased interest in the topic can be easily explained, the high number of interactions from African and South Asian countries is potentially more unnatural. Accounts from these countries typically rarely engage with content published on 'LRT English'.


Using the same data sample, we examined the distribution of interactions by region. It turned out that more than half of the accounts interacting with the posts were from Africa or Asia, meaning that accounts from these regions carried out most of the amplification of the 'LRT English' Facebook posts.


Distribution of interactions with the 'LRT English' posts by continent
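
As a rough illustration of the filtering and aggregation step described above, the sketch below applies the 80% confidence threshold and tallies interactions by country and continent. The file name, column names and continent mapping are assumptions made for illustration; they are not the study's actual pipeline.

```python
# Illustrative sketch only: column and file names are assumed, not the study's code.
# Input: one row per interaction, with the account id, Namsor's predicted country
# (ISO alpha-2 code) and Namsor's confidence score for that prediction.
import pandas as pd

interactions = pd.read_csv("lrt_english_interactions.csv")

# Keep only rows where the country prediction reached the 80% threshold
confident = interactions[interactions["country_probability"] >= 0.80]
print("unique accounts:", confident["account_id"].nunique())  # 2,712 in the study
print("interactions:", len(confident))                        # 3,514 in the study

# Interactions per predicted country of origin
print(confident["country_code"].value_counts().head(10))

# Aggregate to continents; a tiny hand-made mapping stands in for a full lookup table
CONTINENT = {"PL": "Europe", "LT": "Europe", "NG": "Africa",
             "KE": "Africa", "BD": "Asia", "PH": "Asia"}
by_continent = confident["country_code"].map(CONTINENT).value_counts(normalize=True)
print((by_continent * 100).round(1))  # share of interactions per continent, in %
```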


Since we identified a large amount of traffic from countries that do not typically interact with "LRT English" posts, we decided to conduct an in-depth analysis of a large sample of accounts. As a result, we identified signs of inauthentic behaviour among many of these accounts.



Manual review of sample dataset


We were able to check 848 accounts, of which more than half (56.9%) were found to have a high probability of being social media trolls or bots. Those accounts originated from 91 countries, mainly African and Asian states (77%). In total, the manually checked suspicious accounts left 947 reactions, which means that some of them engaged with 'LRT English' Facebook posts more than once.

Data gathered for 848 manually verified Facebook accounts which interacted with ‘LRT English’ Facebook posts


The top three countries of origin of the suspicious accounts were Nigeria, Kenya and the Philippines. Given documented instances of social media bot and troll farms operating in these areas before, this was another clue that the amplification of ‘LRT English’ posts was part of an artificial amplification campaign.


We adopted several 'flags' indicating that an account is likely inauthentic (i.e., a troll, bot, or hacked account): a very recent creation date, suspicious profile activity (e.g., years of inactivity or 'hyperactivity' with continuous posting), posting exclusively political content, a mismatch between the account name and the name in the URL (e.g., mismatched gender or geographic-cultural region), a suspiciously large number of friends (over 2,000), etc.
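
These flags lend themselves to a simple checklist. The sketch below is a minimal illustration of such a scoring heuristic; the fields, thresholds and dates are assumptions, and in the study the review was carried out manually by analysts.

```python
# Illustrative scoring of the manual-review 'flags'; fields and thresholds are assumed.
from dataclasses import dataclass
from datetime import date

@dataclass
class Account:
    created: date               # account creation date
    posts_per_day: float        # average posting rate
    only_political_posts: bool  # posts exclusively political content
    name_matches_url: bool      # display name vs. name embedded in the profile URL
    friends: int                # friend count

def suspicion_flags(acc: Account, today: date = date(2022, 8, 31)) -> list[str]:
    flags = []
    if (today - acc.created).days < 90:
        flags.append("very new account")
    if acc.posts_per_day == 0 or acc.posts_per_day > 50:
        flags.append("inactivity or hyperactivity")
    if acc.only_political_posts:
        flags.append("exclusively political content")
    if not acc.name_matches_url:
        flags.append("name/URL mismatch")
    if acc.friends > 2000:
        flags.append("suspiciously large number of friends")
    return flags

# An account matching several flags would be marked for closer manual review
example = Account(date(2022, 6, 20), 70.0, True, False, 4900)
print(suspicion_flags(example))
```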



Followers from social media bot farms in Nigeria



The analysis allowed us to identify patterns that revealed groups of suspicious accounts sharing certain characteristics. In the case of accounts originating from African states, such a factor was an unusually high and similar number of friends (around 4,900) on many of the accounts which responded to 'LRT English' posts. Most accounts with this pattern originated from Nigeria.


Screenshots of six Facebook accounts, originating from Nigeria, with between 4,500 and 4,900 friends


During the research, analysts also found a group of African-origin accounts dedicated to the spread of pro-Kremlin propaganda, as shown in the examples below [1,2,3,4]. It is worth noting that the involvement of these accounts in spreading ‘LRT English’ posts about Kaliningrad is further proof of inauthentic amplification of the content.


The in-depth analysis made it possible to identify a small group of accounts that are most likely bots. They were characterised by a similar style of name formatting, similar 'biographies', similar activity patterns, and interactions from other suspicious accounts on their profiles.


The suspicious accounts also shared other common features, such as the use of symbols in their 'biographies', quite often indicating that these are “Facebook VIP accounts”. The accounts also linked to their supposed other social media profiles; however, these links were either inactive or led to someone else's profiles. The same accounts sometimes provided their WhatsApp numbers - checking some of these phone numbers revealed that they were linked to scam chat rooms.


Most of the potential bot accounts from Africa had several thousand friends. The accounts also encouraged each other to add friends, sometimes commenting that they were testing “speed farms”, i.e. follower aggregators.


The likely inauthentic accounts seemingly form a huge network, which increases their ability to spread 'liked' content and amplify its reach. By researching the “friends” and “likes” of the suspicious profiles, it was possible to track Facebook groups acting as “follower farms”, such as a public group named “Follow To Follow Back Facebook Page” with 250 thousand members. It is noteworthy that accounts with thousands of friends were found among the sharers of the investigated posts about Kaliningrad.


Google Ads ‘Account renting’ in the Philippines



During the analysis of suspicious accounts, another interesting discovery was made, which can be taken as a clue towards uncovering the modus operandi of disinformation campaigns on social media. It was an advertisement for a way to earn cash, stating: 'Client will rent your Google Ads Account to put and run his ads from 9pm to 6am'. This activity is particularly popular in the Philippines. Although the ad is for Google Ads, similar 'teams' gather people willing to rent out their social media accounts, including Facebook. It is also worth mentioning that, although the name of the account sharing the ad was Filipino, the URL showed an obviously Russian-speaking name. The profile also had many Russian pages in the 'likes' section and provided profile links to social media widely used in Russia (Odnoklassniki and VKontakte). Both links were inactive.


An advertisement presenting the opportunity to be paid for 'renting' a Google Ads account


The extension of such activity to platforms like Facebook could explain why accounts that look 'private' and 'authentic' and have a long profile history become involved in interactions with 'LRT English' posts about Kaliningrad - and in disinformation campaigns in general.



Analysis of comments



For the analysis, we collected 8,936 comments written by 3,140 accounts under 6 posts. It is worth mentioning that some of the accounts commented multiple times: 49 accounts posted more than 20 comments, while 7 left more than 40 messages under the 'LRT English' Facebook posts.
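
A check like this amounts to counting comments per account and flagging outliers. A minimal sketch, assuming the comments are available as (account, text) pairs (the data structure is an assumption, not the study's code):

```python
# Minimal sketch: count comments per account and flag hyperactive commenters.
# The input format and placeholder data are illustrative assumptions.
from collections import Counter

comments = [("acct_1", "..."), ("acct_2", "..."), ("acct_1", "...")]  # placeholder data

per_account = Counter(account for account, _ in comments)
over_20 = [a for a, n in per_account.items() if n > 20]  # 49 accounts in the study
over_40 = [a for a, n in per_account.items() if n > 40]  # 7 accounts in the study
print(len(per_account), len(over_20), len(over_40))
```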


An analysis of a selection of comments revealed that they were not usually directly related to the topic of sanctions. Instead, they tackled issues on a more general level, spinning discussions about the West and the Russia-Ukraine war.


There was a significant number of both pro-Russian and pro-Ukrainian comments under the posts. Using Graphika's semantic analysis tool, we identified five significant clusters of comments. Among them was a group of 62 accounts characterising Ukraine's Azov battalion as a Nazi organisation. These accounts provided links to international media sources (e.g., CNN, The Guardian, Al Jazeera). The cluster stood out in terms of both its coherence and the pro-Kremlin character of its comments.


Narrative analysis technology developed by Graphika helped to identify common themes in the comment text, while its network analysis technology helped to identify the audiences engaging with these themes. Early findings include propaganda, anti-Ukraine and pro-Russia narratives, engaged with by audiences around the world, including coordinated actors who support the Assad regime in Syria and (to a lesser extent) Mexican journalists and researchers.
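
Graphika's narrative-detection and network-mapping tools are proprietary, so the sketch below is not their method; it is only a rough open-source illustration of the general idea of grouping comments into thematic clusters, using TF-IDF vectors and k-means on toy comments (all data and parameters are assumptions).

```python
# Illustrative only: a rough open-source analogue of thematic comment clustering.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

comments = [
    "Azov is a nazi organisation, even CNN wrote about it",    # toy examples,
    "Russia must restore transit to Kaliningrad immediately",  # not real data
    "Slava Ukraini, the sanctions are working",
    "The West provoked this war",
    "Lithuania is just following EU law on sanctions",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(comments)

k = 2  # the study reports five clusters; a toy corpus supports fewer
model = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)

# Inspect the top terms of each cluster centroid to label the themes
terms = vectorizer.get_feature_names_out()
for cluster_id in range(k):
    top = model.cluster_centers_[cluster_id].argsort()[::-1][:5]
    print(cluster_id, [terms[i] for i in top])
```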


Conclusions



  • The analysis showed that the abnormally high number of interactions on ‘LRT English’ posts about the EU sanctions imposed on Russia and the ban on the transit of selected goods to Kaliningrad via Lithuania was artificially generated to amplify this content.

  • DebunkEU.org analysed 14 problematic posts about the sanctions imposed on Kaliningrad, which received the highest number of total interactions among all ‘LRT English’ Facebook page posts published from June 1 to August 31, 2022. The selected posts were characterised by a significant overperformance in reactions, reaching up to 111 times more reactions than the average.

  • Automated analysis of the data using Namsor software allowed us to check the probable countries and geographical regions of origin of the accounts. Accounts with at least 80% accuracy were analysed, i.e., 2,712 accounts. They originated from a total of 91 countries, with the highest numbers from Poland, Lithuania, Nigeria, Kenya, and Bangladesh. From the perspective of geographic regions, 57% of interactions came from Africa and Asia.

  • Manual analysis of 848 accounts revealed more than half of them (56.9%) to be suspicious. Most suspicious accounts originated from African and Asian countries, with Nigeria, Kenya and the Philippines being the top three nations. These findings are consistent with reports of foreign actors exploiting bot and troll farms to manipulate social media.

  • The analysis of profiles revealed that the majority followed a similar pattern - a friend count of around 4,900. Moreover, a small cluster of African Facebook accounts, potential bots, interacting with ‘LRT English’ posts about the Kaliningrad sanctions was detected.

  • Manual research also allowed us to determine a potential modus operandi for campaigns such as the one analysed: ‘renting’ accounts for different purposes, e.g., Google Ads manipulation, through organised schemes in the Philippines. Some of the accounts seemingly administering such activity were identified as interacting with 'LRT English' Facebook posts. One of these accounts belonged to a Russian user disguising his identity under a Filipino surname.

  • An analysis of comments was conducted on the six available posts, which gathered 8,936 comments written by 3,140 accounts. Graphika’s narrative detection and social network mapping technologies revealed the topics of conversation under the posts, which included pro-Kremlin narratives depicting Ukraine as a Nazi state.

  • Graphika’s narrative analysis technology and network analysis technology allowed us to find anti-Ukraine and pro-Russia narratives engaged with by audiences around the world, including coordinated actors who support the Assad regime in Syria and (to a lesser extent) Mexican journalists and researchers.

The analysis of the artificially amplified 'LRT English' posts about the Kaliningrad sanctions revealed a new way of manipulating information on social media. Posts containing links to reliable LRT press materials became the target of an attack by foreign Facebook bot and troll farms. The presumed aim of the malign social media campaign was to increase pressure on international opinion and create fears about the possible implications of continuing the sanctions.


These goals coincide with the interests of Russia, which at the same time waged a political campaign to intimidate public opinion, threatening serious retaliatory measures. While the analysis did not show the direct involvement of state actors in manipulating 'LRT English' posts, it is worth noting that such actions would be in line with the goals of pro-Kremlin propaganda.

 

This analysis was carried out by DebunkEU.org analysts Aleksandra Michałowska-Kubś and Jakub Kubś.

 

Cover illustration: by The Daily Beast.

