During the election campaign for municipal councils and mayors, potential attempts to artificially boost content posted by certain candidates were observed on Facebook. In response to these possible infringements, the NGOs Debunk.org and White Gloves filed a complaint with the Central Electoral Commission of Lithuania.
Local elections in Lithuania / J. Stacevičius / LRT
Debunk.org, together with researchers from the Faculty of Communication at Vilnius University, identified cases of potential artificial amplification of content on Facebook. The cases concerned posts by Petras Gražulis, one of the candidates for Mayor of Vilnius, and created an impression of support and popularity for the politician that did not reflect reality.
The authenticity of interactions on social media is assessed using a conversion-funnel approach: of all users who view a post, only about 2-7% go on to comment, share, or like it. A post whose interaction count exceeds its view count is therefore not organically possible and points to the use of manipulation techniques.
In Mr Gražulis's case, 56 videos were uploaded to his personal Facebook account during the election campaign. In 49 of them, the number of social interactions exceeds the number of views, which would not have been possible without some form of manipulation: the 49 videos had a total of 26,141 social interactions but only 2,823 views. The amplification appears to have failed to increase visibility, but the attempted manipulation is itself a cause for serious concern.
For example, the post “I have taken into account people's suggestions, I have added one more sticker to my visor, for a Lithuania WITHOUT the conservatives”, published on 19 January by Mr Gražulis, has only 93 views but has received as many as 425 likes and 56 comments. It should be noted that Facebook counts a view only after 3 seconds of video playback, so an interaction count significantly higher than the view count is an indication of inauthentic activity and could be an attempt to game the algorithm so that more people see the content. Similar cases have been identified in other posts by this candidate for Mayor of Vilnius.
19 January 2023: video on Gražulis’ Facebook account with 425 reactions, 56 comments, and only 93 views
14 January 2023: video on Gražulis’ Facebook account with 1.3 thousand interactions, 221 comments, and only 466 views
23 January 2023: video on Gražulis’ Facebook account with 423 reactions, 40 comments, and only 234 views
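The views-vs-interactions heuristic described above can be sketched in a few lines of code. This is a minimal illustration, not the tool Debunk.org used; the data structure is an assumption, and the interaction totals for the three example posts are taken from the figures cited above (the 14 January total is the approximate "1.3 thousand" reported).

```python
# Minimal sketch of the conversion-funnel heuristic: organically, only
# ~2-7% of viewers interact (like, comment, share) with a post, so total
# interactions should always be well below total views. A post where
# interactions exceed views is a strong signal of artificial amplification.

def is_suspicious(views: int, interactions: int) -> bool:
    """Flag a post whose interaction count exceeds its view count."""
    return interactions > views

# The three example posts cited above: date -> (views, total interactions).
# Interaction totals are approximations built from the reported figures.
posts = {
    "2023-01-19": (93, 425 + 56),    # reactions + comments
    "2023-01-14": (466, 1300),       # ~1.3 thousand interactions reported
    "2023-01-23": (234, 423 + 40),
}

for date, (views, interactions) in posts.items():
    flag = "SUSPICIOUS" if is_suspicious(views, interactions) else "ok"
    print(f"{date}: {interactions} interactions vs {views} views -> {flag}")
```

All three example posts trip the check, matching the report's conclusion that the interaction counts could not have arisen organically.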
The use of social media manipulation during election campaigns is dangerous because it creates an illusion that particular politicians or parties have a high level of support. In this way, public opinion and people’s decisions are influenced, which has a direct impact on the outcome of the elections.
In this case, the possible use of bots to artificially inflate the visibility of posts on Facebook violates the fundamental principles of elections, which are regulated by law: transparency, fairness, integrity, and fair competition.
Voters need to be aware that candidates may use illegal means of manipulating social media platforms to promote their posts. Moreover, political advertising disseminated by a candidate must be truthful, as it may influence voters' choices. In this case, distorting the metrics of posts published on Facebook creates an impression of greater support, which may mislead undecided citizens.
The organisation White Gloves has recorded a growing number of online political campaigning violations in every election. While campaigning infringements can generally be identified and legally sanctioned, there is limited scope for promptly detecting and proving artificial amplification of posts on social networks. This is the first time such techniques have been used in electioneering, and similar cases, perhaps on a larger scale, are likely in the future. This poses a major threat to electoral transparency and fair competition between candidates. The current legal framework provides no instruments to identify such irregularities and hold those responsible accountable.
Although the use of social media bots for political campaigning on social networks is currently not separately regulated, the concept of manipulation of an online platform and its prohibition and sanctions are planned to be included in the Law on Public Information of the Republic of Lithuania, thus giving greater importance to the above-mentioned acts. This would also be an important step in the run-up to the 2024 presidential, Seimas, and European Parliament elections.
By submitting a complaint to the Electoral Commission, White Gloves (Lit. Baltosios pirštinės) and Debunk.org aim to urge the authorities to take all available legal measures, or, where such measures are absent, to initiate amendments to legislation, so that in the future the manipulation of online platforms by artificially inflating the visibility of content can be prevented and the responsible parties held accountable in accordance with the procedure laid down by law. The organisations also urge the Electoral Commission to open an investigation together with Meta.
Online platform manipulation - increasing the dissemination of content on an online platform by means of automatically created or automatically controlled accounts, or groups of such accounts, with the aim of artificially inflating the number of views, comments, shares, likes, followers, and/or subscribers so that the platform's algorithms prioritise the content and/or accounts and show them to a large number of users.
Social media bot (an automated fake account) - an account that mimics human activity, but at a speed and scale that real social network users cannot achieve.
Troll or sock-puppet account (fake account run by a real person) - a person behind a fake account who purposely posts content on demand (e.g., negative or positive comments on an issue).
Troll or bot farms (clusters of fake accounts) - fake accounts operate in a systematic, coordinated way to spread a particular message and/or increase the visibility of content in order to manipulate public opinion.
Methods of creating fake accounts:
Automation – using AI-generated photos; auto-generated names; auto-generated accounts with little personal information.
Identity theft – using photos of real people; real accounts are simulated; accounts are 'mature' and contain a lot of information; because of the detail of the profile, these accounts appear authentic to social network platforms.
The report was prepared and first published on the DebunkEU.org website.