A curious thing happened in Romania. A candidate widely regarded as unpopular scored the best result. How did this happen? According to declassified documents, influence campaigns and TikTok may have been instrumental.
The Romanian documents reveal a digital campaign, involving individuals with links to Russia, to promote Călin Georgescu. The most fascinating technique - which I also cover in my book Propaganda - is the use of subliminal tactics. The campaign proved so consequential that the Romanian Constitutional Court has just annulled the first round of the presidential election.
Subliminal messaging and tactics
Initially, the campaign employed subliminal techniques to subtly influence public opinion without directly mentioning Călin Georgescu. Content primarily focused on motivational themes, emphasising traits like leadership, renewal and national pride. Hashtags, visual cues and emotionally charged imagery were strategically used to link these values to Georgescu’s candidacy. As the campaign progressed, the candidate’s name began to appear more explicitly, but still within a broader emotional framework that built on the initial emotional and promotional groundwork.
Paid influencers and campaign strategy
The campaign hinged on influencer-for-hire platforms, which offer to engage influencers to, well, promote things. It relied heavily on paid influencers and coordinated social media efforts to maximise its impact. Some influencers were fully aware of their participation, receiving financial compensation via platforms like FameUp and FA Agency. Others participated unknowingly by sharing pre-drafted content.
Clear instructions were provided to influencers, guiding them on specific hashtags, profiles and themes related to the "ideal President" model. This ensured uniformity in messaging and helped amplify Călin Georgescu's narrative.
Overall campaign strategy and execution
The overall campaign strategy was highly coordinated, with more than 25,000 TikTok accounts activated around two weeks before the election. This effort was not spontaneous but carefully orchestrated. Interestingly, 797 of the accounts had been created as early as 2016 and remained inactive until the campaign ramped up. The accounts were assigned unique IP addresses, indicating a deliberate effort to conceal the true scale of the network. This approach bypassed typical botnet detection patterns, which often look for many accounts operating from the same address.
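To illustrate why per-account IP assignment matters, consider a minimal sketch of a naive detection heuristic that clusters accounts by shared IP address. Everything in it - the account IDs, the addresses and the threshold - is hypothetical, and real platform defences rely on far richer signals; the point is simply that a network giving each account its own IP never trips this kind of check.

```python
# Illustrative sketch only: a naive coordination check that flags accounts
# sharing an IP address. Account IDs, IPs and the threshold are hypothetical;
# real platform detection uses far richer signals (devices, timing, content).
from collections import defaultdict

# Hypothetical (account_id, ip_address) observations
observations = [
    ("acct_001", "203.0.113.5"),
    ("acct_002", "203.0.113.5"),   # shares an IP -> flagged by this check
    ("acct_003", "198.51.100.7"),  # unique IP -> invisible to this check
]

def flag_shared_ip_clusters(observations, threshold=2):
    """Group accounts by IP and flag IPs used by `threshold` or more accounts."""
    accounts_by_ip = defaultdict(set)
    for account, ip in observations:
        accounts_by_ip[ip].add(account)
    return {ip: accts for ip, accts in accounts_by_ip.items() if len(accts) >= threshold}

print(flag_shared_ip_clusters(observations))
# {'203.0.113.5': {'acct_001', 'acct_002'}}
# A network that assigns every account its own IP produces an empty result here.
```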
The activity of these accounts was coordinated outside TikTok, primarily on Telegram and Discord, where recommendations for bypassing TikTok’s verification mechanisms were discussed.
Despite the Central Electoral Bureau's (BEC) decision requiring the removal of electoral content that lacked proper identification, TikTok’s response appears to have been inadequate. TikTok claimed that it took down the requested content; however, it remained visible and accessible to the Romanian public, even after the campaign ended and on election day, in violation of Romanian law. Furthermore, TikTok categorised some of these electoral posts as entertainment content, allowing them to be widely displayed. Curiously, TikTok appears to contradict the declassified documents, claiming that the number of accounts it took down was orders of magnitude smaller. These discrepancies cannot be explained easily today. Perhaps the suspect accounts were classified differently by the security services or by TikTok itself.
Campaign finance is also at issue. Călin Georgescu declared a null (0 lei) electoral budget to the Romanian electoral authority, claiming that no funds were spent during his campaign. Evidence revealed, however, that financial support was provided by a Romanian citizen, Bogdan Peșchir, who funded the promotion of Georgescu's candidacy on TikTok.
The response
Detailed sociological research on trends in public opinion, as well as the political outlook of specific parties and candidates in the target states, is being conducted to identify vulnerabilities and the capacity to respond. The focus has been on information aggression, including the use of artificial intelligence to create propaganda content. This is, in fact, typical of political PR and marketing campaigns, and it underlines the professional approach behind the information influence. Such activities cannot be downplayed by blaming them on a bunch of fringe groups or amateur trolls.
Declassified reports from the Serviciul de Informații Externe (SIE), Romania's Foreign Intelligence Service, suggest that Moldova was a testing ground for these strategies: the manipulation of information was tailored to specific demographic groups and populations, and content was adapted to reflect political developments.
If steering election results in a medium-sized state is possible, it’s a clear cause for concern. With elections in Germany, Poland, and perhaps France in 2025, this highlights the need to strengthen informational resilience in both civilian and military structures.
In Romania, TikTok is used by nearly 50% of the population, making it a potentially significant platform for influence. However, attributing an entire election result to a TikTok campaign requires careful scrutiny.
This is a high-stakes case on several levels, and the Romanian case raises further concerns. Cancelling election results is unprecedented, and it prompts questions about whether such a strong and rapid response was justified. The initial elements of the Romanian Constitutional Court's opinion do not unambiguously explain the basis for this decision. The Court said the following:
“In the case at hand, the Court notes that, according to the above-mentioned 'Information Notes,' the main aspects of the electoral process regarding the election of the President of Romania in 2024 are those related to the manipulation of voters' votes and the distortion of equal opportunities for electoral competitors, through the non-transparent use of digital technologies and artificial intelligence in the electoral campaign, in violation of the electoral legislation, as well as through the financing of the electoral campaign from undeclared sources, including online.”
Setting a precedent where elections can be overturned due to vague references to “disinformation” or “AI use” (mentioned six times) risks undermining citizens' choices. While these tools can effectively produce and deliver propaganda, can they truly swing an entire election?
This article was originally published on Dr Lukasz Olejnik’s Substack and can be viewed here.
To find out more about influence operations and subliminal tactics, read Dr Lukasz Olejnik’s book, Propaganda: From Disinformation and Influence to Operations and Information Warfare. Lukasz Olejnik is active on X and Bluesky.