
Russian cyber and information warfare and its impact on the EU and UK

Dr Lukasz Olejnik

Visiting Senior Research Fellow

15 April 2025

Cyberwarfare is a geopolitical tool, and Russia uses it as an instrument of statecraft. Cyberattacks linked to Russian actors have affected European countries and the UK, raising concerns about security, economic stability and democracy. Information operations likewise pose a systemic risk. These are long-term activities and cannot be reduced to the mundane sending of individual messages.

Russia's reputation in cyberwarfare

Russia is a leading cyber actor. Notable incidents include the Bundestag hack (2015) and the Macron leaks (2017), both of which were executed remotely. But there have also been close-access operations. Reports of Russian infiltration attempts, such as the targeting of the Organisation for the Prohibition of Chemical Weapons (OPCW) in the Netherlands, highlight its aggressive approach.

Development of Russian cyber capabilities

Russia has a strong foundation in cyber, with state agencies cultivating cyber expertise. Intelligence services such as the GRU, the FSB and the SVR conduct operations involving cyber-espionage, sabotage, and disinformation campaigns. We know this from numerous public reports, including those from the US Department of State and the UK's National Cyber Security Centre.

Methods of Russian cyberattacks range from cybercrime for monetary gain, such as ransomware attacks and financial fraud by state-tolerated cybercriminals, to cyber espionage and disruption targeting infrastructure, government institutions, and corporations. The NotPetya malware (2017) caused severe financial damage and supply chain disruptions. Another major tactic is disinformation and psychological operations, where digital propaganda, bot networks and manipulated media influence public opinion in Europe.

The effects of Russian cyberattacks go beyond system disruptions. Infamously, NotPetya caused billions in damages and led to Western sanctions against Russia for its reckless activities in cyberspace. Russian disinformation campaigns have aimed to influence political processes and elections in various states, undermining trust in institutions. Such interference played a role in past elections in France (2017), Germany (2025), and the US (2016, 2020, 2024). Concerns persist about elections in the EU in 2025 and increased tensions across Europe. This is not a drill.

[Image: comic showing the steps of digital subversion: demoralise, discredit, and divide]

Threat to democratic stability

It's important to note that disinformation does not aim for instant change; that is not possible. Opinions cannot be shifted with one-off messages, and no recipient is suddenly or magically infected or turned into an info-zombie upon reading a single email, tweet, or other message. The real mechanism is gradual influence: subtle shifts in narratives and the fostering of divisions.

Russian cyber operations function through a complex ecosystem of state-employed cyber operators, state-sponsored hacking groups, and cybercriminals acting with the government's approval. Intelligence agencies directly manage these operations, which include espionage, sabotage, and disinformation. Meanwhile, state-sponsored hacker groups operate in strategic alignment with Russian objectives, often serving as deniable assets for more aggressive operations.

Additionally, cybercriminal networks are tolerated or even assisted by the Russian state, executing financial crimes and ransomware attacks while occasionally conducting operations that serve state interests. The blurred lines between these actors complicate attribution, allowing Russia to maintain plausible deniability while maximising the impact of its cyber operations. Tactics of information operations include fake media outlets and mirrored sites that mimic the look and feel of outlets like The Guardian (while the content bears no resemblance to the original), as executed by the Russian Doppelgänger operation. Other activities include automated bot networks, hired human trolls, and paid influencers working under their real names.

Foreign Information Manipulation and Interference (FIMI)

“Our information space has become a geopolitical battleground,” stated Kaja Kallas, the EU foreign policy chief, in the third report on foreign information manipulation and interference (FIMI).

Between November 2023 and November 2024, 505 FIMI incidents were recorded, involving 38,000 channels across 25 platforms. These incidents targeted 90 countries and 322 organisations, and produced over 68,000 pieces of content. The infrastructure behind them included official state media, covert networks, state-aligned proxies and unattributed channels. Information influence operations rely on a layered and scalable system.

Russia remains the most active actor. Around half of the incidents in the dataset (257 of 505) targeted Ukraine. France (152), Germany (73) and Moldova (45) also faced sustained targeting. Russian campaigns are highly adaptable, tailored to local languages and contexts while consistently aligned with geopolitical goals. Germany and France were hit with localised campaigns, while Ukraine, Moldova, Poland and the Baltic States remain key focus areas due to their geopolitical importance.

Content localisation was used in 349 incidents, adjusting narratives to match regional culture, language, and current events. Impersonation tactics were widespread: 124 cases involved faked (mirrored) sites designed to mimic legitimate media, while 127 cases involved impersonating institutions or public figures. The goal is to amplify messages and erode trust. 73% of active FIMI channels (around 28,000 out of 38,000) were short-lived or disposable accounts – bot networks used for coordinated inauthentic behaviour. The X platform accounted for 88% of observed activity, which is unsurprising given that X is a key space for opinion-shaping.

Artificial Intelligence (AI) also played a role in at least 41 incidents. It was used to optimise the generation of inauthentic content, such as deepfakes and synthetic audio, and to automate large-scale dissemination. AI does not necessarily lead to greater impact; the point is to lower costs and boost output. Currently, the use of AI by threat groups appears fairly basic and rudimentary. I assess that most, if not all, cyber threat actors and information manipulators are experimenting with AI, but few have adopted it as a standard tool in their operations. That shift is inevitable, though it's unlikely to happen through large, server-side AI models like those offered by Google Gemini or OpenAI GPT.

Operations targeting elections usually begin well before the vote and extend well beyond it. The goal is to erode trust in the process rather than to back specific candidates.

Such operations are not random trolling. They are well-designed and well-executed influence campaigns. Their point is not to send individual messages but to run a long-term process. Only then can propaganda succeed.

What's the aim of information operations?

Russian information warfare seeks to weaken support for Western policies and create divisions within NATO.

Even if a Ukraine peace agreement is reached, information warfare will persist. There's no need to go as far back as the 19th or even the 20th century: informational conflict is a continuous phenomenon, and the way it works can be tracked in real time today. Ukraine itself experienced extensive information operations between 2014 and 2022. When armed conflict ends, the conflict may simply shift into other domains.

Influence on elections

Digital propaganda or disinformation need not be as blunt as supporting or opposing a particular candidate in an election. It may adopt a cleverer approach, targeting divisive social and political issues. States could see narratives on migration, the economy, or sovereignty manipulated to create discord. Of particular note are issues of current interest in the targeted state; there is no need to discuss outdated issues that have faded from public consciousness. Discussing a hotly contested topic from the 1990s in the 2020s is ineffective.

In general, instead of adopting or promoting a blunt for-or-against stance, or even dealing directly in political content, propaganda may use a smarter approach focused on 'issues'. Issues may include gender equality, global warming, vaccination, the economy, price increases, culinary preferences, and so on. The propagandist then plays both sides, deploying some assets to support a cause and others to oppose it. Et voilà: a recipe for playing on existing topics.

It's a popular misconception that trolls or propagandists create topics or themes. It's much more cost-efficient to play on existing ones.

How to defend against these activities?

EU and UK responses to Russian cyber threats

Enhancing cyber defences involves strengthening cybersecurity frameworks, improving incident response, and ensuring resilience against attacks. Legislative and regulatory measures must be enforced to combat cybercrime and disinformation effectively. Sanctions and diplomatic actions serve as deterrents, targeting Russian entities responsible for cyberattacks. Additionally, increasing public awareness and media literacy is crucial in making societies more resilient to disinformation tactics. A major challenge remains offensive cyber operations. While the EU can coordinate defence, offensive actions like hacking back remain under the jurisdiction of individual member states, leading to policy disparities.

To further strengthen defences against foreign influence, the UK has implemented the Foreign Influence Registration Scheme (FIRS). This system requires individuals (in some cases including journalists) and organisations (in some cases including universities, NGOs, charities and the media) acting on behalf of foreign powers to register their activities, as a transparency and accountability measure.

Conclusion

Russian cyberwarfare threatens EU and UK security and democracy. Espionage, cyber sabotage, and disinformation are key tools. Strengthening cyber defences, enforcing regulations, and countering disinformation are critical. 

The EU and UK must be proactive, adaptable and united to counter evolving cyber threats. But there is much more than simply cyber. Recently, suspected Russian plots to deliver explosive parcels were uncovered. Lithuania and Poland have announced official investigations into Russian sabotage operations aimed at physically destroying property such as shopping centres, activities which have also affected the UK. Are Europe and the UK prepared to deal with untrained individual saboteurs, hired at low cost and coordinated through instant messengers like Telegram?

Lukasz Olejnik is an independent researcher and consultant in cybersecurity and privacy, with a strong focus on multidisciplinary and interdisciplinary research.

He holds a PhD in Computer Science from INRIA, France, and an LLM in Information Technology Law from the University of Edinburgh. He integrates knowledge and skills from academia, industry, technology, research, policy and law.
