Cybersecurity Challenges of the 2026 Hungarian Parliamentary Elections
Risk Analysis and Information Warfare Scenarios
TL;DR
The information environment of the 2026 Hungarian parliamentary elections is expected to be defined by four main risk factors:
The synergy of domestic government and foreign (primarily Russian, American, and Chinese) disinformation, creating a coordinated, mutually reinforcing narrative space.
Networked propaganda that bypasses the ban on paid political advertising, spreading through hard-to-trace, pseudo-civilian structures similar to "Digital Civic Circles," built on personal connections.
The deployment of generative artificial intelligence (deepfakes), which, with a well-timed, difficult-to-debunk forgery in the final stretch of the campaign, could significantly influence voter decisions.
The systematic erosion of trust in the electoral process and its institutions, aimed at questioning the democratic results and inciting social tensions on and after election day.
The Digital Battlefield of the 2026 Elections
Preparations for the 2026 Hungarian parliamentary elections are taking place in a digital and political environment that is fundamentally different from previous ones. The situation is characterized simultaneously by deep-rooted domestic political polarization, the transformation of technology platform regulations, and the rapid development of the information warfare toolkit. We attempt to map the cybersecurity and information threats that may define the next election cycle using an experience-driven, timeline-based risk model. It is important to emphasize that this analysis is not a prophecy, but a probability-based threat scenario built on past events and current trends.
Specifics of the Hungarian Political and Media Environment
Research uniformly describes Hungary's political climate as one of deep and lasting polarization. Analyses by Policy Solutions and Political Capital point out that society has fractured into political factions, which is coupled with a low level of general and institutional trust. This environment makes voters extremely susceptible to disinformation campaigns, as political identity and group affiliation often override the need for fact-based information.
This situation is further exacerbated by the extraordinary asymmetry of the Hungarian media market. The Central European Press and Media Foundation (KESMA), as a conglomerate of about 500 media outlets, supplemented by the state media (MTVA), holds a dominant position, especially in reaching rural and older demographics. The 2022 election observation report by the Organization for Security and Co-operation in Europe (OSCE) also highlighted that the governing party's media superiority, the blurring of state and party communication, and opaque campaign financing violated the principle of equal opportunity.
The New Era of Digital Campaigning
The digital battlefield is being fundamentally redrawn by the decision of the largest technology platforms, Meta (Facebook) and Google, to restrict or eliminate the display of paid political advertisements in the European Union from the autumn of 2025. This move seemingly reduces the role of financial dominance, but in reality it raises the value of more sophisticated, network-based influence techniques. Instead of raw advertising spending, the emphasis is shifting to organic reach, community organizing, database building, and micromobilization. In this changed environment, actors with better organization and deeper activist networks may gain an advantage.
In parallel, the spread of generative artificial intelligence (AI) is revolutionizing disinformation creation. The production of deepfake videos and audio recordings has become orders of magnitude cheaper and simpler, allowing malicious actors to create and disseminate extremely believable, difficult-to-debunk forgeries. The use of this technology for campaign purposes was already observed in the 2023 Slovakian elections, serving as a cautionary tale for the future.
A further peculiarity of the Hungarian political environment is that foreign—primarily Russian and Chinese—disinformation efforts do not appear as classic external interference, but as a symbiotic reinforcement of existing government narratives. The Kremlin's strategic goals (disrupting Western unity, weakening support for Ukraine) align perfectly with the Hungarian government's "pro-peace" rhetoric. Hungarian public media and the pro-government press regularly give space to pro-Kremlin narratives, while Chinese propaganda also enters the domestic public sphere through news exchange agreements with Chinese state media. Elements of American governmental communication, in turn, are absorbed into Hungarian government narratives, with government members and influencers amplifying the messages. What emerges is therefore not an external attack but an internal-external information coalition: the boundaries blur, and the public is influenced and misled through talking points delivered directly by the government or by government-affiliated communicators.
The Ecosystem of Domestic Information Warfare
The digital battles of the 2026 elections will be defined by a complex, multi-level, and centrally coordinated information ecosystem. Its elements range from the ruling party's latest network initiatives and its massive media empire to "grey zone" actors operating on social media with non-transparent funding, all held together by a common framework narrative built on fear-mongering.
The "Digital Civic Circles" (DPK): A New Tool for Bypassing Ad Bans and for Micromobilization
Prime Minister Viktor Orbán announced the launch of the Digital Civic Circles (DPK) in July 2025, at the Tusványos summer university, with the aim of strengthening the right-wing, conservative community in the online space he described as "hostile territory." Although the initiative appears on the surface to be a network of simple Facebook groups, the underlying strategy is far more sophisticated. The DPK is a multifunctional system that simultaneously serves database building, direct newsletter-based communication, and the circumvention of advertising restrictions, acting as an echo chamber for the political narrative.
The system is built on the logic of referral marketing: members are encouraged to spread pro-government content within their own circle of acquaintances through personal messages. This method is more effective than traditional advertising in three respects. Firstly, it bypasses the political ad bans of Meta and Google, as the content is shared not by the party but by private individuals. Secondly, through personal recommendation, the messages appear more credible and are more likely to reach users who would otherwise reject direct political communication. Thirdly, and most importantly, it conceals the operation of bot networks and troll farms timed to the campaign period. This practice, in which a centrally directed campaign is disguised as a grassroots, civil initiative, is a classic case of astroturfing.
Astroturfing is a deceptive marketing or propaganda technique in which an organization or individual simulates a seemingly spontaneous, grassroots social movement to influence public opinion. The goal is to make people believe that a particular product, service, or political stance is widely supported, when in fact it is an organized, often paid, campaign. The term astroturfing refers to the AstroTurf brand of artificial grass, alluding to the faking of "grassroots" movements.
The Governmental Communication Machine: The Coordinated Operation of KESMA, Public Media, and "Grey Zone" Actors
The Hungarian information space operates according to a functionally differentiated, three-tiered model that allows for the effective and widespread dissemination of government narratives while minimizing political and legal risks.
Official Level: The task of government communication and public media (MTVA) is to convey positive, constructive narratives (e.g., economic successes, family support, pension increases) and more moderate political messages. The public media, which engages in open government propaganda, creates a basis of legitimacy for the system.
Mainstream Pro-Government Media: The KESMA empire, which includes several hundred media outlets, amplifies government messages and begins to outline and thematize enemy images (Brussels, George Soros, the domestic opposition). This huge media holding is largely sustained by non-transparently distributed state advertising funds, which the European Commission also considers to have a market-distorting effect.
The "Grey Zone": Actors that are formally independent but in reality financed from public money or non-transparent sources, such as the Megafon Center and its affiliated influencers, do the "dirty work." They are responsible for the most aggressive campaigns based on personal smears and open disinformation on social media. Research by Political Capital and the Mérték Media Analysis Workshop has shown that Fidesz and its coterie spend orders of magnitude more on online advertising than the entire opposition combined, and a significant portion of this spending is directed at spreading negative, enemy-image-building narratives. This three-tiered division of labor allows the government to formally distance itself from the harshest attacks, while they maximally serve its political interests.
The method is simple: disinformation originating from the grey zone is picked up, quoted, and referenced by government actors. Once it enters the mainstream media space, the fake news and its source are legitimized and elevated, criticism of the content is dismissed as weightless or ridiculous, and the critics themselves are often discredited. Finally, the official level incorporates it into its own communication, depending on how strongly the topic resonates with voters.
Note: As a Hungarian peculiarity, it can also be observed that ruling party politicians use their perceived or real credibility to generate grey-zone topics and fake news themselves.
The following table summarizes the main actors and characteristics of the Hungarian disinformation ecosystem.
The "Peace vs. War" Framing Narrative: Psychological Impact Mechanisms and Strategic Goals
A central element of the government's communication is the framing narrative built on the "peace vs. war" dichotomy, which already proved its effectiveness during the 2022 election campaign. The essence of the strategy is to position the government as the "sole representative of peace," while casting its political opponents—both the domestic opposition and Western allies—as irresponsible, "pro-war" actors who would drag the country into armed conflict.
This communication is built on fear-mongering as a political tool. In uncertain times laden with crises, voters are more receptive to simple, clear messages and the image of a strong leader who will protect them from external threats. The narrative reduces a complex geopolitical situation to a simplistic moral choice ("he who is not with us is with the war"), exploiting people's cognitive biases and their aversion to complexity.
The strategy's effectiveness is enhanced by its deep resonance with historical grievances and traumas rooted in the Hungarian national identity, particularly the Trianon syndrome. The topoi of the "abandoned nation," the "diktats of Brussels," and "sovereignty threatened by external powers" are cultural codes that are easily activated and, when linked with the "peace" narrative, elicit strong emotional reactions from a significant portion of the electorate.
Characteristics of the Digital Battlefields: Platform-Specific Strategies and Risks
The regulations restricting political advertising, which will come into effect in 2025, are fundamentally reshaping the channel ecology of political communication. Instead of paid advertising, the focus is shifting to organic reach, network building, and platform-specific content creation. Each social media platform has a different user base, operational logic, and moderation practice, so political actors must apply differentiated strategies on them.
Facebook / Instagram: The Center of Network Mobilization
Tactics: Meta's platforms remain the most popular in Hungary, with 68% of the adult population using Facebook daily. Political communication here focuses on community organizing and network-based distribution. The most prominent tools for this are group-driven distribution models, like the "Digital Civic Circles." These groups, seemingly grassroots but actually centrally coordinated, use astroturfing to bypass advertising bans, as content is shared by private individuals, authenticating it as a personal recommendation. Additionally, live streams create the appearance of direct, unfiltered communication, while meme campaigns serve the effective spread of fast, emotion-based, and hard-to-refute messages.
Advantage: The platform's greatest strengths are its vast, pre-existing user base and the potential for rapid response. A coordinated reaction to a political event or news item can be generated within hours, dominating news feeds.
Risk: The biggest risk lies in the difficult-to-control organic distribution. Disinformation spreading in closed groups and private messages is almost invisible to fact-checkers, and by the time it reaches the public sphere, it may have already caused significant damage.
TikTok: Reaching the Youth and Viral Potential
Tactics: TikTok is the key platform for reaching the youngest voter demographics. Political actors here communicate in short, video-based narratives, often packaging their messages in entertainment or lifestyle content. Influencer collaborations play a prominent role, where well-known opinion leaders promote a political agenda, often through hidden sponsorships.
Advantage: The platform's algorithm-driven nature holds rapid viral potential; a single successful video can achieve millions of views in a short time without the user needing to follow the politician.
Risk: TikTok has fewer fact-checking resources than Meta's platforms, and due to rapidly changing trends, tracking and refuting disinformation is more difficult. The blurring of political messages and entertainment content makes identifying propaganda challenging.
YouTube: The Appearance of Credibility and Lasting Impact
Tactics: YouTube is the platform for longer, more in-depth content. Here, political actors build their image with documentary-style narratives and explainer videos. These videos are suitable for constructing more complex enemy images or presenting political programs in greater detail.
Advantage: The more professional production quality and longer format can create a higher sense of credibility for the viewer than a quick TikTok video or a Facebook meme. Furthermore, the content remains permanently available and searchable, allowing it to influence public opinion over the long term.
Risk: The spread of YouTube videos is slower, and their production requires more resources. However, due to their lasting impact, a well-made piece of disinformation can circulate on the platform for years, continuously poisoning the discourse.
Telegram / WhatsApp: Closed-Channel Panic-Mongering
Tactics: Encrypted messaging apps form the most closed information ecosystem. The main tactic here is closed-group panic-mongering and the dissemination of fake news disguised as "confidential," internal information. The content is often spread as context-free screenshots or forwarded messages ("forward chains"), making it impossible to identify the source.
Advantage: The distribution is uncontrollable and reaches large masses extremely quickly without being filtered by any moderation or fact-checking.
Risk: These channels are almost completely undetectable from the outside. The deepfake audio recording circulated before the 2023 Slovakian elections also began to spread in such closed groups, demonstrating the danger these platforms pose to the integrity of elections.
X (Twitter): Shaping Elite and International Discourse
Tactics: Although its user base in Hungary is more limited, X (formerly Twitter) plays a key role in addressing journalists, political analysts, and the international public. The main tactics are hashtag campaigns and the manipulation of trending topics, which can artificially place an issue on the agenda.
Advantage: The platform's open nature provides easy international visibility, making it suitable for conveying domestic political messages to the foreign press and international organizations.
Risk: Due to its moderate Hungarian audience, the platform is less suitable for broad domestic mobilization, but it remains an important battlefield because of its influence on opinion-forming elites.
Information Attack Scenario for the 2026 Elections: A Timeline-Based Risk Model
The model is based on past election experiences, international examples, and current technological and political trends.
The scenario is not linear but consists of mutually reinforcing, interlocking phases, where the operations carried out in each stage prepare and support the subsequent ones.
The table below summarizes the main elements of the risk model.
Preparatory Phase (Late 2025 – January 2026): Laying the Foundation for Narratives
During this phase, the goal is the long-term shaping of public opinion and the construction of the necessary infrastructure for information operations. The activities are less conspicuous but are of strategic importance.
Narrative Seizure: Attackers do not necessarily create entirely new narratives but rather connect to and amplify existing, deeply embedded social and political fault lines. Russian disinformation channels may amplify critical voices against the EU and NATO, anti-migration sentiments, and the "pro-peace" stance on the war in Ukraine, which aligns with government communication. Chinese influence, through its collaborations with Hungarian public media, may focus on building a positive image of China and dampening Western criticism. American government propaganda communication serves as a ready-made reference point when communicating Hungarian government positions. The main goal is to observe which topics resonate with voters and to flood the domestic information space with a high volume of false news and distortions, so that voters can no longer, and eventually no longer want to, distinguish the real from the fake. The ultimate goal is to blunt critical thinking.
Infrastructure Building: Hungarian-language "news portals" and social media pages may be established, operated from abroad, which build a significant follower base with neutral, entertaining, or clickbait content before the campaign period. As the campaign intensifies, these pages will gradually switch to disseminating political content and disinformation.
Infiltrating Influencer Networks: Opinion leaders may be activated, especially on platforms targeting young people (TikTok, YouTube, Telegram), who spread political messages as local faces and seemingly credible sources. Their financing may come from non-transparent domestic (e.g., foundations operating on public funds) or foreign sources.
Campaign Launch (February – March 2026): Destabilizing the Political Space
With the start of the official campaign period, operations become more intense and targeted. The aim is to discredit political competitors and to create uncertainty among voter groups.
Targeted Disinformation: Using available databases and social media profiling capabilities, specific demographic groups are targeted with tailored fake news.
Retirees: Alongside the well-established narrative that "Brussels will take away the 13th-month pension," fake news may appear about an EU-planned "pension rationalization" or an impending economic collapse, against which only the current government offers protection.
Youth: As part of the "war" narrative, the threat of compulsory military service may be amplified should the opposition come to power. News about forced conscription in Ukraine, already a theme in pro-government media, can easily be turned into a domestic political weapon.
Activation of Bot and Troll Networks: Coordinated bot and troll armies are activated on Facebook and X (formerly Twitter), which artificially amplify certain topics, flood comment sections, and discredit critical voices.
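Defensively, this kind of coordinated amplification leaves a statistical footprint: many accounts pushing near-identical text within a narrow time window. The sketch below illustrates the core idea of such a detector; the post format, normalization rules, and thresholds are illustrative assumptions, not any platform's actual method.

```python
import re
from collections import defaultdict

def normalize(text):
    """Collapse case, punctuation, and whitespace so trivially varied
    copies of the same message map to a single key."""
    return " ".join(re.sub(r"[\W_]+", " ", text.lower()).split())

def coordinated_clusters(posts, min_accounts=5, window=3600):
    """posts: iterable of (account, unix_timestamp, text) tuples.
    Flags normalized texts pushed by at least `min_accounts` distinct
    accounts within any `window`-second burst -- a crude signal of
    coordinated inauthentic behavior, not proof of it."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[normalize(text)].append((ts, account))
    flagged = []
    for key, items in by_text.items():
        items.sort()
        for start_ts, _ in items:
            # Count distinct accounts posting this text inside the window.
            accounts = {a for t, a in items if start_ts <= t <= start_ts + window}
            if len(accounts) >= min_accounts:
                flagged.append(key)
                break
    return flagged
```

Platform-scale detection adds fuzzy matching (shingling, MinHash) and account-age or posting-cadence features, but the sliding-window count over near-identical texts is the core of most coordination-detection research.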
Phishing Attacks: Targeted phishing campaigns are launched against political parties, campaign staff, journalists, and civil society organizations. The goal is to obtain internal communications, strategic plans, or personal, compromising data that can be used in the later, more intense phase of the campaign.
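One concrete, teachable defense against such campaigns is automated screening of sender domains for typosquats, since phishing mail aimed at campaign staff frequently arrives from lookalike domains. A minimal sketch, assuming a hypothetical trusted-domain list and an edit-distance threshold chosen for illustration:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Illustrative trusted list -- a real deployment would load the
# organization's own domains and common partner domains.
TRUSTED = ["valasztas.hu", "nvi.hu"]

def is_suspicious(domain: str, max_distance: int = 2) -> bool:
    """A domain close to (but not identical to) a trusted one is a
    likely typosquat, e.g. 'va1asztas.hu' imitating 'valasztas.hu'."""
    return any(0 < levenshtein(domain, t) <= max_distance for t in TRUSTED)
```

Edit distance catches only one phishing pattern; homoglyph (e.g. Cyrillic lookalike) and punycode checks, plus SPF/DKIM/DMARC enforcement, belong in the same training material.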
Final Campaign Stretch (April – May 2026): Actively Influencing Voter Decisions
In the last 3-4 weeks before the elections, the attacks aim for maximum impact, often with means that are difficult or impossible to refute in a short time.
Deepfake Videos/Audio Recordings: The last 48-72 hours before the campaign silence period are the most critical. A believable-looking audio or video recording generated by artificial intelligence—in which a politician makes a scandalous statement or is caught in a compromising situation—published at this time can significantly influence undecided voters. The deepfake audio recording circulated during the campaign silence in the 2023 Slovak elections, which attempted to discredit the president of Progressive Slovakia and a journalist, is a cautionary example of the potential impact of such an operation.
"Leak & Spin" Tactic: The timed leaking of information previously obtained through phishing. The materials may be manipulated, taken out of context, or even real, but their presentation (spin) aims for the maximum discrediting of the political opponent. The Borkai scandal, which broke before the 2019 municipal elections, showed how a scandal based on real events, but purposefully timed and presented, can dominate public discourse and cause political damage.
Closed-Channel Panic-Mongering: In WhatsApp, Telegram, and private Facebook groups, panic-inducing, uncontrollable messages can be spread. These could include "inside information" about election fraud, falsified photos of ballots, or news about "opposition activists preparing for provocation."
Election Day and the Following 72 Hours: Delegitimizing the Process
On election day and immediately after, the goal of the operations is no longer to influence the voters' decision but to undermine trust in the electoral process and its outcome.
DDoS Attack on the Election System: A Distributed Denial of Service (DDoS) attack on the online systems of the National Election Office (NVI) (valasztas.hu). Although the website crash on election day in 2018 was officially classified not as a cyberattack but as an internal configuration error, the incident highlighted the system's vulnerability. Such an incident, even if it does not affect the paper-based processing of votes, can create a sense of chaos and provide fertile ground for rumors of fraud.
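Mitigating such floods starts with per-client rate limiting at the network edge. The token bucket below is a minimal sketch of that building block; the parameter values are illustrative, and production systems implement this in load balancers, CDNs, or scrubbing services rather than application code.

```python
import time

class TokenBucket:
    """Per-client token bucket: each client may burst up to `capacity`
    requests and regains `rate` tokens per second. A single-source flood
    exhausts its bucket quickly and is rejected, while normal visitors
    are unaffected."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.buckets = {}  # client_id -> (tokens, last_refill_timestamp)

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(client_id, (self.capacity, now))
        # Refill proportionally to elapsed time, capped at capacity.
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        allowed = tokens >= 1.0
        self.buckets[client_id] = (tokens - 1.0 if allowed else tokens, now)
        return allowed
```

Per-IP buckets stop single-source floods but not a large botnet, which is why distributed attacks additionally require upstream anycast capacity or traffic scrubbing.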
Fake Exit Poll Data and "Evidence": Manipulated exit poll results spread on social media in the hours before the polls close are aimed at influencing voter turnout. In parallel, posts "supported" by falsified photos and videos about "election fraud" may spread, preparing the ground for a post-election delegitimization campaign.
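Recycled "fraud evidence" photos are often old images reposted with a new caption, and fact-checkers routinely expose them by matching circulating images against archives of known originals. Perceptual hashing is the standard technique; the average-hash sketch below operates on a raw grayscale matrix for self-containment (real pipelines first downscale the image with a library such as Pillow, and the pixel data here is illustrative).

```python
def average_hash(pixels):
    """Average-hash of a small grayscale image (2-D list of 0-255
    values): each pixel maps to 1 if brighter than the image mean,
    else 0. The hash survives mild brightness and compression changes."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(h1, h2):
    """Bit differences between two equal-length hashes; a small
    distance suggests the images share the same source."""
    return sum(a != b for a, b in zip(h1, h2))
```

In practice images are downscaled to, e.g., 8x8 pixels before hashing, and a Hamming distance of only a few bits out of 64 is treated as a probable match against the archive.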
Post-Result Destabilization (1–4 Weeks): Deepening the Loss of Trust
After the election results are announced, the goal of disinformation campaigns is to maintain political instability and to erode long-term trust in democratic institutions.
Maintaining a Dual Narrative: For the losing side, maintaining the "the election was stolen" narrative, building an alternative reality that questions the winner's legitimacy in the long run. The #StopTheSteal movement following the 2020 US election and the storming of the Capitol show the most severe consequences of this strategy. At the same time, an international smear campaign can be launched against the winning side, questioning the fairness of the election and undermining the country's international positions.
Inciting Social Tension: Organizing protests built on the election fraud narrative and radicalizing them through social media, further deepening political polarization.
Prevention and Resilience-Building Strategies
Defending against the outlined complex and multi-level threats requires a coordinated, multi-faceted strategy that simultaneously focuses on institutional preparedness, regulation of technological platforms, and increasing societal resilience. However, the situation in Hungary is complicated by the fact that some of the institutions responsible for defense are themselves active participants in the political struggle. Taking all this into account, the responsibility of the Hungarian government, and the role, effectiveness, and independent operation of the National Election Office (NVI) and the National Cybersecurity Institute (NKI), are more critical than in any previous parliamentary election.
Institutional Level Defense
Real-time Disinformation Monitoring: Effective action against cross-border disinformation campaigns is based on international cooperation. Closer, daily information sharing between the National Cybersecurity Institute (NKI) and the European Union's Rapid Alert System (RAS) is essential. The RAS is an alert system between member states that allows for the early detection of disinformation trends and coordinated responses. The NKI should not only receive this information but also actively contribute to the system's operation by sharing domestic experiences.
Cybersecurity Training for Campaign Staff: Political parties and their campaign staffs are prime targets for phishing and other cyberattacks. The NKI or independent cybersecurity firms should proactively provide training, vulnerability assessments, and consulting to all political actors. This should extend to the use of secure communication channels, preparing staff against phishing, and developing crisis communication plans.
Critical Analysis of the Role of the Sovereignty Protection Office (SzvH): Although the declared goal of the office, established in February 2024, is to protect national sovereignty and uncover foreign interference attempts, its activities to date raise serious concerns. The office's first investigations were not launched against networks proven to spread foreign (e.g., Russian) disinformation, but against government-critical investigative portals (Átlátszó) and civil society organizations (Transparency International) that also operate on foreign (e.g., EU) funds, or against public opinion researchers. This raises the suspicion that the SzvH, in its current form, is not a tool in the fight against disinformation, but a political weapon used to intimidate and stigmatize critics of the government.
Regulatory Environment and Tech Platform Responsibility
The European Union's Digital Services Act (DSA), fully effective since February 2024, imposes new obligations on large online platforms to prevent the spread of disinformation and illegal content. The regulation requires platforms to conduct risk assessments on the impact of their services on elections, make their algorithms more transparent, and provide data access to independent researchers.
However, several factors may weaken the DSA's effectiveness in Hungary. Firstly, state-funded content that spreads disinformation is difficult to classify as "illegal," as it often operates on the borderline of freedom of expression. Secondly, the enforcement of the regulation will be the task of national authorities, the so-called Digital Services Coordinators (DSCs). In a political environment where the independence of state institutions is questioned, there is a risk that the DSC will not act with sufficient force against government disinformation. Critics argue that the implementation of the DSA could be slow and cumbersome against state-level, coordinated disinformation campaigns, which could lead to an "enforcement deficit."
Increasing Social Resilience
Independent Fact-Checking: Workshops like Lakmusz, which operates as part of the Hungarian Digital Media Observatory (HDMO) project, are crucial for the rapid identification and refutation of false information. International research confirms that fact-checking articles and the labeling of false content on social media measurably reduce their resharing. However, the effectiveness of fact-checking in Hungary is limited by several factors: lack of resources, the overwhelming reach of the pro-government media empire, and deep political polarization. In a society where factional identity overrides the acceptance of facts, fact-checking often only reaches already skeptical groups, while dedicated believers dismiss refutations as "opposition propaganda."
Strengthening Digital Literacy: In the long run, the most effective defense is to develop society's critical thinking and media literacy skills. School education programs, like the Digital Thematic Week, and initiatives aimed at adults, such as the Digital Welfare Program, are important steps in this direction. At the same time, research points to significant shortcomings in the digital competencies of the Hungarian population, especially in recognizing fake news. In the future, special attention must be paid to training aimed at recognizing new types of threats posed by generative AI, especially deepfakes. However, the root of the problem is not merely informational but political-psychological: fact-checking and digital literacy are necessary but not sufficient conditions for the fight against disinformation in a country where the level of social trust is extremely low and political identity has become an all-encompassing factor.
Protecting the integrity of elections in this hybrid warfare environment is a complex task that affects society as a whole. Successful defense requires the coordinated, proactive, and continuously adapting cooperation of technical (NKI), regulatory (DSA enforcement), civil (fact-checking workshops), and educational (digital literacy) levels.
The greatest challenge, however, is the deeply polarized political environment, in which the institutions and authorities responsible for defense and information (e.g., the Sovereignty Protection Office) can themselves become part of the political struggle and, instead of acting against disinformation, may use their tools to silence critics of the government. In this context, the role of independent media, civil society, and close cooperation with international partners is elevated, serving as the last bastions of democratic publicity and the fairness of elections.