You can read about the significant progress of our campaign and donate here. We’ve done all of this on a shoestring, relying on unpaid volunteers. But ahead of the Elections Bill we really need your help to achieve our campaign goal of making electoral advertising regulation a reality.
Please donate to our campaign if you feel as passionately as we do about the need for electoral ad regulation.
The long-awaited Elections Bill, now retitled the Representation of the People Bill, has finally been published.
For those of us working on electoral reform, this moment has been highly anticipated. Electoral law has seen almost no meaningful change in two decades, despite the transformation of campaigning through social media, data-driven targeting and now AI-generated content.
There is much to welcome in the Bill. Measures to strengthen enforcement and tackle foreign interference are important. Streamlining imprint enforcement may improve clarity and consistency. Modernising aspects of electoral administration is necessary.
But if the aim is truly to restore trust in our democracy, an enormous gap remains: how political parties communicate with voters in the digital and AI age.
The government’s 2025 strategy paper, Restoring trust in our democracy: Our strategy for modern and secure elections, raised expectations. In particular:
- Sections 117 to 124 committed to strengthening transparency and enforcement of imprint rules
- Section 83 supported the development of a code of conduct for campaigning
These commitments aligned closely with the four recommendations we have called for. Having reviewed the Bill as published, however, we find that none of them is reflected in the legislation.
Below, we set out the four targeted measures that would meaningfully strengthen the Bill. We have worked closely with Full Fact and other stakeholders in civil society to align our recommendations. The cross-party House of Lords Democracy and Digital Technologies Committee also made all of these recommendations in 2020 (excluding the third, which covers the more recent development of deepfakes).
1. Clear and prominent party identification
The strategy paper recognised that campaign material has sometimes been designed to mislead voters by mimicking local newspapers or obscuring party affiliation. It committed to requiring information about party affiliation on campaigning material.
Our recommendation is straightforward.
Our proposal:
Strengthen imprint rules to require that all digital and printed election material prominently displays the name of the political party commissioning or promoting the advert.
The Bill clarifies enforcement of imprint rules and strengthens oversight. But it does not explicitly legislate for clear, prominent party identification in the way the strategy paper suggested.
That distinction matters.
A small-print promoter’s name is not the same as clearly identifying the political party behind a message. Voters should not need to scrutinise fine print to understand who is asking for their vote.
YouGov research has shown that 81% of the public think it is important that “it is clear which political party is responsible for an advert”.
One example: during the 2021 London mayoral election, letters were sent to voters purporting to be from City Hall and TfL that were in fact Conservative election advertisements. While transparency about who is speaking is essential, it is only half the issue. The far greater threat to our democracy, in our view, is what is being said (which is why this needs to be implemented alongside our fourth recommendation below). An imprint at the bottom of the leaflet containing the political party name would not make its provenance clear to the majority of voters who see the advert.
2. A transparent, searchable database of all election adverts
Digital campaigning has transformed political communication, but it has also created opacity.
Our second recommendation is the creation of a transparent, searchable repository of election advertising, including who paid for the advert, which organisation sponsored it, who was targeted, and how much was spent.
Some platforms maintain ad libraries. But these are fragmented, inconsistent and controlled by the platforms themselves. There is no independent, statutory, comprehensive UK repository. RPA was the first to put forward the concept of an independent database of election ads, in its 2018 submission to the DCMS Fake News and Disinformation Inquiry, and the inquiry adopted it in its recommendations, as per our proposal below.
Across the European Union, regulation is now moving in this direction. From March 2026, the EU will launch a centralised repository for political ads, with metadata, targeting information, API access and authentication protocols. It’s a significant step forward in solving the problem of “dark advertising” (ads that are only seen by the audience they are targeted to, and not open to scrutiny). This tactic came into the spotlight during the Brexit referendum and continues to challenge transparency in online campaigning.
However, the Bill does not currently provide for such a repository.
Without comprehensive transparency, micro-targeted dark ads remain effectively invisible to journalists, researchers and voters. In an era of data-driven persuasion, transparency is foundational to trust.
Our proposal:
“Political advertising items should be publicly accessible in a searchable repository – who is paying for the ads, which organisations are sponsoring the ads, who is being targeted by the ads – so that members of the public can understand the behaviour of individual advertisers. It should be run independently of the advertising industry and of political parties.”
3. AI, deepfakes and digital impersonation
Perhaps the most striking omission in the Bill is its silence on AI and synthetic media.
The Bill contains no explicit reference to AI, deepfakes or synthetic impersonation.
This is not a hypothetical risk. AI-generated deepfakes enable realistic impersonations of political candidates, including fabricated audio or video of them saying things they never said. As we outlined in our LabourList article, AI deception is a threat to democracy – it’s time the law caught up, these tools can distort elections, confuse voters and undermine trust in democratic processes. Conservative MP George Freeman spoke at our recent House of Commons event about his own experience of this issue, and we believe he will support the recommendation. This targeted measure focuses only on intentional voter deception – not creativity, criticism or humour.
Our proposal:
It should be illegal to create or distribute digital content that falsely purports to be a political candidate (or claims to be speaking for them), with the intent to deceive voters. Alongside that, there should be a clear exemption for parody, satire or artistic expression.
The Bill rightly addresses foreign interference and funding risks. But domestic disinformation, particularly AI-enabled impersonation, is an emerging vulnerability that also requires attention.
If we are future-proofing our elections against external manipulation, we must ensure they are resilient against digital deception at home.
4. Introduce content regulation for misleading election advertising
The government’s strategy paper mentioned the development of a new code of conduct for campaigning.
The Bill does not establish a statutory mechanism to implement or enforce such a code.
If a code is voluntary, practical questions arise. Who adjudicates alleged breaches? What sanctions apply? Is compliance mandatory? What happens if a party ignores it?
A code can help build trust, but only if it is accompanied by clarity and accountability.
Elections are increasingly fought on wafer-thin margins. In contests decided by dozens or hundreds of votes, misleading advertising can be decisive.
Let’s look at just a few of these tight contests:
- Runcorn by-election – Reform beat Labour by 6 votes (0.02% vote share)
- Doncaster mayoral election – Labour held off Reform by 698 votes (1% vote share)
- North Tyneside mayoral election – Labour scraped by with a margin of 444 votes (0.8% vote share)
- West of England mayoral election – Labour edged Reform by 5,945 votes (2.9% vote share)
- The Gorton and Denton by-election looks set to join these (written on the day of polling).
In contests this close, advertising matters. And not just any advertising – but the kind that is deliberately designed to mislead, and some of which hides party affiliations. You can read a good case study of attempts to distort democracy from the North Tyneside mayoral election here.
RPA tested appetite for an election advertising code at the 2024 mayoral elections.
Six London mayoral candidates signed up, including the Green and Liberal Democrat candidates and Labour’s Sadiq Khan. They agreed to observe the RPA pledge that their advertising would be accurate, and rapidly corrected if not.
Other notable supporters were The Green Party of England and Wales, Plaid Cymru, Alliance, Andy Burnham, Tracy Brabin and Neil Kinnock.
You can read the code here.
RPA has also twice trialled an election advertising review panel to demonstrate how content regulation could work practically in a busy election period, including at the last General Election.
Research over several years has shown overwhelming support for the regulation of misleading ads. For example, our 2024 Opinium research found that 76% of the UK public would “support it being a requirement in the UK that factual claims in election adverts must be accurate”. Only 4% would oppose such a measure.
Examples of international precedents:
- New Zealand – long-established election ad regulation supported by all major parties.
- Australia – two jurisdictions (the Australian Capital Territory and South Australia) already have electoral ad regulation. MP Zali Steggall has also introduced two bills nationally to prohibit misleading political ads.
The unanimous recommendation of the cross-party Democracy and Digital Technologies Committee in 2020 should be implemented; we include it verbatim below.
Our proposal:
“The relevant experts in the ASA, the Electoral Commission, Ofcom and the UK Statistics Authority should co-operate through a regulatory committee on political advertising. Political parties should work with these regulators to develop a code of practice for political advertising, along with appropriate sanctions, that restricts fundamentally inaccurate advertising during a parliamentary or mayoral election, or referendum. This regulatory committee should adjudicate breaches of this code.”
Restoring trust in politics is not just about electoral mechanics. Voters deserve an election advertising environment that is transparent, truthful and fit for the modern information landscape. The Representation of the People Bill offers a vital opportunity to bring the UK in line with public expectations, international best practice and the standards already applied to commercial advertisers.
We urge MPs from all parties to support these practical reforms to help safeguard trust in our democracy. These are proportionate protections for voters in a modern democracy.
In the coming weeks, we will engage constructively with MPs and peers across parties to explore targeted amendments to the Bill.
About Reform Political Advertising
Reform Political Advertising is a not-for-profit, politically neutral organisation run by unpaid volunteers campaigning for accuracy and transparency in election advertising.
It was founded in 2018 by Alex Tait and Benedict Pringle, who both have significant experience in the marketing and advertising industry. It is chaired by Lord David Puttnam, who chaired the Democracy and Digital Technologies Committee in 2020.
You can find out more about the campaign on our website: reformpoliticaladvertising.org.