DPIA advice for UK organisations

Data Protection Impact Assessments

Expert DPIA advice for UK organisations processing personal data

A data protection impact assessment is a risk management process that organisations must conduct when processing is likely to result in high risk to individuals. We help organisations move past tick-box compliance to genuine risk identification and mitigation. DPIAs are especially important now because the Data Use and Access Act 2025 allows the circumstances in which they are required to be redefined by secondary legislation, and because AI deployment has driven demand for proper assessment of novel processing methods. Done well, a DPIA protects your business by identifying problems early and building accountability into your project governance before costly mistakes happen.

Where the DPIA covers an AI or automated decision-making system, see our hub post on UK AI Regulation: What the Law Actually Says for the wider regulatory context and the new DUAA Articles 22A to 22D.

When a DPIA becomes necessary

A DPIA is mandatory under Article 35 UK GDPR where a type of processing is likely to result in high risk to the rights and freedoms of individuals. Article 35(3) names three categories of high-risk processing: systematic evaluation of personal aspects (including profiling), large-scale processing of special category data, and systematic monitoring of public areas. Beyond these statutory triggers, the ICO has published a list of additional types of processing that typically warrant a DPIA, including novel technologies, combined data processing from multiple sources, and any processing of children’s data for profiling or automated decision-making.

In practice, DPIAs become necessary in several common situations. When you deploy new technology, especially in areas where you have limited prior experience, a DPIA helps you understand what could go wrong. Artificial intelligence and automated decision-making systems almost always require assessment, whether you are building them yourself or using third-party tools. Processing personal data across borders, or sharing data with processors in different jurisdictions, triggers DPIA requirements because of the complexity of regulatory compliance and data subject rights. Material changes to existing processing such as expanding its scope, using new categories of data, or changing the way you analyse it should also prompt reassessment.

The Data Use and Access Act 2025 introduced a material change: the Secretary of State now has power under Article 35 to specify by secondary legislation circumstances in which a DPIA is or is not required, supplementing the existing “high risk” threshold. This means the legal position is evolving, and the circumstances that trigger DPIAs may narrow or broaden depending on future regulations. Until new secondary legislation is published, organisations should apply the existing Article 35(1) threshold, the Article 35(3) categories and the ICO’s Article 35(4) list.


Why DPIAs matter now

DPIAs are particularly important in the current regulatory environment. The ICO now treats the absence of a proper DPIA as an aggravating factor in enforcement proceedings. When the ICO investigates data breaches or other compliance failures, one of the first things it examines is whether the organisation conducted a DPIA before processing began. A thorough, contemporaneous DPIA demonstrates that you took data protection seriously as a governance matter, not merely a tick-box exercise. Conversely, a missing or inadequate DPIA is viewed as evidence of negligence in accountability.

AI deployment has driven a sharp increase in demand for DPIAs, particularly those assessing automated decision-making systems. Organisations are increasingly running novel algorithms and predictive models, and regulators expect proper assessment of these before they go live. The EU AI Act, which applies to some UK organisations that place AI systems on the EU market or whose systems’ outputs are used in the EU, requires conformity assessments of high-risk AI systems, and these overlap significantly with DPIA requirements. DPIAs are also expected to become a key element of future ICO enforcement priorities as the Secretary of State’s powers under the Data Use and Access Act 2025 are clarified through secondary legislation.


Where organisations over-engineer DPIAs

A common failure is treating the DPIA as a compliance document rather than a genuine risk management tool. Many organisations commission elaborate 50- or 60-page documents that meticulously describe the processing, list generic risks with their stock mitigations, and conclude that all risks are “acceptable” or “mitigated”. These DPIAs satisfy no one: they do not prevent problems, they do not drive better decision-making, and they do not serve the business purpose of identifying what might actually go wrong.

Over-engineering takes several forms:

  • Template-driven DPIAs that copy the same language across multiple projects, without questioning whether the template’s assumptions apply.
  • DPO consultation that becomes a box-ticking exercise, where the DPO reviews a finished draft rather than challenging the processing design itself during development.
  • DPIAs completed after processing has already started, defeating their purpose as a preventive tool.
  • DPIAs that are never reviewed or updated even when processing changes materially.
  • DPIAs that identify residual high risk but fail to reach any actual conclusion about whether the processing should proceed, or whether the ICO should be consulted under Article 36.

The under-engineering problem is the inverse: some organisations skip DPIAs altogether because they seem burdensome, and then discover during an ICO audit or after an incident that they should have completed one long ago.


What a good DPIA looks like

A good DPIA is proportionate to the actual risk. For straightforward, low-risk processing such as routine data management for existing business purposes, a DPIA can be brief and focused. For complex or novel processing, including AI systems or large-scale profiling, it needs genuine depth and rigorous analysis. The key elements of any DPIA are these.

Accurate description of the processing. Start with a clear account of what data you collect, from where, how you use it, who has access to it, and how long you retain it. This is not a privacy notice. It should be technical, specific, and free of omissions. If your processing has changed or is more extensive than people assume, that should be evident at this stage.

Honest risk identification. Name the risks that are genuinely possible given your processing, your data subjects, and your control environment. Common risks include unauthorised access, loss or corruption of data, discrimination through biased algorithms, rights of individuals not being respected, and unintended secondary uses. Do not list risks and then immediately state they are “mitigated” without evidence. Separate the risks that exist before you add controls from the risks that remain after you add them.

Proportionate mitigation measures. Describe what you will do to reduce risk. Some measures are technical such as encryption or access controls. Others are organisational such as staff training or contractual terms with processors. The measures should be specific: not “implement security controls” but “implement encryption at rest using AES-256, and enforce multi-factor authentication for administrative access”.

Clear residual risk conclusion. After mitigation, state what level of risk remains. Low risk means you can proceed. Moderate risk means you should monitor and reassess regularly. High risk means you should consider whether the processing should proceed at all, whether different controls might reduce risk further, or whether you should consult the ICO under Article 36.

Genuine decision on Article 36 consultation. Article 36 requires you to consult the ICO where a DPIA indicates processing is likely to result in high risk that cannot be adequately mitigated. Many organisations treat this as optional. It is not. If your DPIA concludes that residual risk is high and that you cannot reduce it further, you must consult the ICO before going live. This is a legal requirement, not a recommendation.

We approach DPIAs as working documents integrated into project governance. A DPIA should be started early, revised as the project develops, and refreshed if material changes occur. It should inform architecture and controls decisions, not simply document them after the fact. For organisations with ongoing data processing, DPIAs should be reviewed at least annually and updated if business or technical context changes materially.


When to instruct specialist DPIA support

Many organisations have capable internal Data Protection Officers who can handle routine DPIAs efficiently. Specialist DPIA support should be engaged in several contexts. When you are deploying artificial intelligence or automated decision-making systems, particularly those that profile individuals or make decisions affecting them, specialist input helps ensure you have properly assessed algorithmic risk, discrimination potential, and transparency obligations.

Novel technologies such as synthetic data, advanced analytics platforms, or edge computing introduce processing patterns that most organisations have not encountered before. A specialist can help translate technical complexity into risk terms and identify control gaps. Processing involving children’s data carries heightened risk and tighter regulatory requirements. Cross-border processing, particularly where data flows between the UK and the EU or beyond, requires assessment of dual compliance obligations and adequacy decisions.

Situations where a DPIA indicates high residual risk benefit from specialist advice on remediation options and on whether Article 36 consultation is appropriate. A fresh external perspective also reduces confirmation bias and ensures the DPIA is genuinely challenging rather than confirmatory.


Frequently asked questions about data protection impact assessments

Do we need a DPIA for every processing activity we undertake?

No. DPIAs are required only for processing that is likely to result in high risk to individuals. Routine processing such as employee records management, ordinary email communications, or customer billing does not normally require a DPIA unless you introduce elements that increase risk, such as profiling employees or cross-border data transfers.

Can we use a template for every DPIA, or does each one need to be tailored?

Templates can provide useful structure, but each DPIA must be genuinely tailored to the specific processing and context. A template that works for payroll processing will not adequately address the risks of a new AI system. We recommend templates as starting frameworks, not standard documents, and we always rewrite substantial sections to reflect the actual processing and actual risks in each case.

What happens if we complete a DPIA and find that residual risk is high?

You should not ignore high residual risk. Your options are to redesign the processing to reduce risk, add additional controls, or consult the ICO under Article 36 before proceeding. Consultation does not mean the ICO will block the processing, but it gives the regulator a chance to review your assessment and advise on whether they believe risk can be adequately mitigated. Proceeding without consultation when your own DPIA suggests high risk is a material compliance failure.

How often should we review and update an existing DPIA?

At minimum, you should review a DPIA annually and update it if the processing, the technologies you use, or your control environment change materially. If you introduce a new data processor, expand the scope of the processing, or adopt new tools such as analytics platforms, the DPIA should be refreshed. DPIAs are not one-off compliance events: they are part of ongoing governance.

Who should be involved in a DPIA?

A DPIA should involve the business owner who is responsible for the processing, the technical team who will implement it, your Data Protection Officer or internal data privacy contact, and any relevant external advisors. The ICO expects you to consult your DPO, but consultation is not a box-tick. The DPO should be genuinely involved in challenging the processing design and the risk assessment, not simply reviewing a finished document.

What is the difference between a DPIA and an ICO prior consultation under Article 36?

A DPIA is a document and process that you must complete before processing begins if processing is likely to result in high risk. An Article 36 consultation is a request to the ICO for advice when a DPIA indicates that high risk remains despite mitigation efforts. Prior consultation does not delay implementation indefinitely. The ICO must provide written advice within eight weeks, though this can be extended by a further six weeks where the processing is complex. You should initiate a DPIA early so that you have time for prior consultation if the DPIA indicates high risk.


Representative experience

Recent and representative matters include:

  • Prepared DPIAs for an AI-enabled customer analytics platform processing behavioural data, usage patterns and inferred preferences across a telecoms customer base.
  • Advised a connected vehicle manufacturer on the Article 35 DPIA requirement for processing vehicle telemetry, geolocation and driver behaviour data, including consultation with the ICO on high residual risks.
  • Conducted a DPIA for a financial services firm deploying automated creditworthiness scoring using alternative data sources, assessing the Article 22 implications of solely automated decision-making.
  • Reviewed and updated existing DPIAs for a health-tech company following changes to its data processing operations, ensuring continued compliance with the ICO’s screening criteria.
  • Advised a public sector body on the interaction between the DPIA obligation and the Equality Act 2010 public sector equality duty in the context of algorithmic decision-making.

Rob Bratby has conducted and advised on DPIAs for telecoms, payments and technology businesses, including in his General Counsel roles. Bratby Law is recognised by Lexology as a Global Elite Thought Leader for data protection.


Independent directory rankings

Our specialist expertise is recognised in major independent legal directories:

  • Chambers & Partners: Rob Bratby is ranked as a Band 2 lawyer in the UK Guide 2026 in the “Telecommunications” category: Chambers
  • The Legal 500: Rob Bratby is listed as a “Leading Partner – Telecoms” in London (TMT – IT & Telecoms): The Legal 500
  • Lexology: Rob Bratby is featured on Lexology’s expert profiles as a Global Elite Thought Leader for data: Lexology

See our TelXL case study for an example of how we advise on data protection impact assessments for AI-enabled products.

