
Data Protection Impact Assessments
Expert DPIA advice for UK organisations processing personal data
A data protection impact assessment (DPIA) is a structured process for identifying and minimising the data protection risks of a project or processing activity. Under Article 35 of the UK GDPR, a DPIA is mandatory before any processing that is likely to result in a high risk to individuals. The EU GDPR imposes the same obligation, and the EU AI Act creates additional requirements for organisations deploying high-risk AI systems. Organisations operating across the UK and EU must satisfy both sets of DPIA requirements, which are diverging as each regime develops independently.
Bratby Law provides DPIA advice for UK and international organisations. We advise controllers and processors on when a DPIA is required under UK and EU law, how to conduct assessments that satisfy the ICO and EU supervisory authorities, and how to embed DPIA processes into product development, procurement and regulatory change programmes. Our practice covers the full DPIA lifecycle from screening through to ICO prior consultation, with particular expertise in AI systems, telecoms and payments.
The regulatory framework for DPIAs
A data protection impact assessment is required under Article 35(1) of the UK GDPR where processing is likely to result in a high risk to the rights and freedoms of individuals. Article 35(3) identifies three categories of processing that always require a DPIA: systematic and extensive profiling with significant effects, large-scale processing of special category data, and systematic monitoring of publicly accessible areas. The ICO’s published list under Article 35(4) adds ten further categories. Where a DPIA identifies high residual risk, the controller must consult the ICO under Article 36 before processing begins.
The ICO has published a list of processing operations that require a DPIA, supplementing the Article 35(3) categories. These include processing involving innovative technology, decisions about access to services or opportunities, large-scale profiling, biometric data processing, genetic data processing, data matching or combining datasets, invisible processing, tracking of individuals, targeting of children or other vulnerable individuals, and processing that risks physical harm. Where two or more of the ICO’s criteria are met, a DPIA is almost certainly required.
Article 36 imposes a further obligation: where a DPIA indicates that processing would result in a high risk that the controller cannot mitigate, the controller must consult the ICO before the processing begins. The ICO has eight weeks (extendable by six weeks for complex cases) to respond, and may exercise its corrective powers under Article 58(2), including ordering the controller not to proceed with the processing.
When is a DPIA required?
The threshold is “likely to result in a high risk”. In practice, the ICO expects organisations to conduct a DPIA in any of the following situations:
- Deploying AI or machine learning systems that process personal data, particularly where outputs affect individuals (credit scoring, recruitment screening, automated content moderation)
- Launching a new product or service that collects personal data at scale, or processes it in ways not anticipated at the point of collection
- Implementing employee monitoring systems, including email scanning, location tracking, or productivity analytics
- Processing children’s data, health data, or other special category data for new purposes
- Sharing personal data with new third parties or transferring data to jurisdictions without adequate protection
- Combining datasets from different sources to build profiles or derive new insights about individuals
- Using biometric data for identification or verification, including facial recognition and voice analysis
Section 64 of the Data Protection Act 2018 supplements the UK GDPR by requiring DPIAs for certain law enforcement processing under Part 3 of that Act. Controllers should also consider conducting voluntary DPIAs for lower-risk processing where a structured assessment would support accountability obligations under Article 5(2).
What a DPIA must contain
Article 35(7) sets out the minimum content requirements. A compliant DPIA must include a systematic description of the processing operations and their purposes, including the legitimate interest pursued where applicable. It must assess the necessity and proportionality of the processing in relation to the purposes, and contain an assessment of the risks to the rights and freedoms of data subjects. Finally, it must set out the measures envisaged to address those risks, including safeguards, security measures, and mechanisms to ensure the protection of personal data.
The ICO expects the DPIA to go beyond a compliance checklist. A good DPIA describes the information flows in detail, identifies specific risks rather than generic categories, records the decisions made about risk mitigation, and explains why remaining residual risks are acceptable. Where the DPIA identifies risks that cannot be mitigated to an acceptable level, it should record the decision to proceed to prior consultation under Article 36 or to redesign the processing.
How Bratby Law helps with DPIAs
Our DPIA advisory services cover the full lifecycle from screening through to ICO prior consultation. We advise organisations across telecoms, payments, and technology on data protection impact assessments for new products, regulatory change programmes, and M&A transactions.
- DPIA screening and scoping: we assess whether a DPIA is required for your processing activity and define the scope, including identifying the processing operations, data flows, and stakeholders to consult
- DPIA drafting and review: we prepare DPIAs that meet Article 35(7) requirements, or review and strengthen existing assessments prepared by your internal team or DPO
- AI and automated decision-making assessments: specialist DPIA advice for organisations deploying AI systems, addressing algorithmic fairness, explainability, and the interaction between DPIA obligations and the AI and Automated Decision-Making framework
- Risk identification and mitigation: structured risk analysis using the ICO’s risk framework, with practical recommendations for technical and organisational measures to reduce residual risk
- Prior consultation support: where a DPIA concludes that high risks cannot be mitigated, we prepare and manage the Article 36 prior consultation process with the ICO
- DPIA integration into business processes: we help organisations embed DPIA screening into procurement, product development, and change management processes so that assessments are triggered at the right point in the project lifecycle
- Regulatory change DPIAs: assessments for processing changes driven by new legislation, including the Data (Use and Access) Act 2025 and its amendments to the UK GDPR accountability framework
- Dual UK/EU DPIA compliance: for organisations processing personal data across both jurisdictions, we prepare DPIAs that satisfy ICO and EDPB requirements simultaneously, managing differences in screening criteria, prior consultation processes and AI Act interaction
EU DPIA requirements and EDPB guidance
Organisations operating across the UK and EU must comply with DPIA obligations under both regimes. The EU GDPR imposes the same Article 35 obligation, but the European Data Protection Board (EDPB) and individual EU supervisory authorities have issued guidance that differs in emphasis from the ICO’s approach. The EDPB Guidelines on DPIAs (WP248 rev.01, adopted by the Article 29 Working Party and endorsed by the EDPB) set out nine criteria for identifying processing that requires a DPIA. Where processing meets two or more criteria, a DPIA is presumed necessary. Each EU member state supervisory authority has also published its own list of processing operations requiring a DPIA under Article 35(4), and these lists vary between jurisdictions.
The EU approach to DPIAs is also evolving alongside the EU AI Act. Article 26(9) of the AI Act requires deployers of high-risk AI systems to use the information provided under Article 13 to carry out a DPIA where required under the GDPR. The EDPB and EU AI Office are expected to issue joint guidance on the interaction between AI Act fundamental rights impact assessments and GDPR DPIAs. There is no equivalent UK requirement, since the UK has not adopted AI-specific legislation with DPIA interaction provisions.
Dual UK/EU DPIA compliance
Bratby Law advises organisations that process personal data across both the UK and EU on conducting DPIAs that satisfy both regimes. A single DPIA can cover both UK and EU processing, but must address the requirements of both the ICO and the relevant EU supervisory authority. Key differences to manage include the screening criteria (ICO list vs EDPB criteria and national SA lists), the prior consultation thresholds and processes (ICO under UK GDPR Article 36 vs the relevant EU supervisory authority under EU GDPR Article 36), and the interaction with the EU AI Act for organisations deploying AI systems in EU markets.
For organisations subject to both regimes, we recommend a DPIA methodology that maps to the ICO and EDPB frameworks simultaneously, applying the stricter standard where they diverge. This avoids duplication while ensuring compliance with both. Where the DPIA identifies different risk profiles under UK and EU law (for example, because the DUAA’s recognised legitimate interest basis reduces the risk assessment under UK law but not under EU law), the DPIA should document both analyses and the measures adopted for each jurisdiction. See our UK/EU Data Protection Divergence page for the broader context of dual compliance.
ICO enforcement and DPIAs
The ICO treats failure to conduct a required DPIA as a standalone breach of the UK GDPR, separate from any substantive data protection failings the assessment might have identified. In its enforcement practice, the ICO has issued reprimands and enforcement notices where organisations deployed new processing activities without conducting the required assessment. The penalty framework under Article 83(4)(a) permits fines of up to £8.7 million or 2% of annual worldwide turnover, whichever is higher, for DPIA failures, even where no data breach has occurred.
The ICO’s 2024 audit framework specifically examines DPIA processes as part of its accountability assessments. Organisations that can demonstrate a systematic approach to DPIA screening, a documented assessment methodology, and evidence that DPIA outcomes have been acted upon are better positioned in regulatory engagement. Conversely, organisations that treat DPIAs as a retrospective compliance exercise, completing them after processing has begun, face both regulatory criticism and the practical difficulty of retrofitting safeguards into live systems.
The Data (Use and Access) Act 2025 and DPIAs
The Data (Use and Access) Act 2025 (DUAA) introduces changes to the UK GDPR accountability framework that affect DPIA practice. The Act amends Article 35 to give the Secretary of State power to specify circumstances in which a DPIA is or is not required, supplementing the existing “high risk” threshold. It also modifies the prior consultation mechanism under Article 36, allowing the ICO to issue guidance on when prior consultation is expected rather than relying solely on the statutory trigger.
The DUAA also introduces a new “recognised legitimate interest” basis under Article 6, which may reduce the scope of processing that requires a DPIA where the legitimate interest is on the Secretary of State’s approved list. However, the interaction between recognised legitimate interests and DPIA requirements is not yet settled. Organisations should continue to apply existing DPIA screening criteria until the ICO publishes updated guidance on the new provisions. Bratby Law monitors these developments and advises clients on how the evolving accountability framework affects their DPIA obligations.
DPIAs in telecoms, payments, and technology
Data protection impact assessments arise frequently in the sectors where Bratby Law has deep regulatory expertise. Telecoms operators processing communications data and location data at scale face DPIA obligations when launching new services, deploying analytics platforms, or responding to lawful intercept requirements. Payment service providers processing transaction data must assess the data protection implications of open banking integrations, fraud detection systems, and the interaction between UK GDPR requirements and payments regulation obligations under the Payment Services Regulations 2017.
Technology companies building AI-enabled products face particular DPIA challenges. The assessment must address not only the personal data used to train and operate the model, but also the downstream effects of automated decisions on individuals. Where AI processing involves profiling or automated decision-making with legal or similarly significant effects, the DPIA intersects with the rights under Article 22 and the transparency obligations under Articles 13 and 14. Our AI and Automated Decision-Making practice works alongside our DPIA advisory to address these overlapping requirements.
Frequently asked questions about DPIAs
When is a DPIA legally required?
A DPIA is required under Article 35 of the UK GDPR before any processing that is likely to result in a high risk to individuals. The ICO has published specific criteria including large-scale profiling, systematic monitoring, innovative technology, and processing of special category data. If two or more of the ICO’s criteria apply, a DPIA is almost certainly required. Failing to conduct a required DPIA is a standalone regulatory breach.
What happens if we do not conduct a required DPIA?
Failure to conduct a required DPIA is a breach of Article 35, enforceable independently of any data breach or other substantive failing. The ICO can issue enforcement notices requiring the organisation to stop the processing, and fines under Article 83(4)(a) can reach £8.7 million or 2% of annual worldwide turnover, whichever is higher. The ICO has issued reprimands for DPIA failures even where no personal data was compromised.
Can we conduct a DPIA after processing has started?
The UK GDPR requires a DPIA to be carried out before processing begins. Retrospective DPIAs do not satisfy the Article 35 obligation, although they may help demonstrate accountability going forward. If processing is already underway without a required DPIA, the priority is to conduct the assessment promptly, identify and address any high risks, and document the steps taken. The ICO takes a more favourable view of organisations that self-identify the gap and act to remedy it.
Do we need a DPIA for every AI project?
Not every AI project requires a DPIA, but most will. AI systems that process personal data typically involve innovative technology, profiling, or automated decision-making, each of which is an ICO DPIA trigger. The question is whether the specific AI application is likely to result in a high risk to individuals. A DPIA screening assessment at the project scoping stage is the most efficient way to determine whether a full DPIA is needed.
What is prior consultation with the ICO?
Prior consultation under Article 36 is required where a DPIA concludes that the processing would result in a high risk that the controller cannot mitigate through reasonable measures. The controller must submit the DPIA to the ICO before processing begins. The ICO has eight weeks to respond and may require changes to the processing or prohibit it entirely. Prior consultation is rare in practice but is a mandatory step where the residual risk threshold is met.
How does the Data (Use and Access) Act 2025 affect DPIA requirements?
The DUAA gives the Secretary of State power to specify circumstances where a DPIA is or is not required, supplementing the existing Article 35 high-risk threshold. It also introduces recognised legitimate interests that may reduce the scope of processing requiring a DPIA. The detailed impact depends on secondary legislation and updated ICO guidance, which are expected during 2026. Organisations should continue to apply current DPIA screening criteria until the new framework is in force.
Representative experience
Recent and representative matters include:
- Prepared DPIAs for an AI-enabled customer analytics platform processing behavioural data, usage patterns and inferred preferences across a telecoms customer base.
- Advised a connected vehicle manufacturer on the Article 35 DPIA requirement for processing vehicle telemetry, geolocation and driver behaviour data, including consultation with the ICO on high residual risks.
- Conducted a DPIA for a financial services firm deploying automated creditworthiness scoring using alternative data sources, assessing the Article 22 implications of solely automated decision-making.
- Reviewed and updated existing DPIAs for a health-tech company following changes to its data processing operations, ensuring continued compliance with the ICO’s screening criteria.
- Advised a public sector body on the interaction between the DPIA obligation and the Equality Act 2010 public sector equality duty in the context of algorithmic decision-making.
Related data protection pages
See also our other data protection pages:
- Data Protection (pillar page)
- UK GDPR and Regulatory Compliance
- AI and Automated Decision-Making
- Sector-Specific Data Protection
- Data Governance, Transfers and Accountability
- Data Breach Response
- Privacy and Electronic Communications (PECR)
- UK/EU Data Protection Divergence
