UK Compute Roadmap and data protection: AI infrastructure obligations

In short: Organisations building or procuring AI compute infrastructure in the UK face four parallel data protection obligations that the Compute Roadmap does not address: a data protection impact assessment under UK GDPR Article 35; the automated decision-making regime as amended by Data (Use and Access) Act 2025 section 80, in force from 5 February 2026; the international transfer rules under Articles 44 to 49; and Article 28 processor agreements with cloud providers.
The UK is investing in compute. The Department for Science, Innovation and Technology’s UK Compute Roadmap envisages a £2 billion public investment in AI compute infrastructure by 2030, with national supercomputing centres, dedicated training clusters, and a planned compute reservation scheme. What the Roadmap does not cover is the data protection regime attaching to organisations that build, procure or operate that infrastructure. The obligations are not new. The Data (Use and Access) Act 2025 has reshaped them since 5 February 2026, and the ICO’s AI audit framework is live. This post sets out what GCs and DPOs need to know.
Regulatory background for UK compute
AI compute infrastructure typically processes personal data on a scale that engages the full UK GDPR framework. The applicable framework comprises the UK GDPR, the Data Protection Act 2018, and the Data (Use and Access) Act 2025 (DUAA). The DUAA Commencement No. 6 Regulations 2026 brought the bulk of Part 5 (data protection) into force on 5 February 2026. That commencement included the amendments to the automated decision-making regime in section 80 and Schedule 6, the recognised legitimate interests provisions in Schedule 4, and the international transfer reforms in section 85. A separate commencement of section 103 and Schedule 10 (the data subject complaints regime) is set for 19 June 2026.
The statute is the source of obligation. The Information Commissioner’s Office (ICO) specifies the method of compliance. The ICO published its guidance on AI and data protection in 2023 and updated it in 2024. The ICO’s AI and data protection audit framework, operative since 2023, sets the assurance lens against which large-scale AI deployments are tested. The framework feeds into the ICO’s enforcement powers, which the DUAA strengthened from 5 February 2026.
DPIA obligations under UK GDPR Article 35
Article 35 of the UK GDPR requires a data protection impact assessment (DPIA) before processing personal data on a large scale where the processing is likely to result in a high risk to the rights and freedoms of natural persons. Large-scale AI training and inference processing typically falls within the test, as the ICO’s AI guidance confirms.
“Large scale” is not defined by record count alone. The ICO assesses scale by reference to the volume of personal data, the geographic area covered, the duration of processing, and the number of data subjects affected. AI training datasets typically meet every limb. Where the processing involves special category data, or systematic monitoring of public spaces, the DPIA is mandatory and must be completed before processing begins. A procurement timeline that compresses governance to the back of the schedule risks ICO prior consultation under Article 36 and, in the worst case, an order to suspend processing.
The automated decision-making regime under DUAA 2025
Section 80 and Schedule 6 of the DUAA amended Article 22 of the UK GDPR with effect from 5 February 2026. Decisions taken before that date remain governed by Article 22(3) of the UK GDPR or sections 14 and 50(2) of the Data Protection Act 2018, under the saving provision in regulation 5 of SI 2026/82.
The DUAA removed the prior requirement at Article 22(2) that the controller establish a qualifying lawful basis (consent, contract, or law) before conducting solely automated decision-making producing legal effects or similarly significant effects on the individual. The new regime substitutes specific safeguards: disclosure of the automated decision-making, the right to contest a decision and obtain human review, and information about the logic involved. The position for special category data is more restrictive: a qualifying lawful basis is still required. AI inference deployments that produce outputs affecting individuals (credit, employment, insurance, immigration, education) sit within the regime. Procuring organisations must flow these obligations down to processors. Our analysis of the automated decision-making obligations sets out the safeguards in detail.
International transfers where compute sits abroad
Chapter V of the UK GDPR (Articles 44 to 49) governs transfers of personal data outside the UK. Where AI compute infrastructure sits in the EU, the US, or any other third country, personal data processed on that infrastructure is being transferred for the purposes of Chapter V. The Compute Roadmap’s framing of “sovereign compute” does not displace those rules.
A transfer to the EEA can rely on the UK’s adequacy regulations. A transfer to the US can use the UK Extension to the EU-US Data Privacy Framework where the recipient is certified. Other third countries require an International Data Transfer Agreement (IDTA) or, for more complex arrangements, Binding Corporate Rules. Section 85 of the DUAA recalibrated the assessment standard for adequacy regulations, giving ministers more latitude but not relieving organisations of the obligation to identify and document the transfer mechanism for each compute deployment.
Article 28 processor agreements with cloud providers
Article 28 of the UK GDPR requires controllers to engage processors only under a written contract that meets prescribed minimum terms. Where AI compute is procured from AWS, Microsoft Azure, Google Cloud, or any sovereign UK operator, the cloud provider is a processor and the Article 28 requirements apply in full.
The minimum terms include processing only on documented instructions, confidentiality, Article 32 security measures, sub-processor controls, audit rights, breach notification, and return or deletion on termination. Cloud providers typically offer standard data processing addenda. The points on which a standard addendum often falls short, in our experience, are scope of permitted sub-processing, jurisdictional carve-outs, audit rights, and deletion on termination. A standard addendum is the start of the negotiation, not the end.
Engagement with the ICO AI audit framework
The ICO’s AI and data protection audit framework structures the regulator’s review of AI deployments against UK GDPR principles, including accountability, lawful basis, transparency, security, and rights-respecting design. Organisations operating large-scale AI compute infrastructure can map their deployments to the framework’s areas as a self-assessment exercise.
The framework distinguishes between governance (board oversight, DPIAs, risk management) and operational controls (training data quality, model monitoring, output explainability, security testing). Where an AI deployment is novel or large-scale, the ICO’s regulatory sandbox is open to organisations that want structured engagement.
The four obligations: a comparison
| Obligation | UK GDPR provision | DUAA 2025 effect | Maximum penalty |
|---|---|---|---|
| DPIA before high-risk processing | Article 35 | Unchanged | £17.5m or 4% global turnover |
| Automated decision-making safeguards | Article 22 (as amended) | Section 80, Schedule 6 (in force 5 February 2026) | £17.5m or 4% global turnover |
| International transfer mechanism | Articles 44 to 49 | Section 85 (assessment standard recalibrated) | £17.5m or 4% global turnover |
| Article 28 processor agreement | Article 28 | Unchanged | £17.5m or 4% global turnover |
Implications for organisations procuring AI compute
Before procuring AI compute infrastructure, GCs and DPOs should sign off against a governance file that maps each obligation to a specific control. The four obligations above are cumulative: a cloud-procured AI training cluster typically engages all four. The workflow that works in practice anchors data governance at the front of the timeline: the DPIA is started when the compute requirement is scoped, not when the contract is signed; the transfer mechanism is identified before the deployment region is chosen; and the processor agreement is reviewed against an Article 28 checklist before signature.
The cost of getting this wrong divides into two heads. The first is direct enforcement risk: the maximum penalty for breach of the UK GDPR’s substantive obligations is the higher of £17.5 million or 4% of global annual turnover, with the ICO’s investigatory powers strengthened by the DUAA from 5 February 2026. The second is operational: an order to suspend processing, even temporarily, on a multi-million-pound AI training cluster is a procurement and reputational event few boards have planned for. Our guidance on AI and data governance advice sets out where Bratby Law engages, and our analysis of the Sovereign AI Fund covers the investment context.
Viewpoint
The Compute Roadmap is the visible part of UK AI policy. The data protection regime runs in parallel and binds the organisations the Roadmap is trying to attract. In our experience advising regulated-sector clients on AI deployments, the operational bottleneck is rarely cloud capacity or model selection. It is the governance file: the DPIA, the lawful basis assessment, the transfer impact assessment, and the processor agreement. The organisations that move fastest into compliant production wire data protection governance into the procurement track from day one. The ICO’s AI audit framework is operative, and the DUAA has given the regulator sharper investigatory and penalty powers from 5 February 2026. UK divergence from the EU AI Act is a separate question that the Roadmap and the DUAA both step around. For organisations operating cross-jurisdiction, DUAA 2025 and the EU AI Act have already diverged, with no convergence mechanism.
Further reading
Primary sources: UK Compute Roadmap (DSIT); Data (Use and Access) Act 2025; SI 2026/82 (Commencement No. 6); ICO guidance on AI and data protection. Bratby Law analysis: our data protection practice page.
For advice on data protection obligations attaching to AI compute procurement or deployment, contact Rob Bratby at Bratby Law.
