Agentic AI and Payments: Can an AI Agent Consent to a Payment?


Rob Bratby, Managing Partner | Last updated: March 2026

Agentic AI payments are about to become a regulatory priority. The FCA published its 2026 Payments Regulatory Priorities Report on 25 March 2026, signalling that it may rewrite UK payments regulation to accommodate autonomous AI agents. The report confirms the FCA will consider whether changes to the regulatory framework are needed to support AI systems that autonomously initiate and execute payment transactions. For payment institutions, fintechs and their investors, this is the clearest indication yet that the consent architecture underpinning UK payments law may need to change.

The consent framework under the PSRs 2017

Regulation 67 of the Payment Services Regulations 2017 (PSRs 2017) provides that a payment transaction is authorised only if the payer has given consent. Consent must be given in the form and procedure agreed between the payer and the payment service provider. The payer may withdraw consent at any time before the point at which the payment order can no longer be revoked. For payment initiation services, consent is given to the payment initiation service provider (PISP), which then initiates the transaction from the payer’s account.

The Strong Customer Authentication (SCA) requirements, set out in the regulatory technical standards under the PSRs 2017, add a further layer. SCA requires authentication using two or more independent elements from the categories of knowledge, possession and inherence. Exemptions exist for low-value contactless payments, trusted beneficiaries and recurring transactions of the same amount, but the underlying principle is that a human must authenticate each payment or authorise a defined series of payments. This framework was not designed for agentic AI payments, where a software system acts on the person’s behalf.
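The two-of-three-categories principle can be illustrated in a short sketch. This is a simplified, hypothetical illustration only, not a compliance implementation: the element names and category mapping are invented for the example, and real SCA also requires the elements to be independent (compromise of one must not compromise the others).

```python
# Hypothetical mapping of authentication elements to the three SCA
# categories: knowledge (something the payer knows), possession
# (something the payer has) and inherence (something the payer is).
CATEGORIES = {
    "password": "knowledge",
    "pin": "knowledge",
    "registered_device": "possession",
    "hardware_token": "possession",
    "fingerprint": "inherence",
    "face_scan": "inherence",
}

def satisfies_sca(elements: list[str]) -> bool:
    """True if the presented elements span at least two distinct categories."""
    categories_used = {CATEGORIES[e] for e in elements if e in CATEGORIES}
    return len(elements) >= 2 and len(categories_used) >= 2

# A password plus a registered device spans knowledge + possession:
print(satisfies_sca(["password", "registered_device"]))  # True
# Two knowledge elements alone do not satisfy SCA:
print(satisfies_sca(["password", "pin"]))  # False
```

The point the sketch makes is structural: SCA verifies a human at the moment of each authentication event, which is precisely the step an autonomous agent has no natural way to perform.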

What changes when the agent is autonomous

"Agentic AI payments" refers to software systems that can autonomously initiate, route and execute payment transactions on behalf of a user. Unlike a standing order or direct debit, where the instruction is fixed and the payer has defined the amount, frequency and recipient in advance, an autonomous agent may decide when to pay, whom to pay and how much to pay based on parameters set by the user but interpreted in real time.

For agentic AI payments, the regulatory question is where consent sits in this chain. Under regulation 67 of the PSRs 2017, the payer must consent to the transaction. If the payer sets broad parameters (“pay my energy bill when it falls below the cheapest tariff”) and the AI agent determines the timing, amount and recipient, has the payer consented to the specific transaction? Or has the payer delegated authority to the agent, which is a different legal relationship?

The distinction matters. If the AI agent is treated as acting within the scope of a prior consent, the existing framework may stretch to accommodate it, in the same way that variable recurring payments operate under mandates with defined parameters. If the agent is treated as exercising autonomous judgment, the consent model breaks down. The payer has not consented to the specific transaction; the agent has decided it.

There is a further complication around liability. Under regulation 76 of the PSRs 2017, where a payment transaction is not authorised, the payer’s payment service provider must refund the amount immediately. If an agent initiates a transaction that the payer did not specifically authorise, who bears the loss? The PISP, the account servicing payment service provider, or the technology provider? The current framework does not answer this.

What the FCA is doing about it

The FCA’s 2026 Payments Regulatory Priorities Report addresses agentic AI payments directly, stating that the regulator will consider whether regulation needs to change to support them. This is a step beyond the FCA’s established approach of applying existing frameworks to new technologies rather than writing new rules.

Three initiatives are now running in parallel. First, the FCA is operating a Supercharged Sandbox in partnership with Nvidia, allowing firms to test AI-driven payment products with synthetic data. Second, the FCA’s AI Live Testing programme permits controlled deployment of autonomous payment services. An evaluation report on both programmes is expected by the end of 2026.

Third, the Mills Review, launched in January 2026 and led by FCA Executive Director Sheldon Mills, is examining the long-term impact of AI on retail financial services. The review’s engagement paper identifies autonomous AI systems and their implications for consumer protection, market structure and regulatory design. Recommendations are due to the FCA Board in summer 2026.

Separately, HM Treasury’s planned consultation on the future of UK payment services law, expected in Q2 2026 as set out in the Payments Forward Plan, will address whether the PSRs 2017 consent and authentication framework requires updating. As we noted in our analysis of the Payments Forward Plan, the consultation will consider whether a new regime should accommodate agentic AI payment initiation from the outset.

Commercial and operational implications

For PISPs and payment institutions building AI-enabled payment products, the immediate question is whether current products comply with the existing consent framework. A PISP that deploys an autonomous system to initiate payments on behalf of customers remains responsible as a regulated entity for the actions of that system. If the agent initiates a transaction outside the scope of what the customer authorised, the PISP faces liability under regulation 76.

Fintechs developing agentic AI payments products should map their consent architecture against regulation 67 now. Where the agent operates within tightly defined parameters set by the customer, the existing variable recurring payment model may provide a workable analogy. Where the agent exercises broader discretion, the legal basis is less certain and firms should consider whether their customer agreements adequately define the scope of the agent’s authority and the allocation of liability if things go wrong.
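One way to make the "tightly defined parameters" point concrete is to encode the customer mandate as explicit, checkable limits, mirroring how variable recurring payment mandates work. The sketch below is purely illustrative and all field names and limits are hypothetical; it is not drawn from any regulatory text or firm's product.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AgentMandate:
    """Hypothetical customer mandate defining the scope of the agent's authority."""
    permitted_payees: frozenset[str]  # payees the customer has named in advance
    max_amount_per_payment: float     # ceiling per transaction
    max_total_per_month: float        # ceiling across the calendar month
    expiry: date                      # mandate lapses unless renewed

@dataclass(frozen=True)
class ProposedPayment:
    payee: str
    amount: float
    on: date

def within_mandate(p: ProposedPayment, m: AgentMandate,
                   spent_this_month: float) -> bool:
    """True only if every mandate parameter is satisfied; anything outside
    these limits would require fresh, specific consent from the customer."""
    return (
        p.payee in m.permitted_payees
        and p.amount <= m.max_amount_per_payment
        and spent_this_month + p.amount <= m.max_total_per_month
        and p.on <= m.expiry
    )

mandate = AgentMandate(
    permitted_payees=frozenset({"Acme Energy"}),
    max_amount_per_payment=150.00,
    max_total_per_month=300.00,
    expiry=date(2026, 12, 31),
)
print(within_mandate(ProposedPayment("Acme Energy", 120.00, date(2026, 6, 1)), mandate, 0.0))  # True
print(within_mandate(ProposedPayment("Unknown Ltd", 50.00, date(2026, 6, 1)), mandate, 0.0))   # False
```

The design choice matters legally as well as technically: a transaction that passes every check arguably falls within the scope of a prior consent, whereas a transaction the agent makes outside these limits is the case where the regulation 67 and regulation 76 questions bite.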

For PE investors backing agentic AI payments ventures, the FCA’s direction of travel reduces one source of regulatory uncertainty. A framework that explicitly accommodates autonomous payment initiation, rather than leaving firms to fit AI-driven systems into rules designed for manual consent, should support product development and reduce compliance risk. The timing, however, remains uncertain. Primary legislation is likely needed, and the PSRs 2017 replacement is not expected before 2028.

Viewpoint

The future of agentic AI payments depends on whether regulators can keep pace with the technology. The FCA is right to move from applying existing rules by analogy to asking whether the rules themselves need to change. The PSRs 2017 consent framework assumes a human in the loop. As AI agents become more autonomous, that assumption becomes a fiction. The better approach is to define what delegated authority means in a payments context, set boundaries on the scope of that delegation and make clear where liability falls when the agent acts outside those boundaries. The Mills Review and the HM Treasury payments law consultation are the right forums for that work. The challenge will be moving from consultation to legislation before the technology overtakes the regulatory process.

Links

FCA 2026 Payments Regulatory Priorities Report

Payment Services Regulations 2017

Regulation 67, PSRs 2017 (consent and withdrawal of consent)

Regulation 76, PSRs 2017 (liability for unauthorised transactions)

Mills Review: Call for Input

Payments Forward Plan (GOV.UK)

The Payments Forward Plan: A Three-Year Roadmap (Bratby Law)

FCA Safeguarding: Six Weeks to the Supplementary Regime (Bratby Law)

Get in touch

For advice on the regulatory framework for AI-enabled payment products, or on preparing for the FCA’s payments law consultation, contact Rob Bratby at Bratby Law.
