
When AI shops for you
Redefining the payments journey
Visa and Mastercard have announced the launch of AI-driven payments agents: autonomous tools that can make purchases on behalf of consumers. These tools are designed to analyse user preferences, compare options and execute transactions. Their launch represents a major evolution in the way consumers interact with the payments ecosystem.
While the commercial and convenience benefits are clear, AI agents also raise novel legal and regulatory challenges. As the boundary between human instruction and machine execution continues to blur, firms deploying these tools will need to reassess how they manage risk, demonstrate accountability and transparency, and protect consumers.
AI in the payment flow: beyond automation
The launch of autonomous AI-driven payments agents goes further than existing “smart” technologies embedded in digital wallets or recurring payment tools. AI payments agents are not passive tools facilitating consumer-initiated payments; they operate as active agents capable of initiating transactions within parameters set by the consumer.
In doing so, they challenge many traditional legal constructs, some of which we explore further below.
Some legal considerations
Traditional agency is predicated on clear instructions from a principal and the accountability of the agent. Where the agent is a generative AI model responding dynamically to data inputs, establishing clear lines of responsibility becomes more complex.
For UK-regulated firms, this raises a number of questions:
- Does the execution of a payment by an AI tool meet the definition of “authorisation” under the Payment Services Regulations 2017? Consideration will need to be given to whether a broad instruction (e.g. to automatically re-order household products) qualifies as a valid authorisation, how far downstream decisions made by the AI agent still fall within that authorisation and what controls exist to prevent unauthorised or disputed transactions.
- How is liability for disputed transactions managed? This is especially important where, for example, the AI agent makes a purchase the consumer disagrees with, misinterprets input data or relies on erroneous information from a third party (e.g. a merchant or data provider) that influences its decision.
- How is data being processed and is it compliant with the UK GDPR? For example, what special category data is being relied upon, how are transparency obligations managed (particularly around profiling and automated decision making) and how is the consumer informed of any automated decisioning and given a right to contest this?
- Are the decisions made by the AI agent explainable and auditable?
- Are customer outcomes being actively checked under the Consumer Duty? Are firms testing the AI system for unintended bias or discriminatory impact and ensuring consumers are not financially disadvantaged by poorly tuned or oversensitive AI logic?
- Are terms and disclosures relating to the use of AI agents sufficiently clear and fair? Terms must be transparent and accessible and should disclose the nature, scope and risks of AI-driven functionality and ensure consumers have and maintain meaningful control.
To manage these risks, firms will need robust contractual frameworks that clarify the scope of the AI agent’s role and limit liability where appropriate. Firms deploying AI-driven payments agents will also need to give consumers clear information on how decisions are made, allow them to set and adjust parameters easily, and ensure they can revoke or override agent activity at any time. These controls and principles align closely with the Financial Conduct Authority’s expectations around transparency, explainability and accountability.
The Financial Conduct Authority, Competition and Markets Authority and Information Commissioner’s Office will inevitably keep a close eye on these tools through the lens of existing consumer protection, data and AI governance regimes. We also anticipate alignment efforts between the UK and other international regulatory approaches, particularly as the EU AI Act begins to influence market participants.
Governance and oversight
As with any AI deployment, strong governance will be critical. The launch of AI-driven payments agents is likely to prompt a broader conversation about how AI systems are developed, tested and monitored in financial services.
In practice, for firms launching AI tools in the payments space, this means:
- defining ownership of AI decisions across the payments journey;
- building in auditability and traceability of agent actions;
- conducting regular testing and scenario analysis;
- ensuring effective human oversight; and
- embedding ethics, legal review and consumer analysis into product design.
For firms seeking a practical starting point, our insight Five top tips for AI governance outlines how to build proportionate frameworks tailored to different levels of AI sophistication.
TLT’s insight
As the payments sector embraces AI, firms must strike the right balance between innovation and accountability. At TLT, we support clients in navigating this emerging space – helping them design compliant frameworks, manage operational and contractual risk and engage constructively with regulators.
Written by Matthew Atkinson, with contributions by Alex Williamson, Tom Sharpe and Michelle Sally
This publication is intended for general guidance and represents our understanding of the relevant law and practice as at June 2025. Specific advice should be sought for specific cases. For more information see our terms & conditions.