
Lawmakers Demand AI Stress Tests for Britain's Financial Services

Reuters
January 20, 2026

AI-Generated Summary

British lawmakers urge financial watchdogs to implement AI stress tests for financial services. A report highlights significant risks of AI, including consumer harm and market destabilization. Regulators are called upon to move beyond a "wait and see" approach and provide clearer guidance on AI use and consumer protection by the end of 2026.

LONDON, Jan 20 (Reuters) - Britain’s financial watchdogs are not doing enough to stop artificial intelligence from harming consumers or destabilising markets, a cross‑party group of lawmakers said on Tuesday, urging regulators to move away from what it called a “wait and see” approach.

In a report on AI in financial services, the Treasury Committee said the Financial Conduct Authority and the Bank of England should start running AI‑specific stress tests to help firms prepare for market shocks triggered by automated systems.

The committee also called on the FCA to publish detailed guidance by the end of 2026 on how consumer protection rules apply to AI, and on the extent to which senior managers should be expected to understand the systems they oversee.

“Based on the evidence I've seen, I do not feel confident that our financial system is prepared if there was a major AI-related incident and that is worrying,” committee chair Meg Hillier said in a statement.

TECHNOLOGY CARRIES 'SIGNIFICANT RISKS'

A race among banks to adopt agentic AI, which unlike generative AI can make decisions and take autonomous action, poses new risks for retail customers, the FCA told Reuters late last year.

About three‑quarters of UK financial firms now use AI, and companies are deploying the technology across core functions, from processing insurance claims to performing credit assessments.

While the report acknowledged the benefits of AI, it warned the technology also carried “significant risks”, including opaque credit decisions, the potential exclusion of vulnerable consumers through algorithmic tailoring, fraud, and the spread of unregulated financial advice through AI chatbots.

Experts contributing to the report also highlighted threats to financial stability, pointing to the reliance on a small group of U.S. tech giants for AI and cloud services.
Some also noted that AI‑driven trading systems may amplify herding behaviour in markets, risking a financial crisis in a worst-case scenario.

An FCA spokesperson said the regulator welcomed the focus on AI and would review the report. The regulator has previously indicated it does not favour AI‑specific rules due to the pace of technological change. The BoE did not respond to a request for comment.

Hillier told Reuters that increasingly sophisticated forms of generative AI were influencing financial decisions. “If something has gone wrong in the system, that could have a very big impact on the consumer,” she said.

Reporting by Phoebe Seers; Editing by Tommy Reggiori Wilkes
