UK lawmakers press for AI guidance by the end of 2026
UK Treasury Committee warns AI in finance is outpacing oversight and urges the FCA to issue guidance on consumer protection and senior accountability by end 2026.
The UK’s Treasury Committee warned that artificial intelligence is spreading faster than oversight across financial services, creating risks for consumers and the financial system. The panel urged the Financial Conduct Authority to publish guidance by the end of 2026 on how consumer-protection rules apply to AI and how responsibility should be assigned to senior managers when AI-driven decisions cause harm.
The findings, ordered to be printed by the House of Commons earlier this month, reflect concerns that regulators (the FCA, the Bank of England, and HM Treasury) are leaning too heavily on existing rules as banks, insurers, and payment firms deepen their use of AI. The report describes AI as already embedded in core financial functions while oversight has not kept pace with the scale and opacity of these systems.
The committee asked the FCA to set clear expectations for firms and for senior managers under current accountability regimes and to publish a comprehensive package by the end of 2026. Formal minutes of the committee’s deliberations are due later this week.
“By taking a wait-and-see approach to AI in financial services, the three authorities are exposing consumers and the financial system to potentially serious harm,” the committee wrote. The report noted that technology can benefit consumers but concluded that firms lack clarity on how current rules apply in practice.
Regulators have focused on applying existing conduct, prudential, and operational resilience frameworks to AI rather than drafting AI-specific rules. The committee stated that this approach has not given firms sufficient practical guidance, particularly as systems grow more complex and their decision-making becomes harder to audit.
The report also highlighted growing reliance on major technology providers for AI tools and infrastructure. That dependence can complicate oversight when model development and deployment involve multiple parties and layers of adaptation before reaching end users in banks and insurers.
Industry voices emphasized the need for clearer rules. Dermot McGrath, co-founder at Shanghai-based strategy and growth studio ZenGen Labs, pointed to earlier fintech policy as a contrast. “The FCA’s sandbox in 2015 was the first of its kind, and 57 countries have copied it since. London remains a powerhouse in fintech despite Brexit,” he noted. In his view, that framework worked because supervisors could observe tests and step in when needed. “AI breaks that model completely,” he argued.
McGrath added that many firms do not fully understand the models they rely on, leaving both companies and regulators to infer how fairness rules apply to algorithmic decisions. He cautioned that unclear expectations risk holding back careful deployment and observed that accountability becomes harder when models are built by technology vendors, adapted by third parties, and then used by banks.
The committee’s warning comes amid government efforts to expand AI adoption across the economy. About a year ago, Prime Minister Keir Starmer pledged to use the technology to accelerate growth.
Industry participants contend that clearer supervisory guidance would help firms align product design, testing, and governance with expectations. The proposed timeline would allow regulators to consult and set detailed instructions on AI use in financial services.