The UK’s Treasury Select Committee has warned that the government’s ‘wait and see’ approach to the governance of artificial intelligence is leaving the financial sector exposed to a series of growing risks.
This is amongst the warnings set out in a TSC report published this week, which calls for a coherent national approach to AI governance in regulated sectors. The report urges the government, the Bank of England and the Financial Conduct Authority to ensure protections are in place so that the UK’s AI ambitions can grow responsibly. With other countries rapidly introducing rules combining accountability, transparency and resilience, the UK risks falling behind, the report suggests.
AI adoption is now widespread, with more than 75% of City firms understood to be using the technology to automate administrative tasks or support core operations. Yet, in the absence of AI-specific regulations, businesses must determine for themselves how existing rules apply, creating exposure to operational, financial and cyber risks.
Commenting on the integration of AI into financial operations, Greg Watson, CEO of Napier AI, said: “AI has already become deeply embedded into the financial infrastructure and the challenge is not whether or not AI will be used, but whether it is being governed in a way that is transparent, explainable and resilient at scale.
“The lack of specific laws and regulations around the governance of AI, and the fragmented accountability, is ultimately what will bring about the most damage as and when any major AI-related incidents occur. By embedding transparency, auditability and regulatory alignment into AI systems, the UK can harness innovation while safeguarding financial stability and preventing fraud and financial crime.
“The opportunity now is to move beyond high-level principles and put in place practical, coordinated governance that supports both innovation and confidence."