Financial Conduct Authority Responds to AI and Big Tech Challenges Raised by Industry Panels

The Financial Conduct Authority responds to industry panel concerns on AI regulation and Big Tech’s role in financial services, addressing risks like bias, competition, and data privacy while exploring opportunities through initiatives like the Digital Sandbox.


Regulatory updates tackle AI risks and Big Tech data in financial markets

The Financial Conduct Authority (FCA) has responded to feedback from its six independent statutory panels regarding its approach to regulating artificial intelligence (AI) and Big Tech.

The concerns come amid the rapid development of generative AI and the growing influence of technology giants in financial markets.

These insights, part of the FCA’s annual report exchange with the panels, highlight pressing issues and propose measures to address risks while capitalising on opportunities.

FCA response to the independent panels’ 2023/24 annual reports
We have six independent statutory panels. They represent the interests of consumers, regulated firms and markets, and we are required to consult them on the impact of our work, policies and practices. The panels play an important role in both advising and challenging us. They bring a depth of experience, support and expertise that helps us identify and remedy potential harm to users and markets. We consider their views when we develop policy and implement interventions.

Regulating Artificial Intelligence: Balancing Innovation and Safety

Industry panels, including the Practitioner Panel and Markets Practitioner Panel, expressed concern that AI regulatory controls are lagging behind the rapid advancement of generative AI capabilities.

The panels also highlighted disparities in AI deployment across firms and the future risks posed by market imbalances created by Big Tech’s role in financial services.

In response, the FCA explained its technology-agnostic and principles-based approach to AI regulation, focusing on the safe and responsible use of AI in financial services.

The FCA published an AI update in April 2024 outlining its commitment to outcomes-based regulation, stressing the importance of assessing AI’s impact on consumers and financial markets.

Artificial intelligence in UK financial services - 2024
The Bank of England and Financial Conduct Authority conducted a third survey of artificial intelligence and machine learning in UK financial services.

Key findings from a joint FCA and Bank of England AI survey revealed that 17% of AI use cases in financial services involve foundation models, including Large Language Models (LLMs).

Operations and IT account for the largest share of higher-materiality use cases, followed by general insurance, risk and compliance, and retail banking. Notably, a third of these use cases involve third-party providers, with three leading providers dominating the market for cloud, model, and data services.

While AI can mitigate certain cyber risks, the survey also identified the potential for bias in machine learning models, particularly where those models inform decisions affecting consumers.

The FCA’s ongoing research on AI bias, aligned with the Digital Regulation Cooperation Forum’s work on AI fairness, seeks to address these challenges.

Big Tech and Data Challenges in Financial Services

The Consumer Panel and Smaller Business Practitioner Panel raised concerns about the risks associated with Big Tech’s growing role in financial services, including potential price discrimination, competition issues, and data privacy challenges.

These concerns prompted the FCA to launch a Call for Input (CFI) in 2024, focusing on data asymmetry between Big Tech and financial services firms.

The FCA’s findings revealed that while current adverse effects are limited, future risks could significantly impact competition and consumer outcomes. Big Tech’s data could play a transformative role in areas such as consumer credit and insurance, where personalised marketing and risk-based pricing could lead to both opportunities and challenges.

The FCA’s response included plans to explore these issues further through its Digital Sandbox initiative. This platform could help assess the value of Big Tech data in financial services and examine how incentives for data sharing can align with achieving positive outcomes for consumers.

Digital Wallets and Wholesale Markets

Feedback from the panels also called attention to the regulatory treatment of digital wallets. The FCA, in collaboration with the Payment Systems Regulator (PSR), launched a Call for Information in mid-2024 to examine whether digital wallets should fall within its regulatory scope.

The findings will inform updates to the Payment Services Regulations as the UK continues to replace EU laws with domestic legislation.

In the context of wholesale markets, the FCA concluded that strict privacy agreements limit Big Tech’s ability to compete directly with incumbents.

A Wholesale Data Market Study found minimal evidence of Big Tech firms entering wholesale markets in ways that challenge traditional players. However, the FCA stated it would maintain vigilance in monitoring these activities.
