Last updated: 1 April 2026

About AU0002

AU0002 (AI Leadership: Developing AI Strategy) is a Level 5 apprenticeship unit approved by Skills England on 17 March 2026. It covers 15 knowledge statements and 14 skills statements across seven function areas: AI strategy, procurement and investment, governance and ethics, enterprise risk and audit readiness, leadership and organisational change, external engagement, and workforce transformation. Delivery hours and funding rate are due to be confirmed by Skills England in April 2026. See the official Skills England specification for the latest status.

Why Financial Services Is the Critical Sector for This Unit

Every major UK sector has some AI governance exposure. But financial services is the sector where that exposure has already translated into regulatory enforcement expectations, named individual accountability, and documented compliance obligations. In most industries, AI governance is still largely a matter of internal risk management and ethical commitment. In financial services, it is a regulatory requirement with personal liability attached.

Three factors make FS the highest-urgency sector for AU0002:

  1. Regulatory density. FS senior leaders operate under FCA and PRA oversight, SMCR personal accountability, Consumer Duty obligations, Anti-Money Laundering requirements, and increasingly the extraterritorial reach of the EU AI Act. Each of these regimes creates specific expectations around AI systems that touch the relevant area. Leaders who cannot navigate this landscape are already a compliance risk.
  2. AI adoption depth. FS was an early and deep adopter of machine learning across credit scoring, fraud detection, algorithmic trading, insurance pricing, and customer servicing. The senior leaders now responsible for oversight of those systems were often not involved in their original deployment and may lack the governance vocabulary to challenge, direct, or audit them effectively.
  3. Personal accountability regime. SMCR makes AI governance a personal liability question, not just a firm-level one. Where a Senior Manager is named as accountable for a business area that uses AI to make consequential decisions — lending, claims, investment recommendations — they need to be able to demonstrate they exercised meaningful oversight. A training record showing completion of AU0002 is one concrete element of that demonstration.

FCA and PRA Expectations on AI Governance

Neither the FCA nor the PRA has yet published a finalised AI-specific regulatory framework, but both have been clear through Dear CEO letters, supervisory publications, and enforcement activity that AI governance is a current supervisory priority. The FCA’s ongoing AI discussion papers and its published thinking on algorithmic decision-making make several expectations explicit:

  • Firms must be able to explain AI-driven decisions to customers and to the regulator. This requires that someone in the firm understands the model well enough to provide that explanation, a capability distinct from that of the data scientists who built it.
  • Firms must have documented governance arrangements for AI systems in use, including who approved deployment, what risk assessment was conducted, and who is accountable for ongoing monitoring.
  • Firms must demonstrate that AI systems do not produce discriminatory outcomes across protected characteristics, particularly in credit and insurance contexts where AI can inadvertently perpetuate historical bias.
  • Firms must have incident response and escalation protocols for AI failures, including model drift, unexpected outputs, and adversarial manipulation.

AU0002’s governance and ethics function area (K7, K8, S7, S8) directly addresses the first three of these. Its enterprise risk and audit readiness function area (K9, K10, S9, S10) addresses the fourth. A leader who has completed AU0002 has been assessed on their ability to establish accountability frameworks, document governance arrangements, identify bias risks, and build audit-ready processes — exactly the capabilities regulators are testing for.
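To make the model-drift expectation above concrete: one statistic widely used in FS model monitoring is the population stability index (PSI), which compares the distribution a model saw at development time against live inputs. The sketch below is purely illustrative — the function, the quantile-binning approach, and the 0.10/0.25 thresholds are conventional rules of thumb, not regulatory values or AU0002 content.

```python
# Illustrative sketch: population stability index (PSI), a common drift
# statistic in FS model monitoring. Thresholds of 0.10 ("watch") and
# 0.25 ("investigate") are industry rules of thumb, not FCA/PRA values.
import math

def psi(expected: list[float], actual: list[float], bins: int = 10) -> float:
    """Compare a live sample against a baseline sample of the same
    model input or score, bucketed on the baseline's quantiles."""
    exp_sorted = sorted(expected)
    # Bin edges taken from the baseline distribution's quantiles.
    edges = [exp_sorted[int(len(exp_sorted) * i / bins)] for i in range(1, bins)]

    def shares(sample: list[float]) -> list[float]:
        counts = [0] * bins
        for x in sample:
            counts[sum(1 for e in edges if x > e)] += 1
        # Floor each share to avoid log(0) on empty buckets.
        return [max(c / len(sample), 1e-6) for c in counts]

    e, a = shares(expected), shares(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [i / 1000 for i in range(1000)]
stable = [i / 1000 for i in range(1000)]
shifted = [0.3 + 0.7 * i / 1000 for i in range(1000)]
print(psi(baseline, stable) < 0.10)   # stable inputs: negligible drift
print(psi(baseline, shifted) > 0.25)  # shifted inputs: escalate for review
```

The governance point is not that a Senior Manager should compute this themselves, but that they should know such a metric exists, what threshold triggers escalation, and who owns the response.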

SMCR and AI Accountability: Why This Has Become Personal

The Senior Managers and Certification Regime requires FS firms to map prescribed responsibilities and senior management functions to named individuals. The intent is straightforward: where something goes wrong, there should be a named person who was accountable for it and who can be held responsible by the regulator.

As AI systems have moved from back-office analytics into front-line customer decisions — credit offers, insurance quotes, fraud flags, investment recommendations — the question of SMCR accountability for those AI decisions has become live. The FCA has not yet published a specific SMCR mapping for AI, but its supervisory messages have been consistent: firms should not treat AI as an accountability gap. Where an AI system makes a consequential decision, a named Senior Manager should be able to demonstrate they exercised oversight of the conditions under which it makes those decisions.

The practical problem is that many Senior Managers named as accountable for AI-driven functions do not have the knowledge to exercise that oversight meaningfully. They approved deployment and they receive MI packs, but they could not articulate:

  • What the model is actually optimising for and what its limitations are
  • What governance documentation exists and whether it is current
  • What the escalation protocol is if the model starts behaving unexpectedly
  • How bias testing was conducted and what the results showed
  • What their firm’s AI procurement standards require of third-party AI vendors

AU0002 addresses every one of these directly. K5 covers AI system procurement and investment decisions. K7–K8 cover governance frameworks and ethical AI design. K9–K10 cover risk management and audit readiness. S7–S8 require the learner to actually implement governance structures and accountability frameworks — not just understand them conceptually. A Senior Manager who completes AU0002 has an assessed training record demonstrating governance competence that can be produced in a supervisory review.
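The bias-testing question above has a recognisable practical shape. A minimal sketch of the kind of group-outcome disparity check a bias-testing report might summarise for a lending model follows — the group labels, function names, and the 0.8 "four-fifths" threshold are illustrative conventions (the latter borrowed from US employment-testing practice), not FCA requirements or AU0002 content.

```python
# Illustrative sketch: approval-rate disparity across groups, the kind
# of headline figure a bias-testing report might present to a Senior
# Manager. The 0.8 threshold is an illustrative convention, not an
# FCA or Consumer Duty standard.
from collections import defaultdict

def approval_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (group_label, approved?) pairs -> approval rate per group."""
    totals: dict[str, int] = defaultdict(int)
    approved: dict[str, int] = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates: dict[str, float]) -> float:
    """Lowest group approval rate divided by the highest."""
    return min(rates.values()) / max(rates.values())

decisions = [("A", True)] * 80 + [("A", False)] * 20 \
          + [("B", True)] * 50 + [("B", False)] * 50
rates = approval_rates(decisions)      # {"A": 0.8, "B": 0.5}
ratio = disparate_impact_ratio(rates)  # 0.5 / 0.8 = 0.625
print(f"ratio={ratio:.3f}, flag={ratio < 0.8}")
```

A ratio alone proves nothing about unlawful discrimination; the governance question AU0002 prepares a leader to ask is what was tested, against which groups, at what threshold, and what was done when a flag was raised.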

AU0002 is not a legal compliance qualification

Completing AU0002 does not satisfy any specific regulatory requirement and should not be presented as doing so. What it does is provide a training record demonstrating that a named individual has been assessed on AI governance knowledge and skills — which is relevant context in a supervisory or enforcement conversation, but not a substitute for legal advice on your firm’s specific regulatory obligations.

Consumer Duty and AI-Driven Customer Outcomes

The Consumer Duty, which came into full force in July 2023, established a cross-cutting standard requiring firms to deliver good outcomes for retail customers across four outcome areas: products and services, price and value, consumer understanding, and consumer support. The Duty is outcomes-focused, which means the FCA is not primarily interested in whether a firm followed a process — it is interested in whether customers actually received good outcomes.

AI creates specific tension with Consumer Duty in several ways:

Product personalisation. AI-driven product recommendations and pricing may systematically give different customers different outcomes for the same underlying need. A Consumer Duty analysis has to consider whether the AI’s personalisation criteria are producing good outcomes across the customer base — not just whether the recommendation was technically compliant.

Consumer understanding. AI-generated communications and chatbot interactions must meet the same consumer understanding standard as human-delivered communications. Leaders responsible for AI-driven customer communications need to understand how those outputs are generated, how quality is controlled, and what the testing protocol is for outputs that might confuse or mislead.

Vulnerability. AI systems are generally trained on population-level data and may not handle vulnerable customer signals well. Senior leaders with oversight of AI-driven servicing need to understand how vulnerability is identified in AI systems, what escalation protocols exist, and how outcomes for vulnerable customers are monitored.

AU0002’s workforce transformation function area (K14–K15, S14) covers the human impact of AI deployment, including the design of appropriate oversight mechanisms. Its AI strategy function area (K1–K4) covers strategic alignment of AI systems with organisational objectives — in a Consumer Duty context, those objectives include good customer outcomes. A leader completing AU0002 develops the conceptual tools to interrogate whether an AI system is actually delivering what the firm claims it is.

EU AI Act: Extraterritorial Reach for UK FS Firms

The EU AI Act entered into force in August 2024, with its high-risk AI provisions applying from August 2026. For UK financial services firms, the Act’s extraterritorial scope is the key issue. The Act applies to:

  • Providers that place AI systems on the EU market, regardless of where the provider is based
  • Deployers of AI systems established in the EU, and providers or deployers in third countries where the system’s output is used in the EU
  • Importers and distributors of AI systems in the EU

For a UK bank, insurer, or investment firm with EU operations, EU retail customers, or EU institutional counterparties, this extraterritorial reach is not theoretical. Several AI use cases in FS fall into the Act’s Annex III high-risk categories:

  • Credit scoring — explicitly listed as high-risk in Annex III
  • Insurance risk assessment and pricing — high-risk for life and health insurance in relation to natural persons
  • Employment decisions — high-risk, including algorithmic performance management
  • Biometric identification — high-risk, relevant to KYC and fraud prevention

High-risk AI systems under the Act require: a risk management system, data governance measures, technical documentation, automatic logging, transparency measures, human oversight measures, accuracy and robustness measures, and in some cases, a conformity assessment. The Act also creates obligations for designated human reviewers of AI-driven decisions affecting natural persons.

AU0002’s enterprise risk and audit readiness function area (K9, K10, S9, S10) covers exactly these requirements: identifying AI risks, building regulatory compliance frameworks, preparing for audit, and implementing proportionate controls. A senior leader with AU0002 competence has the vocabulary and the frameworks to direct their team’s EU AI Act compliance work, even if the detailed legal analysis is done by specialist counsel.

How AU0002 Function Areas Map to FS Regulatory Requirements

The table below maps each AU0002 function area to its most direct financial services regulatory application. This is not a compliance mapping — it is a relevance mapping, showing which regulatory expectations each part of the unit develops capability against.

AU0002 Function Area | K&S Refs | Primary FS Regulatory Application
AI Strategy and Direction | K1–K4, S1–S4 | FCA/PRA documented governance arrangements; Consumer Duty strategic alignment; SMCR accountability for strategic AI decisions
Procurement and Investment | K5–K6, S5–S6 | Third-party AI vendor due diligence; FCA expectations on outsourcing and operational resilience for AI-dependent services; EU AI Act provider obligations
Governance and Ethics | K7–K8, S7–S8 | SMCR personal accountability frameworks; FCA/PRA explainability expectations; bias monitoring for Consumer Duty; EU AI Act transparency requirements
Enterprise Risk and Audit Readiness | K9–K10, S9–S10 | PRA model risk framework; FCA incident reporting for AI failures; EU AI Act conformity assessment and human oversight requirements; AML transaction monitoring oversight
Leadership and Organisational Change | K11–K12, S11–S12 | Regulatory change management for AI Act implementation; Senior Manager-led AI transformation programmes; Board and ExCo communication on AI risk
External Engagement | K13, S13 | Regulatory relationship management on AI; industry body engagement on FS AI standards; client/counterparty AI due diligence responses
Workforce Transformation | K14–K15, S14 | Consumer Duty vulnerability considerations in AI-driven servicing; human oversight design for high-risk AI; FCA expectations on AI literacy across the regulated population

Which FS Roles Should Complete AU0002

AU0002 is a Level 5 unit — the equivalent of a foundation degree. The knowledge and skills it covers are calibrated for people who are, or will shortly be, responsible for setting direction and governance on AI, not for people who use AI tools day-to-day. In a financial services context, the most direct fits are:

Chief Risk Officer / Deputy CRO

AI risk now sits within model risk, operational risk, and conduct risk simultaneously. CROs who cannot independently assess AI governance arrangements are reliant on technical teams for what is fundamentally a governance judgement.

Chief Compliance Officer

Consumer Duty, EU AI Act, and FCA/PRA AI governance expectations all land in compliance. CCOs need to be able to build and operate compliance frameworks for AI systems, not just review policies prepared by others.

Chief Data Officer / Chief Technology Officer

CDOs and CTOs often have strong technical backgrounds but may lack the governance and strategy vocabulary that regulators and boards now require. AU0002 bridges that gap, particularly on accountability frameworks, audit readiness, and external engagement.

Money Laundering Reporting Officer

MLROs are typically named as accountable for AML functions that increasingly include AI-driven transaction monitoring. Demonstrating competence to oversee and govern those AI systems is increasingly relevant to FCA MLRO fitness assessments.

Senior Business Leaders with AI-Driven P&L

Heads of retail banking, insurance underwriting, or investment management who use AI-driven models to make product, pricing, or eligibility decisions need the governance knowledge to exercise meaningful SMCR oversight.

Non-Executive Directors with AI Oversight Remit

Board-level oversight of AI is now a governance expectation in regulated FS firms. NEDs with audit, risk, or technology committee responsibilities increasingly need the AI strategy and governance vocabulary that AU0002 develops.

Roles Where AU0002 Is Not the Right Choice

AU0002 is not the right unit for everyone in FS who works with AI. The unit is calibrated for leadership-level governance — it is not an applied AI skills qualification and not a technical AI qualification. Roles where a different route is likely more appropriate include:

  • Data scientists and ML engineers — technical practitioners need practitioner-level qualifications, not a leadership governance unit. ST1398 (Machine Learning Engineer) or ST1512 (AI and Automation Practitioner) are more appropriate apprenticeship routes.
  • Front-line analysts who use AI tools — applied AI literacy and tool-level training is more appropriate. Watch for lower-level AI units from Skills England in 2026–2027.
  • Compliance analysts who support AI governance frameworks — they benefit from understanding governance principles, but the Level 5 calibration of AU0002 may be pitched above what is needed. Line managers should assess whether the unit is proportionate.

Using the Growth and Skills Levy for AU0002 in FS

FS employers paying the Apprenticeship Levy — or its successor the Growth and Skills Levy — can use their levy funds to cover AU0002 unit costs once Skills England confirms the funding rate (expected April 2026). The mechanics are the same as standard levy-funded apprenticeship training:

  1. Identify eligible employed adult learners (any age, must be employed, must be UK resident)
  2. Select an approved AU0002 delivery provider from the Register of Apprenticeship Training Providers
  3. Agree a training plan and employer obligations with the provider
  4. Draw down levy funds through the ESFA Digital Accounts system once delivery begins
  5. Note that there is no off-the-job training requirement for standalone units — all delivery can take place in work time

For large FS employers with substantial levy balances, AU0002 offers a materially faster return on levy investment than full apprenticeship programmes. A senior leader completing AU0002 can be assessed and certified within one quarter. A full Senior Leader apprenticeship takes 14–18 months minimum. For organisations that have struggled to spend their levy meaningfully on senior populations, units are a significant structural improvement.

Levy funding rate not yet confirmed

The funding rate for AU0002 will be set by Skills England and published in April 2026. Do not commit to internal budget figures or provider pricing until this is confirmed. The AU0002 specification page will be updated when funding details are published.

Building the Internal Business Case in FS

The internal business case for AU0002 in a regulated FS firm is unusual among learning and development investments in that it has a compliance dimension as well as a capability dimension. A well-constructed internal business case should address both:

The regulatory risk angle: Named Senior Managers accountable for AI-driven functions need to demonstrate meaningful oversight capability. AU0002 provides an assessed, quality-assured training record that can be produced in a supervisory review. In a sector where the FCA is actively reviewing AI governance arrangements, this is not a hypothetical benefit.

The capability angle: AI governance capability is genuinely scarce in the senior FS population. Most leaders appointed to AI-adjacent roles in the past five years were selected for their domain expertise — not their AI governance knowledge, which effectively did not exist as a codified discipline when they were trained. AU0002 addresses that gap systematically rather than through ad hoc awareness sessions or vendor-provided overviews.

The levy efficiency angle: For firms with unspent levy balances, AU0002 provides a legitimate, high-value route to deploy levy funds on the senior population where they are hardest to spend through traditional apprenticeship routes. The business case is further strengthened if the firm can demonstrate a cohort approach — putting a group of Senior Managers through the unit together, which builds shared governance vocabulary across the leadership team.

Frequently Asked Questions

Is AU0002 relevant to financial services employers?
Yes — financial services is arguably the sector where AU0002 is most immediately relevant. FCA and PRA expectations around AI governance, SMCR accountability for AI decisions, Consumer Duty obligations on AI-driven customer outcomes, and EU AI Act compliance for high-risk AI systems all create direct demand for the knowledge and skills AU0002 develops.
How does SMCR interact with AI accountability?
SMCR requires named individuals to hold accountability for specific functions. As AI systems drive decisions in credit, insurance pricing, fraud detection, and customer communications, regulators expect that accountability for those AI-driven decisions sits with an identified Senior Manager. AU0002’s governance and ethics function area directly addresses accountability frameworks, documentation obligations, and the governance structures required to demonstrate meaningful oversight.
Does Consumer Duty require AI governance training?
Consumer Duty does not mandate specific training, but it creates a clear accountability standard: firms must be able to demonstrate that their products and services deliver good outcomes for customers. Where AI influences those outcomes, the FCA expects firms to have oversight mechanisms in place. AU0002 covers those oversight mechanisms directly.
What EU AI Act obligations apply to UK FS firms?
UK firms with EU operations, EU customers, or EU contractual relationships may fall within the EU AI Act’s extraterritorial scope, particularly for high-risk AI systems in credit scoring, insurance underwriting, and employment decisions. AU0002’s risk and audit function area covers regulatory compliance frameworks and audit readiness that map onto the Act’s requirements.
Which roles in financial services should do AU0002?
The most direct fits are Chief Risk Officers, Chief Data Officers, Chief Technology Officers, Money Laundering Reporting Officers (where AI is used in transaction monitoring), Chief Compliance Officers, and senior business leaders with P&L responsibility for AI-driven products. More broadly, any Senior Manager who will be named as accountable for AI decisions under SMCR should be able to demonstrate the governance knowledge AU0002 develops.
Can unspent Apprenticeship Levy funds cover AU0002?
Yes — once Skills England publishes the funding rate (expected April 2026), employers can use levy balances held in their Digital Accounts to fund AU0002 delivery for any eligible employed adult. There is no off-the-job training requirement for standalone units, so delivery can take place entirely in work time.

Sources and Further Reading