Last updated: 1 April 2026

About AU0002

AU0002 (AI Leadership: Developing AI Strategy) is a Level 5 apprenticeship unit approved by Skills England on 17 March 2026. It covers 15 knowledge statements and 14 skills statements across seven function areas: AI strategy, procurement and investment, governance and ethics, enterprise risk and audit readiness, leadership and organisational change, external engagement, and workforce transformation. Delivery hours and funding rate are due from Skills England in April 2026. See the official AU0002 specification for the latest status.

Legal services sits in an unusual position among professional sectors. It adopted AI tools rapidly — AI-assisted research, contract analysis, due diligence review, and document drafting are now mainstream across large and mid-market firms — but it did so ahead of any clear regulatory framework for AI governance in legal practice. The SRA’s AI guidance arrived in 2024, after many firms had already deployed AI tools at scale. The result is a sector with significant AI adoption and, in many cases, inadequate governance infrastructure to match.

The risks that have already materialised — lawyers submitting AI-generated court filings containing fabricated case citations, client data transmitted to AI tools without appropriate data processing agreements, AI outputs mistaken for definitive legal analysis — are governance failures, not technology failures. They are failures of oversight, supervision, and procurement standards. These are precisely the capabilities AU0002 develops.

Three regulatory and professional factors make AU0002 specifically relevant to legal services:

  1. The SRA competence obligation. Solicitors must maintain the skills and knowledge needed to practise competently. As AI becomes embedded in legal work, the SRA’s position is clear: understanding how to use and supervise AI tools responsibly falls within that obligation.
  2. The supervision obligation. Partners and heads of department are responsible for supervising the work produced by their teams. Where that work involves AI-assisted outputs, supervision requires being able to evaluate those outputs critically — not just accepting them because a tool produced them.
  3. Client confidentiality exposure. Legal professional privilege and client confidentiality are the foundation of the solicitor-client relationship. AI tools that transmit client data to third-party servers, or that produce outputs drawing on information from other client matters, create systemic confidentiality risk that senior lawyers need to be equipped to identify and prevent.

The SRA’s AI Position: What It Requires of Senior Lawyers

The SRA published AI guidance in 2024 and has followed it with ongoing supervisory messaging. The guidance does not prohibit AI use — it sets conditions for responsible use. The core requirements it articulates for solicitors and firms are:

  • Competence. Solicitors must understand the AI tools they use, including their limitations and error rates. Using a tool you do not understand, in a matter where the output has legal consequences, is a competence risk.
  • Supervision of AI outputs. AI-generated work product must be reviewed by a qualified solicitor before being used in client matters, submitted to courts, or provided as legal advice. The obligation to supervise applies to AI output just as it applies to any junior lawyer’s work product.
  • Client transparency. The SRA expects firms to have clear policies on when and how they disclose AI use to clients. Where AI plays a significant role in delivering a matter, clients may have a legitimate interest in knowing.
  • Confidentiality protection. Firms must ensure AI tools used on client matters do not expose confidential information to third parties. This requires appropriate data processing agreements, enterprise-grade access controls, and clear policies on what data can be entered into which tools.
  • Professional judgement. AI must not substitute for professional judgement. The solicitor retains responsibility for the advice given and must be able to explain and defend it independently of the AI’s output.

Each of these five requirements has a governance and leadership dimension. At firm level, these are not just individual solicitor obligations — they require firm-wide policies, procurement standards, training frameworks, and oversight mechanisms. A managing partner or head of department who has not thought through how their team is using AI, or what the firm’s data handling policies are for AI tools, is exposing both the firm and individual fee earners to regulatory risk.

AU0002’s governance and ethics function area (K7, K8, S7, S8) covers accountability frameworks, ethical AI design, and bias identification. Its procurement and investment function area (K5, K6, S5, S6) covers how to evaluate AI tools, set procurement standards, and build appropriate contract terms. Together they map directly onto what the SRA requires firms to have in place.

Confidentiality, Privilege, and AI: The Governance Challenge

Confidentiality is the issue that keeps general counsel and managing partners awake at night when they think about AI. The risks operate at several levels:

Data transmission to third-party servers. General-purpose AI tools — including many that are marketed for legal use — transmit the data entered into them to servers operated by the AI provider. Unless the firm has a specific enterprise agreement governing data handling, that data may be used for model training, accessible to the provider’s staff, or subject to US data disclosure obligations (particularly for tools run by US-headquartered companies). For client-confidential matter information, this is a serious confidentiality risk.

Cross-matter contamination. AI tools that are trained on or have access to a firm’s historical work product may inadvertently surface information from one client’s matters in outputs generated for another. This is particularly acute for firms using AI tools that are fine-tuned on their own document libraries without adequate information barrier controls.

Privilege waiver risk. Disclosing privileged legal advice to an AI tool — particularly one operated by a third party — may constitute a waiver of legal professional privilege over that advice, depending on the circumstances. In litigation contexts, this could expose privileged documents to disclosure.

Metadata and inference risks. Even when the substantive content of a matter is not transmitted, metadata about who is working on what — search patterns, query histories, matter codes — can reveal commercially sensitive information about the firm’s clients and instructions.

These risks do not require a firm to stop using AI. They require a firm to have governance arrangements that identify which tools are approved for which types of work, what data handling requirements those tools must meet, how approval is granted for new tools, and who is responsible for monitoring compliance. AU0002’s procurement (K5–K6, S5–S6) and enterprise risk (K9–K10, S9–S10) function areas develop exactly this capability.

AU0002 Function Areas Mapped to Legal Services Requirements

AU0002 Function Area | K&S Refs | Legal Services Application
AI Strategy and Direction | K1–K4, S1–S4 | Firm-level AI adoption strategy; practice group AI roadmap decisions; SRA-aligned AI policy development; alignment of AI tools with client service objectives
Procurement and Investment | K5–K6, S5–S6 | Approved AI tool evaluation framework; data processing agreement requirements for AI vendors; enterprise-grade access control assessment; privilege and confidentiality checklist for AI procurement
Governance and Ethics | K7–K8, S7–S8 | SRA-aligned AI use policy; supervision obligations for AI outputs; client disclosure policy on AI use; professional judgement preservation frameworks; bias identification in AI-assisted legal research
Enterprise Risk and Audit Readiness | K9–K10, S9–S10 | AI incident response (e.g. AI-generated filing error, data breach via AI tool); SRA supervisory review preparation; cross-matter contamination risk controls; privilege waiver risk management
Leadership and Organisational Change | K11–K12, S11–S12 | Managing partner-led AI adoption programmes; change communication to partners, associates, and clients; professional culture change around AI-assisted work; managing AI resistance in traditional practice areas
External Engagement | K13, S13 | Client AI due diligence responses; industry body engagement on AI standards in legal practice; court and tribunal communications on AI use; regulator relationship management with the SRA on AI governance
Workforce Transformation | K14–K15, S14 | AI impact on legal roles (paralegals, junior associates, support functions); CPD and competence planning for AI-affected practice areas; maintaining human oversight in AI-assisted advice delivery

Which Roles in Law Firms Should Complete AU0002

AU0002 is calibrated at Level 5 for people who set direction and governance on AI — not for people who use AI tools as part of their daily work. In a law firm context, the target population is leaders who are responsible for deciding whether AI is deployed, how it is governed, and what happens when it goes wrong.

Managing Partner / Senior Partner

Responsible for firm-wide AI strategy and ultimately accountable to the SRA for the firm’s conduct. AU0002 provides the governance vocabulary to lead responsible AI adoption and respond credibly in regulatory conversations.

Head of Innovation / Legal Technology Director

Typically the internal champion for AI tool adoption. AU0002 provides the governance and ethics framework to ensure innovation is deployed responsibly — not just effectively — and that procurement decisions are properly documented.

Head of Department (AI-intensive practice areas)

Heads of litigation, corporate, real estate, and employment practices (the areas where AI tools are most embedded) bear supervision responsibility for AI-assisted outputs across their teams. AU0002’s governance framework directly supports that supervision obligation.

Chief Operating Officer / Director of Operations

In mid-to-large firms, the COO typically owns AI procurement decisions, vendor management, and operational governance. AU0002’s procurement and risk function areas map directly onto their responsibilities.

General Counsel / Deputy GC (in-house teams)

In-house legal leaders are increasingly responsible for governing AI use both within the legal function and across the wider organisation. AU0002’s breadth — covering strategy, ethics, risk, and workforce — aligns well with the general counsel’s cross-functional remit.

Risk and Compliance Partner / Director

Responsible for the firm’s regulatory compliance posture. AI creates new SRA compliance obligations that the risk and compliance function needs to be equipped to manage — from policy drafting to incident response.

In-House Legal Teams: A Different Set of Drivers

In-house legal teams face a slightly different AI governance challenge from private practice firms. Rather than managing AI in the context of client service delivery, in-house lawyers are typically managing AI in the context of commercial risk, regulatory compliance, and corporate governance.

The in-house general counsel increasingly occupies a dual role: governing the use of AI within the legal function itself, and advising the board and ExCo on AI-related legal risks across the business. This second role is where AU0002 is particularly valuable. A GC who has completed AU0002 has the frameworks to:

  • Advise on AI procurement contracts — understanding what vendor terms actually mean for the organisation’s liability, data ownership, and audit rights
  • Brief the board on AI governance obligations — including UK ICO expectations, EU AI Act exposure, and sector-specific regulatory requirements
  • Chair or participate in AI governance committees — with the authority that comes from demonstrated competence in the domain
  • Assess AI-related legal risk in M&A due diligence — as AI-built products and AI-dependent processes become standard components of target businesses

For in-house teams at levy-paying organisations, the funding route for AU0002 is straightforward — the same Growth and Skills Levy mechanism applies as for any other sector employer.

Client AI Due Diligence: The Commercial Driver

Beyond the regulatory compliance angle, there is a growing commercial driver for AI governance competence in legal services. Sophisticated corporate clients — themselves subject to AI governance obligations under the EU AI Act, FCA/PRA rules, or their own board-level risk frameworks — are increasingly including AI governance questions in their law firm due diligence processes.

Questions now appearing in large client RFPs and relationship reviews include: What is your firm’s AI governance policy? Who is accountable for AI tool decisions? How do you ensure client data is not exposed through AI tools? What training do fee earners receive on AI use? How do you supervise AI-generated work product?

A firm whose senior partners can answer these questions substantively — because they have completed formal assessed training in AI governance — is better positioned than one relying on general assurances. AU0002 creates a documented, assessed credential that can be cited in client conversations and RFP responses as evidence of firm-level AI governance capability.

AU0002 Is Not a Legal Professional Qualification

Completing AU0002 does not satisfy any SRA CPD requirement, does not confer any legal qualification, and should not be presented as satisfying specific SRA compliance obligations. It provides governance and strategy knowledge that is relevant to meeting those obligations — the specific legal analysis of your firm’s SRA obligations requires advice from your own risk and compliance team or external regulatory counsel.

Funding AU0002 Through the Levy

Law firms and legal employers that pay the Growth and Skills Levy — those with payroll above £3 million — can use their levy balance to fund AU0002 for any eligible employed adult. The mechanics are the same as for any other levy-funded training:

  1. Identify candidates (any employed adult, UK-resident, working for the levy-paying employer)
  2. Select an AU0002 delivery provider from the Register of Apprenticeship Training Providers
  3. Agree a training plan — there is no off-the-job (OTJ) training requirement, so all delivery can be in work time
  4. Draw down funds through the ESFA Digital Accounts system once delivery starts

For large law firms with substantial unspent levy balances — common in firms where partner-level delivery roles have made traditional apprenticeship enrolment difficult — AU0002 offers a fast, high-value route to productive levy deployment among the most senior cohort. A group of partners or heads of department completing AU0002 together can build shared AI governance language across the leadership team as a side benefit of the programme.

Smaller firms below the levy threshold can access co-funded places, where the government covers 95% of training costs and the employer contributes 5%. For a unit of AU0002’s anticipated scope, the employer co-funding contribution should be modest.

Frequently Asked Questions

Does the SRA require law firms to train staff on AI?
The SRA does not mandate specific AI training, but the competence obligation requires solicitors to maintain the skills needed to practise competently. As AI becomes embedded in legal work, the SRA’s position is that understanding how to use and oversee AI tools responsibly falls within that obligation. Senior leaders who allow their teams to use AI tools they cannot adequately supervise are creating a competence and supervision risk.
What are the main confidentiality risks with AI in legal services?
The primary risks are: inadvertent disclosure of client confidential information to third-party AI servers; cross-matter contamination where AI trained on the firm’s work product surfaces information from one client matter in another; potential privilege waiver through disclosure to third-party AI systems; and metadata inference risks. AU0002’s procurement and governance function areas address how to build controls around each of these.
How does AU0002 map to the SRA’s AI guidance?
The SRA’s guidance focuses on competence, supervision, client transparency, confidentiality, and professional judgement. AU0002 covers governance, procurement, risk, and workforce transformation function areas that map onto each of these — equipping senior lawyers with the frameworks to build firm-level policy and oversight mechanisms that match SRA expectations.
Can in-house legal teams use the levy for AU0002?
Yes — any levy-paying employer can use their balance to fund AU0002 for eligible employed adults, regardless of sector. General counsel, heads of legal, and senior in-house lawyers with AI oversight responsibility are all suitable candidates.
Which roles are the best fit for AU0002 in law?
Managing partners, heads of innovation or legal technology, heads of department in AI-intensive practice areas, COOs, general counsel, and risk and compliance partners. The common thread is responsibility for deciding whether and how AI is deployed — rather than day-to-day use of AI tools.

Sources and Further Reading