Confidential Assessment Results

AI Governance Scorecard Results

UK Credit Union. Generated 31 March 2026

Overall Score

0/48

0% of maximum

Maturity Level

Early Stage

Significant governance gaps exist across multiple regulatory areas. Immediate action is required to establish foundational AI governance controls and avoid potential regulatory exposure.

Want to discuss your results with a specialist?

A Clarendon adviser can help you prioritise the gaps identified in this assessment and build a practical governance roadmap.

Book a 15 Minute Consultation

Section Breakdown

1. AI System Identification and Data Privacy: Early Stage, 0/6
2. Accountability and Governance: Early Stage, 0/8
3. Board and Staff AI Competence: Early Stage, 0/6
4. Fairness, Bias and Vulnerable Members: Early Stage, 0/8
5. Third Party AI and Vendor Assurance: Early Stage, 0/10
6. Consumer Duty Outcomes and Transparency: Early Stage, 0/10

Governance Profile

S1: 0%
S2: 0%
S3: 0%
S4: 0%
S5: 0%
S6: 0%

Priority Gaps

Sections scoring below 50%

Section 1: AI System Identification and Data Privacy

UK GDPR and ICO Compliance

0/6 (0%)
  1. Complete a formal AI system inventory documenting all AI tools in use, their purpose, data inputs, and vendor. For most credit unions this will cover three areas: a lending or credit scoring module, a fraud or AML monitoring tool, and any member facing chatbot or automated service.
  2. Conduct a UK GDPR Article 22 assessment for each AI system that influences lending or member decisions. Determine whether the system makes the final decision or provides a recommendation that a human reviews: this distinction is critical to your legal obligations.
  3. Update or create Data Protection Impact Assessments (DPIAs) for all AI systems processing personal data, with specific consideration of profiling risks and the vendor's own processing activities.

Section 2: Accountability and Governance

SM&CR Individual Accountability

0/8 (0%)
  1. Assign AI governance oversight to a named Senior Manager and reflect this in their Statement of Responsibilities. In smaller credit unions, this will typically be the CEO (SMF1).
  2. Document the reasonable steps taken by each relevant Senior Manager to control AI systems in their area. For a CEO overseeing a third party lending module, reasonable steps include reviewing the vendor's bias testing results and confirming that a human review process exists for borderline decisions.
  3. Incorporate AI competence into the annual fit and proper certification process for relevant Certification Staff, particularly those in compliance, risk, and technology roles.

Section 3: Board and Staff AI Competence

Training, Literacy and Oversight Capability

0/6 (0%)
  1. Deliver tailored AI governance training to board members, covering Consumer Duty and SM&CR obligations. The focus should be on understanding what questions to ask of management and vendors, not on technical expertise.
  2. Provide AI literacy training to Senior Managers, focusing on bias, explainability, and risk management, and specifically on the credit union's own AI tools and their known limitations.
  3. Train member facing staff on AI system limitations, the escalation process for unexpected results, and the advice and guidance boundary for any AI assisted member service tools.

Section 4: Fairness, Bias and Vulnerable Members

Equality Act 2010 and Consumer Duty

0/8 (0%)
  1. Conduct formal risk assessments for each AI system, covering data quality, accuracy, and Consumer Duty harm risks. For smaller credit unions, a structured one page review of each AI tool is a proportionate starting point.
  2. Implement a bias testing programme covering all protected characteristics under the Equality Act 2010. For credit unions relying on third party AI tools, this means requesting evidence of bias testing from the vendor rather than conducting it independently.
  3. Develop a vulnerability framework for AI decisions, ensuring appropriate adjustments are made where a member's circumstances of vulnerability are known. Consider whether your AI tools have been configured to flag or refer cases where vulnerability indicators are present.

Section 5: Third Party AI and Vendor Assurance

PRA SS2/21, FCA Outsourcing Rules and Practical Vendor Oversight

0/10 (0%)
  1. Assess all AI vendor arrangements against PRA SS2/21 to determine whether they constitute material outsourcing. Credit unions with assets below £50m may apply a proportionate approach, but the assessment must still be documented.
  2. Review and update AI vendor contracts to include audit rights, exit provisions, and the vendor's obligation to notify the credit union of material changes to the AI system. Standard vendor terms rarely include these provisions.
  3. Conduct and document due diligence on all AI vendors. Practical steps for a credit union without specialist technical staff include: requesting the vendor's bias testing methodology and most recent results; asking whether the model has been independently audited; reviewing the vendor's ICO registration; and checking whether the vendor can provide plain English explanations of individual decisions.

Section 6: Consumer Duty Outcomes and Transparency

Member Rights, Complaints and Incident Response

0/10 (0%)
  1. Create documentation for each AI system demonstrating how it supports Consumer Duty good outcomes. For credit unions relying on third party AI tools, this documentation will largely come from the vendor: ensure you have received and retained it.
  2. Update privacy notices and member facing materials to disclose AI use in clear, plain English. Specifically address automated processing in loan applications and any AI assisted member service tools, and ensure chatbots are clearly identified as automated rather than human operated.
  3. Establish a documented process for members to request explanations and human intervention for AI decisions. Ensure the vendor can provide a plain English explanation of individual decisions on request.

Full Recommended Action Plan

The following actions are recommended across all sections, prioritised by regulatory urgency. Actions in sections where you scored below 50% should be treated as immediate priorities.

1. AI System Identification and Data Privacy (Early Stage)

  1. Complete a formal AI system inventory documenting all AI tools in use, their purpose, data inputs, and vendor. For most credit unions this will cover three areas: a lending or credit scoring module, a fraud or AML monitoring tool, and any member facing chatbot or automated service.
  2. Conduct a UK GDPR Article 22 assessment for each AI system that influences lending or member decisions. Determine whether the system makes the final decision or provides a recommendation that a human reviews: this distinction is critical to your legal obligations.
  3. Update or create Data Protection Impact Assessments (DPIAs) for all AI systems processing personal data, with specific consideration of profiling risks and the vendor's own processing activities.
2. Accountability and Governance (Early Stage)

  1. Assign AI governance oversight to a named Senior Manager and reflect this in their Statement of Responsibilities. In smaller credit unions, this will typically be the CEO (SMF1).
  2. Document the reasonable steps taken by each relevant Senior Manager to control AI systems in their area. For a CEO overseeing a third party lending module, reasonable steps include reviewing the vendor's bias testing results and confirming that a human review process exists for borderline decisions.
  3. Incorporate AI competence into the annual fit and proper certification process for relevant Certification Staff, particularly those in compliance, risk, and technology roles.
  4. Establish a regular board reporting pack on AI system performance, including fairness indicators and any incidents or complaints. For smaller credit unions, a quarterly one page summary is a proportionate starting point.
3. Board and Staff AI Competence (Early Stage)

  1. Deliver tailored AI governance training to board members, covering Consumer Duty and SM&CR obligations. The focus should be on understanding what questions to ask of management and vendors, not on technical expertise.
  2. Provide AI literacy training to Senior Managers, focusing on bias, explainability, and risk management, and specifically on the credit union's own AI tools and their known limitations.
  3. Train member facing staff on AI system limitations, the escalation process for unexpected results, and the advice and guidance boundary for any AI assisted member service tools.
4. Fairness, Bias and Vulnerable Members (Early Stage)

  1. Conduct formal risk assessments for each AI system, covering data quality, accuracy, and Consumer Duty harm risks. For smaller credit unions, a structured one page review of each AI tool is a proportionate starting point.
  2. Implement a bias testing programme covering all protected characteristics under the Equality Act 2010. For credit unions relying on third party AI tools, this means requesting evidence of bias testing from the vendor rather than conducting it independently.
  3. Develop a vulnerability framework for AI decisions, ensuring appropriate adjustments are made where a member's circumstances of vulnerability are known. Consider whether your AI tools have been configured to flag or refer cases where vulnerability indicators are present.
  4. Establish model monitoring procedures with defined thresholds for review or suspension, and ensure your vendor contract requires advance notification of material model changes.
5. Third Party AI and Vendor Assurance (Early Stage)

  1. Assess all AI vendor arrangements against PRA SS2/21 to determine whether they constitute material outsourcing. Credit unions with assets below £50m may apply a proportionate approach, but the assessment must still be documented.
  2. Review and update AI vendor contracts to include audit rights, exit provisions, and the vendor's obligation to notify the credit union of material changes to the AI system. Standard vendor terms rarely include these provisions.
  3. Conduct and document due diligence on all AI vendors. Practical steps for a credit union without specialist technical staff include: requesting the vendor's bias testing methodology and most recent results; asking whether the model has been independently audited; reviewing the vendor's ICO registration; and checking whether the vendor can provide plain English explanations of individual decisions.
  4. Implement a change management process for AI vendor model updates. The vendor contract should require advance notification of material model changes, and the credit union should have a documented process for reviewing and approving such changes before they go live.
  5. Obtain written confirmation from each AI vendor that their system has been tested for bias against the protected characteristics in the Equality Act 2010, and that they can provide a plain English explanation for any individual decision if a member requests one. If a vendor cannot or will not provide this information, that is itself a significant governance concern.
6. Consumer Duty Outcomes and Transparency (Early Stage)

  1. Create documentation for each AI system demonstrating how it supports Consumer Duty good outcomes. For credit unions relying on third party AI tools, this documentation will largely come from the vendor: ensure you have received and retained it.
  2. Update privacy notices and member facing materials to disclose AI use in clear, plain English. Specifically address automated processing in loan applications and any AI assisted member service tools, and ensure chatbots are clearly identified as automated rather than human operated.
  3. Establish a documented process for members to request explanations and human intervention for AI decisions. Ensure the vendor can provide a plain English explanation of individual decisions on request.
  4. Enhance the complaints process to handle AI related complaints, including the ability to reconstruct the decision, investigate whether the AI performed correctly, and provide appropriate redress.
  5. Update operational resilience and incident response plans to include AI specific failure scenarios, including a vendor outage, a model producing systematically incorrect decisions, and the discovery of significant bias.

Regulatory Framework Reference

FCA Consumer Duty

PRIN 2A; FG22/5

Fully in force from 31 July 2023 (new products) and 31 July 2024 (closed products). Requires firms to deliver good outcomes across four areas: products and services, price and value, consumer understanding, and consumer support. AI systems that influence member outcomes must be assessed against all four outcome areas.

SM&CR Individual Accountability

SYSC 4.7; COCON; FCA/PRA AI Update (April 2024)

The FCA and PRA confirmed in April 2024 that SM&CR already applies to AI governance. Senior Managers are personally accountable for AI systems within their area of responsibility and must document the reasonable steps they have taken to ensure those systems are effectively controlled.

UK GDPR and ICO

UK GDPR Art. 22, 35; Data (Use and Access) Act 2025

Article 22 restricts solely automated decisions with legal or similarly significant effects. DPIAs are mandatory for high risk AI processing. The Data (Use and Access) Act 2025 is amending the automated decision making framework. The ICO is actively supervising compliance.

Equality Act 2010

s.19 (Indirect Discrimination); s.4 (Protected Characteristics)

AI systems that produce disparate outcomes for groups sharing protected characteristics may constitute indirect discrimination. Regular bias testing is required to identify and mitigate this risk. For credit unions relying on third party AI tools, vendor provided bias testing evidence is the practical mechanism for demonstrating compliance.

PRA SS2/21 and FCA Outsourcing

PRA SS2/21 (March 2026 update); FCA SYSC 8; PS26/2; FG26/4

SS2/21 applies to credit unions and requires assessment of material outsourcing, including AI platforms. Credit unions with assets below £50m may apply a proportionate approach, but the assessment must still be documented. PS26/2 (February 2026) introduced new operational incident and third party reporting requirements.

FCA Operational Resilience

Policy Statement (March 2022); Impact Tolerances

Firms must identify important business services, set impact tolerances, and ensure they can remain within those tolerances during severe but plausible disruption scenarios, including AI system failures and vendor outages. AI systems supporting lending, fraud detection, or member services are likely to be important business services.

Disclaimer: This assessment is a self administered, indicative tool designed to assist UK credit unions in identifying potential AI governance gaps. It does not constitute legal or regulatory advice and should not be relied upon as a definitive assessment of compliance. The results reflect the answers provided and are based on regulatory guidance current as at April 2026. Credit unions should seek appropriate professional advice when addressing identified gaps. Results are stored in your browser only and are not transmitted to any third party.
Clarendon

UK Credit Union AI Governance Assessment. April 2026