DIFC Regulation 10: What It Means for AI in Financial Services — and Why ADGM Firms Should Pay Attention

DIFC enacted the first dedicated AI governance regulation for financial services in the Gulf in 2023. It's a preview of where ADGM is heading — and firms using AI tools in compliance today should be preparing now.

In 2023, the Dubai International Financial Centre enacted Regulation 10 of 2023 on the Regulation of Autonomous and Semi-Autonomous Systems. It was a quiet piece of legislation compared to the headline-grabbing developments in the EU and US. But for financial firms operating in the Gulf region, it was significant.

DIFC Regulation 10 is the first dedicated regulatory framework for AI in financial services in the Gulf. It applies to DIFC-regulated firms using automated or semi-automated systems in regulated activities. And it sets out, in specific terms, what those systems must do to be considered compliant.

ADGM hasn’t yet enacted equivalent regulation. But the FSRA has been watching. For ADGM-regulated firms using AI tools in their compliance programmes, now is the time to understand what Regulation 10 requires — and to ask whether your current AI tools could meet those requirements.

What Regulation 10 Actually Says

Regulation 10 applies to “autonomous and semi-autonomous systems” used in regulated activities. The definition is intentionally broad — it covers any system that makes or influences a decision with regulatory significance without continuous human oversight for each individual decision.

That includes a lot of what gets marketed as “AI for compliance”: automated gap analysis, regulatory Q&A systems, policy review tools, obligation monitoring platforms.

The core requirements under Regulation 10 are:

1. Auditability

Every automated decision must be auditable. The firm must be able to reconstruct, after the fact, how the system arrived at a decision: what inputs it used, what logic it applied, what output it produced.

This is a structural requirement, not a documentation requirement. It’s not enough to say “we keep logs.” The system itself must produce structured, auditable outputs that can be reviewed by the regulator.
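To make the distinction concrete, here is a minimal sketch of what a structured, auditable decision record could look like. All field names, the system identifier, and the rule reference are illustrative assumptions, not a schema from Regulation 10 or any vendor; the point is that each decision carries its inputs, the logic version applied, its output, and a verifiable hash, rather than a free-text log line.

```python
import hashlib
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    """Structured record of one automated decision (illustrative schema)."""
    system_id: str          # which system produced the decision
    inputs: dict            # the data the system acted on
    logic_version: str      # version of the rules/model that was applied
    output: dict            # what the system concluded
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def content_hash(self) -> str:
        """Deterministic hash so the record can be verified after the fact."""
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

# Hypothetical gap-analysis decision, captured as a queryable record
record = DecisionRecord(
    system_id="gap-analysis-v2",
    inputs={"policy_clause": "4.2", "obligation_ref": "COBS 3.1.1"},
    logic_version="rules-2024-06",
    output={"gap_identified": False, "rationale": "clause covers obligation"},
)
print(record.content_hash())
```

A record like this can be stored, queried, and handed to an examiner; a plain application log cannot answer "what inputs and what logic version produced this decision" without forensic reconstruction.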

2. Explainability

Regulated firms must be able to explain, in plain terms, how a system reached its conclusion. The DFSA’s guidance makes clear that “the model said so” is not an explanation.

Explainability means providing the reasoning chain: the inputs, the inference steps, the basis for each step, and the confidence level. It means being able to point to specific regulatory text, specific firm data, or specific logical rules that produced the output.

3. Human oversight

Regulation 10 does not prohibit automated decision-making in regulated contexts. But it requires that humans remain accountable for the decisions. Where a system operates autonomously, the firm must demonstrate that appropriate human review processes are in place.

For compliance applications, this typically means: a qualified compliance officer reviews AI-generated assessments before they are relied upon, and that review is documented.
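That documented review step can itself be a structured record linked back to the automated assessment it covers. The sketch below is an assumption about shape, not a prescribed format; the field names and the rule that the review outcome must be explicit are illustrative.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ReviewRecord:
    """Documented human review of an AI-generated assessment (illustrative)."""
    assessment_id: str      # links back to the automated decision record
    reviewer: str           # the qualified compliance officer
    decision: str           # "approved", "amended", or "rejected"
    notes: str
    reviewed_at: str

def record_review(assessment_id: str, reviewer: str,
                  decision: str, notes: str = "") -> ReviewRecord:
    # Force an explicit outcome: silent pass-through is not oversight
    if decision not in {"approved", "amended", "rejected"}:
        raise ValueError("review outcome must be explicit")
    return ReviewRecord(
        assessment_id=assessment_id,
        reviewer=reviewer,
        decision=decision,
        notes=notes,
        reviewed_at=datetime.now(timezone.utc).isoformat(),
    )

review = record_review("gap-2024-0117", "j.smith", "approved",
                       "Checked against COBS 3.1.1; no gap.")
```

Linking the review record to the decision record by identifier is what lets a firm later demonstrate that a specific human approved a specific automated assessment before it was relied upon.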

4. Bias and accuracy testing

Firms must be able to demonstrate that their AI systems have been tested for accuracy and that known biases have been identified and mitigated. For compliance applications, this means being able to show that the system’s outputs are reliable — not just on average, but for the specific regulatory questions the firm is relying on it to answer.

Why ADGM Firms Should Care About DIFC Regulation 10

ADGM and DIFC operate distinct regulatory frameworks under separate regulators: the FSRA and the DFSA respectively. Regulation 10 does not apply to ADGM-regulated firms.

So why should ADGM compliance officers be reading this?

Three reasons:

Regulatory convergence. ADGM and DIFC have historically tracked each other closely on regulatory development. When DIFC enacted its crowdfunding framework, ADGM followed. When DIFC moved on virtual assets, ADGM followed. The FSRA and DFSA participate in the same international regulatory forums (IOSCO, FATF, FSB). The probability that ADGM will not develop AI governance requirements is low. The timeline is the variable.

Operational reality. Many firms are regulated in both DIFC and ADGM. If your AI tools need to meet Regulation 10 for your DIFC entity, they should meet equivalent standards for your ADGM entity — both for operational consistency and because you’ll want the same audit trail across both regulatory relationships.

Early preparation. Retrofitting explainability into an AI-dependent compliance programme after regulation arrives is significantly harder than building it in from the start. Firms that are relying on black-box AI tools for compliance decisions today will face a transition challenge when ADGM’s framework arrives.

What the Regulation Tells Us About Good AI Governance

Regulation 10 is useful not just as a compliance requirement for DIFC entities, but as a template for what good AI governance looks like in a regulated financial services context.

The requirements it sets out — auditability, explainability, human oversight, accuracy testing — are not arbitrary. They reflect the core principle that regulatory compliance must be defensible: if the DFSA (or FSRA) asks why you made a particular compliance decision, you must be able to show your working.

That principle has always applied to human compliance officers. Regulation 10 extends it to the AI tools they rely on.

For any firm evaluating AI tools for compliance use, these requirements provide a useful evaluation framework:

  • Can the system produce a structured, auditable output for every decision? Not just logs — structured, queryable records of how each output was produced.

  • Can you trace the reasoning chain from input to output? For regulatory Q&A, that means specific rule citations and reasoning steps. For gap analysis, that means showing exactly how the policy text was compared to the obligation and why a gap was or wasn’t identified.

  • What human review process does the tool support? Can a compliance officer easily review and approve AI-generated assessments? Does the tool make that review process efficient rather than cumbersome?

  • How does the vendor test accuracy? What test datasets do they use? How do they handle regulatory updates that might affect the accuracy of existing outputs?

The Trajectory Is Clear

I’ve been watching AI governance develop in financial services regulation for several years. The GDPR’s right to explanation, the EU AI Act’s tiered risk approach, DIFC Regulation 10 — these are not isolated developments. They’re part of a coherent international movement towards AI accountability in regulated contexts.

The direction is consistent: AI tools in regulated activities will be held to the same accountability standards as human decision-makers. The reasoning chain must be documented. The outputs must be auditable. The humans relying on AI must remain accountable.

ADGM firms that build their compliance AI strategy around those principles today will be ahead of the regulation when it arrives. Firms that treat explainability as optional will face a costly retrofitting exercise.

How Seif Approaches This

Seif was designed with these governance requirements as first principles, not as features to be added later.

Every output Seif produces includes a ComplianceAnswer envelope: the specific regulatory citations the answer is based on, the confidence level, the reasoning chain (which graph nodes were traversed, which retrieval steps were taken, which inference decisions were made), and an audit hash that chains back to the underlying regulatory text.
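As a rough illustration of that structure, the sketch below shows one plausible shape for such an envelope. This is not Seif's actual schema; the field names, rule references, confidence values, and the hash-chaining scheme are all assumptions made for the example. The idea it demonstrates is that the audit hash is computed over the answer, citations, and reasoning chain together with a hash of the underlying regulatory text, so the envelope is verifiably tied to the source it was derived from.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

@dataclass
class ComplianceAnswer:
    """Illustrative shape of an explainability envelope (fields assumed)."""
    answer: str
    citations: list          # specific regulatory provisions relied on
    confidence: float        # e.g. 0.0 - 1.0
    reasoning_chain: list    # ordered retrieval / inference steps
    source_hash: str         # hash of the underlying regulatory text
    audit_hash: str = ""

    def seal(self) -> "ComplianceAnswer":
        """Chain the envelope back to its source text via the audit hash."""
        body = asdict(self)
        body.pop("audit_hash")  # the hash covers everything except itself
        self.audit_hash = sha256(
            (json.dumps(body, sort_keys=True) + self.source_hash).encode()
        )
        return self

# Hypothetical assessment chained to a snapshot of the regulatory text
reg_text = b"COBS 3.1.1 A firm must..."
envelope = ComplianceAnswer(
    answer="Obligation satisfied by policy clause 4.2",
    citations=["COBS 3.1.1"],
    confidence=0.92,
    reasoning_chain=["retrieved COBS 3.1.1", "matched clause 4.2", "no gap"],
    source_hash=sha256(reg_text),
).seal()
print(envelope.audit_hash)
```

Because the audit hash incorporates the source text's hash, any later change to either the envelope or the underlying regulatory text is detectable: recomputing the hash would no longer match the stored value.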

When an FSRA examiner asks why your firm concluded that a particular obligation was satisfied, you can produce the Seif assessment, the underlying regulatory text, the gap classification reasoning, the human review record, and the audit trail as a single package, exportable in minutes.

That’s what defensible AI for compliance looks like.

If you want to understand how Seif would apply to your firm’s specific regulated activities and firm type, book a demo. We’ll walk through a live obligation mapping and show you what the explainability envelope looks like in practice.


This post is based on publicly available information about DIFC Regulation 10 and general regulatory development trends. It is not legal advice. For specific guidance on Regulation 10 compliance or ADGM regulatory requirements, consult a qualified legal adviser.