
The digital and technological landscape is constantly evolving, and with it, the regulatory frameworks that govern it. Two major regulations, the Digital Operational Resilience Act (DORA) and the AI Act, are redefining the obligations of companies across Europe. This dual legislation imposes strict compliance requirements to ensure better risk management, enhanced security, and greater transparency. This article offers a comprehensive checklist to help your organization navigate these new demands.
What is the DORA Regulation?
The Digital Operational Resilience Act, known by its acronym DORA, is a European Union initiative aimed at strengthening cybersecurity and operational resilience for entities in the financial sector and their critical technology service providers. In response to the growing number of cyber threats, DORA seeks to harmonize rules to ensure that the European financial sector can withstand, respond to, and recover from all types of disruptions and threats related to information and communication technologies (ICT).
Definition and Objectives of DORA
DORA, formally Regulation (EU) 2022/2554, primarily intends to establish a unified and comprehensive regulatory framework for digital operational resilience. Before DORA, ICT risk management was fragmented, with rules scattered and varying from one Member State to another. This new legislation aims to create a single, coherent standard across the entire EU.
The main objectives of DORA are as follows:
- To harmonize requirements regarding ICT risk management and cybersecurity.
- To strengthen the financial sector’s ability to face cyberattacks and other IT incidents.
- To ensure the continuity of financial activities in case of major disruptions.
- To establish direct supervision of critical ICT service providers.
Scope of DORA: Which Actors Are Concerned?
The scope of DORA is broad and not limited to traditional banks. It covers over 20 types of financial entities, including:
- Credit and payment institutions.
- Insurance and reinsurance companies.
- Investment firms and asset management companies.
- Crypto-asset service providers.
- Market infrastructures such as trading platforms.
Significantly, DORA also extends to third-party ICT service providers, such as cloud and software service suppliers, who are essential partners for the financial sector. Those deemed "critical" will be subject to direct European supervision.
DORA Regulation Application Date
The DORA regulation was officially published on December 27, 2022, and came into force on January 16, 2023. However, to allow concerned entities to prepare, a transition period was established. DORA has been fully applicable in all EU Member States since January 17, 2025.
What is the AI Act?
The AI Act, or Artificial Intelligence Act, is the world's first comprehensive regulatory framework for AI. Proposed by the European Commission in April 2021 and adopted in 2024, this regulation aims to ensure that AI systems used within the European Union are safe, transparent, ethical, and respectful of fundamental rights.
Definition and Objectives of the AI Act
The AI Act (Regulation (EU) 2024/1689) is legislation governing the development, marketing, and use of artificial intelligence within the EU. It entered into force on August 1, 2024 and applies in phases: prohibitions have applied since February 2, 2025, and most remaining obligations apply from August 2, 2026. Its approach is risk-based: the higher the potential risk of an AI system, the stricter the rules.
The objectives of the AI Act are to:
- Protect the health, safety, and fundamental rights of European citizens against potential AI risks.
- Promote trustworthy, ethical, and human-centered AI.
- Create a European single market for lawful, safe, and reliable AI applications.
- Encourage innovation while establishing clear safeguards.
Scope of the AI Act: Which AI Systems Are Covered?
The AI Act has a very wide scope, covering most AI systems marketed or used within the EU, whether providers or users are based inside or outside the EU. The definition of an “AI system” is deliberately broad to include current and future technologies. This includes automated systems designed to operate with varying levels of autonomy and capable of generating predictions, recommendations, or decisions.
The regulation applies to a wide range of actors, including:
- Providers who develop and place AI systems on the market.
- Deployers (professional users) who use AI systems in their activities.
- Importers and distributors of AI systems within the EU.
Risk Classification of AI According to the AI Act
The AI Act classifies AI systems into four risk categories, each entailing different obligations; a short illustrative sketch follows the list below.
- Unacceptable risk: AI systems that pose a clear threat to fundamental rights are prohibited. This includes, for example, government social scoring or subliminal behavioral manipulation systems.
- High risk: AI systems used in critical sectors or sensitive applications (healthcare, recruitment, justice, critical infrastructure) fall under this category. These systems are subject to strict requirements before and after market placement, including risk management, data quality, technical documentation, human oversight, and cybersecurity.
- Limited risk: This category includes AI systems that pose a risk of manipulation or deception. The primary obligation is transparency. For example, users must be informed when interacting with a chatbot or when the content they view is a “deepfake.”
- Minimal or no risk: The vast majority of AI applications fall under this category (video games, spam filters, etc.). For these systems, the AI Act imposes no legal obligations but encourages the adoption of voluntary codes of conduct.
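To make the tiering logic concrete, here is a minimal, purely illustrative Python sketch. The use-case names and the conservative default tier are assumptions for demonstration; actual classification requires legal analysis against Article 5 and Annex III of the regulation.

```python
# Illustrative sketch only: a naive mapping of hypothetical use cases to the
# four AI Act risk tiers. Real classification requires legal analysis.
from enum import Enum

class AIRiskTier(Enum):
    UNACCEPTABLE = "unacceptable"  # prohibited (e.g., social scoring)
    HIGH = "high"                  # Annex III use cases (e.g., recruitment)
    LIMITED = "limited"            # transparency duties (e.g., chatbots)
    MINIMAL = "minimal"            # no specific obligations

# Hypothetical lookup table; the entries below echo the article's examples
# and are not an exhaustive or authoritative mapping.
USE_CASE_TIERS = {
    "social_scoring": AIRiskTier.UNACCEPTABLE,
    "recruitment_screening": AIRiskTier.HIGH,
    "customer_chatbot": AIRiskTier.LIMITED,
    "spam_filter": AIRiskTier.MINIMAL,
}

def classify(use_case: str) -> AIRiskTier:
    """Return the assumed risk tier, defaulting to HIGH pending legal review."""
    # Defaulting to HIGH is a deliberately conservative assumption.
    return USE_CASE_TIERS.get(use_case, AIRiskTier.HIGH)

if __name__ == "__main__":
    for uc in ("customer_chatbot", "recruitment_screening", "unknown_tool"):
        print(f"{uc}: {classify(uc).value}")
```

Treating unclassified systems as high risk until reviewed is a deliberately cautious design choice, not a requirement of the regulation.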
DORA: The 5 Pillars of Compliance
Compliance with the DORA regulation is structured around five fundamental pillars that shape digital operational resilience.
1. ICT Risk Management
This is the first pillar of DORA compliance. Financial entities must have a comprehensive, robust, and well-documented ICT risk management framework.
- ICT risk management framework: The organization must establish a governance and internal control framework that ensures effective management of ICT risks. This framework should be proportionate to its size and risk profile.
- Risk assessment: It is essential to continuously identify, classify, and assess all ICT risks. This assessment must be reviewed at least once a year.
- Security controls: Robust information security policies must be developed to protect the confidentiality, integrity, and availability of data, based on a risk-based approach.
- Continuous monitoring and risk mitigation: Prompt detection of anomalies and threats is crucial. The organization must implement continuous monitoring mechanisms to identify vulnerabilities and take preventive and corrective actions.
- Compliance documentation: All strategies, policies, procedures, and assessments related to ICT risk management must be formally documented and be ready for presentation to competent authorities.
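As a rough illustration of the annual review requirement in the bullet points above, the following sketch models a single risk-register entry. The field names and the 365-day rule of thumb are assumptions, not terms taken from DORA or its technical standards.

```python
# Minimal sketch of an ICT risk-register entry with an annual review check.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ICTRisk:
    asset: str            # e.g., "core banking platform"
    threat: str           # e.g., "ransomware"
    severity: str         # e.g., "critical" / "major" / "minor"
    last_assessed: date

    def review_overdue(self, today: date | None = None) -> bool:
        """Rule of thumb from the text: reassess at least once a year."""
        today = today or date.today()
        return today - self.last_assessed > timedelta(days=365)

risk = ICTRisk("core banking platform", "ransomware", "critical", date(2024, 3, 1))
print(risk.review_overdue(date(2025, 6, 1)))  # True: the assessment is stale
```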
2. ICT Incident Notification
The DORA regulation establishes a harmonized process for managing, classifying, and notifying major ICT incidents.
- Incident notification procedures: Companies must have a clear internal process to detect, handle, and report security incidents.
- Notification deadlines: In the event of a major incident, an initial notification must be sent to the competent authority within very short deadlines: under the accompanying technical standards, within 4 hours of classifying the incident as major and no later than 24 hours after the entity becomes aware of it. Intermediate and final reports are also required; a small deadline calculator is sketched after this list.
- Incident classification: DORA sets precise criteria to classify incidents based on their impact, which determines filing obligations.
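The timeline logic can be sketched as follows. The 4-hour and 24-hour windows reflect the deadlines discussed above; the 72-hour intermediate and one-month final windows are assumptions drawn from the reporting technical standards and should be verified against the final RTS before being relied on.

```python
# Sketch of a deadline tracker for major-incident reporting under DORA.
from datetime import datetime, timedelta

def reporting_deadlines(aware_at: datetime, classified_major_at: datetime) -> dict:
    # Initial notification: 4 h after classification as major, and in any
    # case no later than 24 h after becoming aware of the incident.
    initial = min(classified_major_at + timedelta(hours=4),
                  aware_at + timedelta(hours=24))
    intermediate = initial + timedelta(hours=72)   # assumed 72 h window
    final = intermediate + timedelta(days=30)      # approximating "one month"
    return {
        "initial_notification": initial,
        "intermediate_report": intermediate,
        "final_report": final,
    }

aware = datetime(2025, 5, 1, 9, 0)
classified = datetime(2025, 5, 1, 13, 30)
for report, due in reporting_deadlines(aware, classified).items():
    print(f"{report}: due {due:%Y-%m-%d %H:%M}")
```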
3. Digital Operational Resilience Testing
To comply with DORA, having policies is not enough; you must prove they work. Resilience testing is therefore a core obligation, and a simple scheduling sketch follows the list below.
- Types of resilience tests: Entities must conduct a comprehensive testing program, including vulnerability scans, code analyses, performance tests, and penetration testing.
- Testing frequency: Security and resilience tests must be performed at least annually on all critical systems and applications. The largest entities must carry out advanced threat-led penetration testing (TLPT) every three years.
- Test documentation: All tests, results, and remediation plans must be thoroughly documented.
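A simple scheduler, assuming fixed 365-day and three-year intervals, illustrates how an entity might track when each test is next due; the dates and test inventory are hypothetical.

```python
# Illustrative tracker for the testing cadence described above: annual tests
# on critical systems, TLPT every three years for the largest entities.
from datetime import date, timedelta

ANNUAL = timedelta(days=365)
TLPT_CYCLE = timedelta(days=3 * 365)

# Hypothetical test inventory: (last run, required interval).
tests = {
    "vulnerability scan": (date(2025, 1, 10), ANNUAL),
    "penetration test": (date(2024, 11, 2), ANNUAL),
    "TLPT": (date(2023, 6, 15), TLPT_CYCLE),
}

for name, (last_run, interval) in tests.items():
    print(f"{name}: next due {last_run + interval}")
```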
4. Third-Party ICT Service Provider Risk Management
DORA recognizes that the resilience of a financial institution strongly depends on the resilience of its technology partners; a contract-review sketch follows the list below.
- Third-party risk assessment: Before signing a contract with an ICT service provider, the financial entity must conduct an in-depth risk assessment. This third-party risk management must be integrated into the overall ICT risk management framework.
- Contracts and security clauses: Contracts with ICT service providers must include specific and robust clauses covering security, access, audit rights, and incident obligations. The financial entity remains fully responsible for DORA compliance.
- Exit strategies: Companies must develop exit plans for each critical provider to be able to switch suppliers without disrupting their essential operations.
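As an illustration, the following sketch checks a provider contract against a clause checklist. The clause names merely summarize the points above; they are not the complete set of contractual provisions DORA prescribes.

```python
# Sketch of a contract-review helper for ICT service providers.
REQUIRED_CLAUSES = {
    "security_requirements",
    "data_access_and_location",
    "audit_rights",
    "incident_notification_duties",
    "exit_strategy",
}

def missing_clauses(contract_clauses: set[str]) -> set[str]:
    """Return required clauses absent from a provider contract."""
    return REQUIRED_CLAUSES - contract_clauses

# Hypothetical cloud contract covering only two of the five points.
cloud_contract = {"security_requirements", "audit_rights"}
print(sorted(missing_clauses(cloud_contract)))
```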
5. Cyber Threat Information Sharing
The final pillar of DORA compliance encourages collaboration to strengthen collective security; a hypothetical example of a shared threat indicator follows the list below.
- Information-sharing mechanisms: DORA encourages financial entities to set up arrangements for voluntarily sharing cyber threat intelligence (indicators of compromise, attacker tactics, etc.) among themselves.
- Collaboration with competent authorities: Information sharing extends to cooperation with regulatory authorities to anticipate and better collectively respond to threats.
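As a purely hypothetical example of what a shared indicator record might look like, the sketch below builds a small structure loosely inspired by common formats such as STIX and the Traffic Light Protocol; none of these fields are mandated by DORA.

```python
# Hypothetical cyber threat intelligence record for a voluntary sharing
# arrangement. Field names are illustrative, loosely STIX-inspired.
import json
from datetime import datetime, timezone

indicator = {
    "type": "indicator",
    "shared_by": "example-bank-fr",  # hypothetical participant ID
    "created": datetime.now(timezone.utc).isoformat(),
    "pattern": "[ipv4-addr:value = '203.0.113.42']",  # documentation IP range
    "description": "C2 address observed in a phishing campaign",
    "tlp": "TLP:AMBER",  # Traffic Light Protocol marking
}

print(json.dumps(indicator, indent=2))
```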
AI Act: The Key Requirements
Compliance with the AI Act depends on the AI system's risk level, with the strictest obligations applying to high-risk systems.
AI Risk Assessment
One of the first steps to comply with the AI Act is determining whether your AI systems fall into the high-risk category. This requires a thorough analysis of their purpose and usage context against the cases listed in Annex III of the regulation. Companies must implement a risk management system throughout the AI lifecycle.
Transparency and Documentation Obligations
For high-risk systems, technical documentation is a fundamental requirement. It must be detailed enough to enable authorities to assess compliance. Additionally, transparency obligations apply to certain systems, such as those interacting with humans, which must clearly disclose their artificial nature, as in the sketch below. Traceability of data and decision-making processes is also a major requirement.
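A minimal sketch of this disclosure duty for a chatbot might look as follows; the notice wording and function names are illustrative, as the AI Act does not prescribe specific text.

```python
# Minimal sketch of the limited-risk transparency duty: a chatbot that
# discloses its artificial nature at the start of a conversation.
DISCLOSURE = "You are chatting with an AI assistant, not a human agent."

def chatbot_reply(user_message: str, first_turn: bool) -> str:
    reply = f"Echo: {user_message}"  # placeholder for a real model call
    return f"{DISCLOSURE}\n{reply}" if first_turn else reply

print(chatbot_reply("Hello, is this a person?", first_turn=True))
```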
Compliance with Ethics and Data Protection Principles
The AI Act emphasizes respect for fundamental rights, including non-discrimination, privacy, and personal data protection. High-risk systems must be designed to minimize bias and ensure effective human oversight. The quality and governance of the datasets used to train AI models are essential control points for ensuring compliance.
Points of Convergence Between DORA and the AI Act
Although DORA and the AI Act are two separate regulations, they share common philosophies and requirements, notably in their approach to risk management and security.
Risk Management
Both texts place risk management at the core of their framework. DORA requires a robust ICT risk management system, while the AI Act imposes a risk-based approach for AI. An organization establishing a solid risk management process for DORA will already have a strong foundation for meeting AI Act requirements, and vice versa.
Data Security
Data security is another major point of convergence. DORA aims to protect data integrity and availability in the financial sector. The AI Act, in turn, imposes strict governance requirements on the data used for high-risk AI systems to ensure its quality and relevance. Data protection is a cornerstone of both regulations.
Transparency and Accountability
Finally, transparency and accountability are fundamental shared principles. DORA requires clear documentation and notification of incidents, while the AI Act imposes transparency obligations regarding AI systems' operation along with comprehensive technical documentation. Both regulations aim to ensure companies are responsible for the technology they deploy.
DORA and AI Act Compliance Checklist
To approach compliance in a structured way, here is a checklist of key actions.
Compliance Gap Analysis
The first step is to carry out a gap analysis to assess your current posture against the requirements of DORA and the AI Act; a toy comparison is sketched after the list below.
- Map your assets: Identify all your business processes, supporting information systems, and ICT service providers.
- Identify AI systems: List all AI systems you use or develop and classify them by the AI Act risk levels.
- Assess existing controls: Compare your current security, risk management, and governance policies with the requirements of both regulations.
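A toy version of this comparison can be expressed as set differences between the controls you already have and the requirements you must meet; the requirement names below are illustrative shorthand, not official article references.

```python
# Minimal gap-analysis sketch: requirements minus existing controls = gaps.
DORA_REQUIREMENTS = {"ict_risk_framework", "incident_reporting",
                     "resilience_testing", "third_party_register",
                     "threat_intel_sharing"}
AI_ACT_REQUIREMENTS = {"ai_system_inventory", "risk_classification",
                       "technical_documentation", "human_oversight"}

# Hypothetical current posture.
existing_controls = {"ict_risk_framework", "incident_reporting",
                     "ai_system_inventory"}

for name, required in (("DORA", DORA_REQUIREMENTS),
                       ("AI Act", AI_ACT_REQUIREMENTS)):
    gaps = sorted(required - existing_controls)
    print(f"{name} gaps: {gaps or 'none'}")
```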
Action Plan for Compliance
Based on the gap analysis, develop a detailed action plan; a small prioritization sketch follows the list below.
- Prioritize actions: Focus first on higher-risk areas and the most critical obligations.
- Assign responsibilities: Designate clear owners for each compliance task.
- Set a timeline: Establish realistic deadlines for implementing corrective measures for DORA and according to the AI Act’s phased schedule.
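As a sketch of the prioritization step, the snippet below orders remediation tasks by risk level and then by deadline; the owners, dates, and tasks are invented for illustration.

```python
# Simple prioritization sketch: sort actions by risk, then by due date.
from datetime import date

RISK_ORDER = {"high": 0, "medium": 1, "low": 2}

actions = [
    {"task": "Remediate cloud provider contract", "risk": "high",
     "owner": "Procurement", "due": date(2025, 9, 30)},
    {"task": "Classify chatbot under the AI Act", "risk": "medium",
     "owner": "Legal", "due": date(2026, 8, 2)},
    {"task": "Adopt voluntary code of conduct", "risk": "low",
     "owner": "Compliance", "due": date(2026, 12, 31)},
]

for a in sorted(actions, key=lambda a: (RISK_ORDER[a["risk"]], a["due"])):
    print(f'{a["due"]} {a["risk"]:>6} {a["owner"]:<12} {a["task"]}')
```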
Useful Resources and Tools
Compliance is a complex project requiring adequate resources.
- Internal expertise: Ensure your teams have the necessary skills in cybersecurity, legal matters, and project management.
- External experts: Don’t hesitate to call on consultants or legal experts to assist with this process.
- Technological solutions: Compliance management, security monitoring, and risk analysis tools can greatly facilitate your teams’ work.
Sanctions for Non-Compliance
Both regulations foresee severe sanctions to encourage compliance.
Financial Penalties for Non-Compliance with DORA
DORA allows Member States to define exact penalties, but they must be "effective, proportionate, and dissuasive." Sanctions may include significant fines; some sources indicate fines for non-compliance could reach up to €10 million or 2% of the entity's worldwide annual turnover. For critical ICT third-party providers, the Lead Overseer may impose periodic penalty payments of up to 1% of average daily worldwide turnover, applied daily for up to six months, until compliance is achieved.
Penalties for Non-Compliance with the AI Act
The AI Act provides for very heavy financial penalties, often compared to those under the GDPR. Fines vary according to the severity of the violation, with the higher of a fixed amount or a percentage of worldwide annual turnover applying (a worked example follows the list):
- For the use of prohibited AI: up to €35 million or 7% of worldwide annual turnover.
- For failure to meet obligations related to high-risk systems: up to €15 million or 3% of worldwide annual turnover.
- For providing false information: up to €7.5 million or 1% of worldwide annual turnover.
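A quick worked example shows how the "whichever is higher" rule scales with company size; the €2 billion turnover is hypothetical, and note that for SMEs the lower of the two amounts applies instead.

```python
# Worked example of the AI Act fine ceilings listed above: the higher of a
# fixed cap or a percentage of worldwide annual turnover (general case).
def max_fine(fixed_cap_eur: float, turnover_pct: float,
             annual_turnover_eur: float) -> float:
    return max(fixed_cap_eur, turnover_pct * annual_turnover_eur)

turnover = 2_000_000_000  # hypothetical €2bn worldwide annual turnover
print(f"Prohibited AI:      €{max_fine(35_000_000, 0.07, turnover):,.0f}")  # €140,000,000
print(f"High-risk breaches: €{max_fine(15_000_000, 0.03, turnover):,.0f}")  # €60,000,000
print(f"False information:  €{max_fine(7_500_000, 0.01, turnover):,.0f}")   # €20,000,000
```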
Additional Resources
Links to DORA and AI Act Regulatory Texts
- Regulation (EU) 2022/2554 (DORA): The full text can be accessed through the official publication portals of the European Union.
- AI Regulation (AI Act): The final text is also available via official EU sources.
Guides and Recommendations from Competent Authorities
European supervisory authorities (EBA, ESMA, EIOPA) and national agencies like France's ACPR regularly publish guides and technical standards to assist in interpreting and applying DORA. Similarly, the European AI Office will provide guidelines for the AI Act.
Examples of Best Practices
Many consulting firms, technology companies, and industry associations publish white papers, guides, and case studies on best practices for complying with DORA and the AI Act. These resources offer concrete examples and practical solutions for your organization.
Conclusion: Preparing Your Organization for DORA and AI Act Compliance
The entry into force of DORA and the AI Act marks a major regulatory milestone for many organizations. Compliance is not just a legal obligation; it is an opportunity to strengthen your organization's resilience, improve risk management, and build lasting trust with clients and partners.
By adopting a proactive approach, starting your assessment now, and developing a structured action plan, you can turn this dual requirement into a genuine strategic advantage. The key to success lies in meticulous preparation, appropriate resource allocation, and ongoing commitment to digital operational resilience and ethical artificial intelligence.