
GDPR, DORA, and AI Act: Understanding the Connections and Ensuring Compliance

September 29, 2025
Emmanuel Adjanohun
Co-founder

The European digital ecosystem is governed by a complex regulatory framework. Three major texts – GDPR, DORA, and the AI Act – are reshaping data management, cybersecurity, and artificial intelligence (AI). Though distinct, these regulations weave an interconnected set of requirements, creating compliance challenges but also strategic opportunities.
This article aims to demystify the links between these three legislative pillars. Understanding their interactions enables companies to build a comprehensive approach, turning regulatory constraints into a lever for trust, security, and responsible innovation. The goal is to build a digital future where AI develops within an ethical framework, financial resilience is guaranteed, and personal data is rigorously protected.

Introduction: a complex regulatory framework

The European regulatory landscape protects citizens and stabilizes the single market. The GDPR laid the foundations for data protection. More recently, DORA and the AI Act have supplemented this framework to address the challenges of financial digitization and the rise of artificial intelligence.
Understanding this triptych is a strategic necessity for any organization processing personal data, operating in the financial sector, or using AI systems. This article provides a roadmap to navigate this environment, highlighting the synergies between GDPR, DORA, and the AI Act. The objective is to move from reactive compliance to an integrated governance strategy that creates value.

GDPR: personal data protection

Effective since 2018, the General Data Protection Regulation (GDPR) transformed how organizations handle personal data of EU residents. It aims to strengthen individual rights and harmonize data protection laws across Europe.
The GDPR forms the cornerstone of privacy protection, a fundamental right that other regulations reinforce. This European regulation applies uniformly throughout the Union.

Key GDPR principles (data minimization, purpose limitation, etc.)

The GDPR is built around seven core principles for any personal data processing:

  1. Lawfulness, fairness, and transparency: Data processing must be legal, fair, and transparent to the individual.
  2. Purpose limitation: Data must be collected for specific, explicit, and legitimate purposes.
  3. Data minimization: Only data strictly necessary for the purpose should be collected.
  4. Accuracy: Personal data must be accurate and kept up to date.
  5. Storage limitation: Data should only be kept as long as necessary.
  6. Integrity and confidentiality: Appropriate security measures must protect the data.
  7. Accountability: The data controller must be able to demonstrate compliance.

These principles are the foundation of data protection in Europe.
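
To make two of these principles tangible, here is a minimal Python sketch treating data minimization and storage limitation as automated checks. The field names, the allowed-field list, and the three-year retention period are illustrative assumptions, not values prescribed by the GDPR.

```python
from datetime import datetime, timedelta

# Hypothetical retention policy and field whitelist -- illustrative values,
# not figures taken from the GDPR itself.
RETENTION = timedelta(days=3 * 365)                        # storage limitation
ALLOWED_FIELDS = {"customer_id", "email", "signup_date"}   # data minimization

def minimize(record: dict) -> dict:
    """Keep only the fields strictly needed for the declared purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def is_expired(record: dict, now: datetime) -> bool:
    """Flag records that exceed the retention period and should be erased."""
    return now - record["signup_date"] > RETENTION

record = {
    "customer_id": "C-001",
    "email": "jane@example.com",
    "signup_date": datetime(2020, 1, 15),
    "birthplace": "Lyon",          # not needed for the purpose -> dropped
}

now = datetime(2025, 9, 29)
print(minimize(record))            # birthplace removed (minimization)
print(is_expired(record, now))     # True -> schedule erasure (storage limitation)
```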

Application of GDPR to AI data processing


Whenever an AI system processes personal data, it falls under the GDPR, which raises specific issues. For example, the data minimization principle may appear to clash with the need to train AI models on large datasets. However, the GDPR requires rigorous selection so that only relevant and necessary data are used. Likewise, transparency is crucial: individuals must be informed about how an AI system uses their data.
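
As a minimal sketch of what that selection can look like in practice, assume a hypothetical credit-scoring use case: direct identifiers are dropped or pseudonymized before training, and only features genuinely necessary for the declared purpose are kept. The field names and the salting scheme are illustrative assumptions.

```python
import hashlib

def pseudonymize(value: str, salt: str = "rotate-me") -> str:
    """Replace a direct identifier with a salted hash so the training set no
    longer exposes it in clear text (pseudonymization, not anonymization)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

# Hypothetical raw record collected for a credit-scoring model.
raw = {"name": "Jane Doe", "email": "jane@example.com",
       "income": 42_000, "late_payments": 1, "favorite_color": "blue"}

# Data minimization: keep only features that are genuinely necessary.
FEATURES = {"income", "late_payments"}
training_row = {k: v for k, v in raw.items() if k in FEATURES}

# Keep a pseudonymous key only if re-linking is truly required (e.g. to honour
# a later erasure request); store and rotate the salt separately.
training_row["subject_key"] = pseudonymize(raw["email"])
print(training_row)
```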

DORA: digital operational resilience

The Digital Operational Resilience Act (DORA), effective since January 17, 2025, aims to strengthen cybersecurity and operational resilience in the financial sector against ICT-related risks. It ensures the stability and continuity of European financial services.

This EU regulation establishes a harmonized framework for ICT risk management, incident handling, resilience testing, and monitoring of critical suppliers. It applies to banks, insurers, investment firms, and their technology providers.

Security obligations for financial entities

DORA imposes stringent requirements to ensure financial entities can withstand and recover from ICT-related threats. The main requirements include:

  • Robust ICT risk management framework: Establish governance and internal controls for effective ICT risk management.
  • Incident management and notification: Implement processes to detect, manage, and report major incidents to authorities (a sketch of such a workflow follows this list).
  • Digital operational resilience testing: Conduct regular security testing, including advanced penetration tests for critical entities.
  • Third-party risk management: Actively monitor risks posed by ICT service providers and ensure contracts meet strict requirements.
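
As an illustration of the incident management and notification requirement, the sketch below classifies a hypothetical ICT incident and derives an internal reporting deadline. The classification thresholds and the 24-hour figure are placeholders: the real criteria and deadlines come from DORA's regulatory technical standards and the entity's own incident policy.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class IctIncident:
    detected_at: datetime
    clients_affected: int
    downtime_minutes: int
    data_breach: bool   # also triggers GDPR notification duties in parallel

def is_major(incident: IctIncident) -> bool:
    """Toy classification rule -- real criteria come from the applicable
    technical standards and the entity's incident policy."""
    return (incident.data_breach
            or incident.clients_affected > 1_000
            or incident.downtime_minutes > 120)

def notification_deadline(incident: IctIncident) -> datetime | None:
    """Placeholder 24-hour internal deadline for reporting a major incident."""
    return incident.detected_at + timedelta(hours=24) if is_major(incident) else None

incident = IctIncident(datetime(2025, 3, 1, 9, 30), clients_affected=5_000,
                       downtime_minutes=45, data_breach=False)
print(is_major(incident), notification_deadline(incident))
```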

Impact of DORA on risk and security incident management

DORA encourages financial institutions to adopt a more structured and proactive approach to cybersecurity and risk management. It places digital resilience at the heart of corporate strategy and increases management accountability. By enforcing end-to-end risk management, DORA requires a comprehensive and integrated view of security.

AI Act: artificial intelligence regulation

Adopted in 2024, the AI Act is the world’s first regulatory framework for artificial intelligence. It balances innovation with the protection of fundamental rights using a risk-based approach, classifying AI systems into four categories: unacceptable risk (prohibited), high risk, limited risk, and minimal risk.
The AI Act governs the placing on the market and putting into service of AI systems within the EU. It entered into force in August 2024, and its obligations apply in phases over the following years.

Definition of high-risk AI systems

The AI Act focuses on regulating high-risk AI systems. A system is classified as high risk if it is a safety component of a regulated product or operates in critical sectors listed in Annex III (e.g., critical infrastructure management, employment, access to essential services, justice).

Specific obligations for high-risk AI systems (risk assessment, transparency, etc.)

High-risk AI systems are subject to strict requirements before market entry:

  • Risk management system: Implement a continuous risk management process.
  • Governance and data quality: Use high-quality, relevant training data sets and take measures to detect and mitigate bias.
  • Technical documentation and logging: Produce detailed documentation and ensure traceability through logs (a minimal logging sketch follows this list).
  • Transparency and information: Design systems to be transparent so users can interpret results.
  • Human oversight: Enable adequate human supervision to prevent or minimize risks.
  • Accuracy, robustness, and cybersecurity: Ensure appropriate levels of performance and security.
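
The documentation and logging requirement can be approached with a simple audit trail. The sketch below is a minimal, hypothetical example: for each decision of a high-risk system it records the model version, a hash of the input (to avoid duplicating personal data in the logs), the output, and whether a human reviewed it. The field names and log format are assumptions, not a format mandated by the AI Act.

```python
import hashlib
import json
from datetime import datetime, timezone

def log_inference(model_version: str, features: dict, prediction: str,
                  reviewed_by_human: bool, logfile: str = "ai_audit.log") -> None:
    """Append one traceability record per decision: which model ran, on what
    input (hashed), with what outcome, and whether a human reviewed it."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "input_hash": hashlib.sha256(
            json.dumps(features, sort_keys=True).encode()).hexdigest(),
        "prediction": prediction,
        "reviewed_by_human": reviewed_by_human,
    }
    with open(logfile, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

log_inference("credit-scorer-1.4.2", {"income": 42_000, "late_payments": 1},
              prediction="approve", reviewed_by_human=True)
```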

Interactions between GDPR, DORA, and AI Act

These three regulations are not silos. They form a coherent framework aimed at building a safe European digital space. They share objectives of protection and technological risk management, making an integrated compliance approach essential.

GDPR and AI Act: an essential complementarity

The AI Act and GDPR are intrinsically linked whenever an AI system processes personal data. The AI Act supplements the GDPR without replacing it.

Protection of personal data in AI system development and use

An AI system compliant with the AI Act must also comply with the GDPR. The AI Act’s data governance requirements reinforce GDPR principles like accuracy and minimization. The human oversight required by the AI Act echoes the GDPR right not to be subject to a decision based solely on automated processing.

Consent and impact assessments

Both regulations promote a risk-based approach. The GDPR requires a Data Protection Impact Assessment (DPIA) for high-risk processing, while the AI Act mandates conformity assessments for high-risk systems. The documentation produced for the AI Act assessment provides much of the input needed for the GDPR DPIA.
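
As a rough illustration of this reuse, the hypothetical structure below shows how elements documented for an AI Act conformity assessment could seed the corresponding sections of a GDPR DPIA. The field names and mapping are purely illustrative; neither regulation prescribes this format.

```python
# Illustrative only: neither regulation prescribes these exact field names.
ai_act_documentation = {
    "intended_purpose": "Automated triage of loan applications",
    "risk_analysis": "Risk of discriminatory outcomes for protected groups",
    "human_oversight": "Credit officer reviews every rejection",
    "data_sources": ["internal CRM", "credit bureau feed"],
}

# The same information seeds the GDPR DPIA instead of being rewritten from scratch.
dpia = {
    "processing_description": ai_act_documentation["intended_purpose"],
    "necessity_and_proportionality": "To be assessed against the declared purpose",
    "risks_to_data_subjects": ai_act_documentation["risk_analysis"],
    "mitigation_measures": ai_act_documentation["human_oversight"],
}
print(dpia)
```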

Handling sensitive data and data minimization


The GDPR imposes strict rules on processing sensitive data. The AI Act strengthens this protection by often classifying systems using such data as high risk. The GDPR’s minimization principle—a challenge for AI—is complemented by the AI Act’s requirement for high-quality data, leading to better data governance.

DORA and GDPR: a complementary approach

DORA can be seen as a sector-specific and detailed application of the GDPR’s general security requirements.

Information system security and data protection

Article 32 of the GDPR requires appropriate security measures. DORA specifies this obligation for the financial sector by detailing an ICT risk management framework. Compliance with DORA helps directly demonstrate compliance with GDPR Article 32.
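
In practice, many organizations maintain a control mapping so that a single security measure serves as evidence under both texts. The sketch below is an illustrative example: the GDPR references point to Article 32(1), while the DORA column uses the regulation's broad pillars rather than precise article numbers.

```python
# Hypothetical control catalogue: one technical measure, two regulatory anchors.
CONTROL_MAPPING = [
    {"control": "Encryption of personal data at rest",
     "gdpr": "Art. 32(1)(a)", "dora": "ICT risk management framework"},
    {"control": "Regular restoration tests of backups",
     "gdpr": "Art. 32(1)(c)", "dora": "Digital operational resilience testing"},
    {"control": "Continuous security monitoring and alerting",
     "gdpr": "Art. 32(1)(b)", "dora": "ICT incident detection and management"},
]

def controls_for(framework: str) -> list[tuple[str, str]]:
    """Return (control, reference) pairs usable as evidence under one framework."""
    return [(c["control"], c[framework]) for c in CONTROL_MAPPING]

print(controls_for("gdpr"))   # evidence for GDPR Article 32
print(controls_for("dora"))   # the same controls, mapped to DORA pillars
```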

Security risk management and privacy compliance

Both texts focus on risk management. The security risk analysis of a system (DORA) must include the risk of personal data breaches (GDPR) and their impact. Incident management must coordinate notifications required by both DORA and the GDPR.

DORA and AI Act: convergence on security

For a financial entity using high-risk AI, both regulations apply, creating dual security requirements.

AI system security and digital operational resilience

The AI Act requires high-risk AI systems to be robust and secure. DORA requires that the entire digital infrastructure be resilient. The security of an AI system (AI Act) depends on the security of its environment (DORA).

Developing a holistic security approach for financial institutions using AI

Financial institutions must adopt a holistic approach. Risk assessments of AI systems (AI Act) should integrate into DORA’s ICT risk management framework. Monitoring an AI provider must cover requirements from both the AI Act and DORA.

Challenges and opportunities for businesses

Navigating this regulatory framework is challenging but also an opportunity to strengthen trust and security.

Complying with the three regulations

The key is to avoid siloed approaches. Compliance requires an overall vision and close internal collaboration.

Developing a comprehensive compliance strategy

It is essential to map applicable obligations and develop a unified strategy integrating requirements from each regulation. The data governance framework, for example, can address both GDPR and AI Act demands.

Adapting processes and information systems

Compliance requires adapting processes and IT systems. It must be embedded “by design” from the earliest stages of new products and services that involve AI.

Staff training

Training staff is a fundamental pillar. All employees must understand their role. Executives must be aware of their heightened responsibilities, especially under DORA.

Enhancing cybersecurity and data protection

Compliance is an opportunity to drastically improve a company’s cybersecurity and data protection posture.

Building customer and partner trust

A company demonstrating commitment to security and ethics builds invaluable trust capital, becoming a competitive advantage.

Proactive risk and incident management

A structured risk management approach shifts posture from reactive to proactive, reducing the likelihood and impact of security incidents.

Resources and practical tools

Many resources and tools assist companies in their compliance efforts.

Official regulatory texts

It is crucial to refer to the source texts to understand the requirements: the GDPR (EU 2016/679), DORA (EU 2022/2554), and the AI Act (EU 2024/1689).

Best practice examples and use cases

Regulatory authorities (such as the CNIL) publish guides and best practices. Adopting a “privacy by design” and “security by design” approach is fundamental.

Tools and technologies to facilitate compliance

The market offers a range of tools to support compliance:

  • Governance, Risk, and Compliance (GRC) software: to centralize obligation tracking and risk management.
  • Data mapping tools: to maintain the GDPR processing register (a minimal register entry is sketched below).
  • AI governance platforms: to document models, test biases, and manage AI system lifecycles.
  • Cybersecurity solutions: for threat detection and incident response (a pillar of DORA).
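
As a simple example of the second category, the sketch below models one entry of a GDPR record of processing activities (Article 30), with a flag to cross-check whether the processing also falls within the AI Act’s scope. The fields shown are a reduced, illustrative subset.

```python
from dataclasses import asdict, dataclass, field

@dataclass
class ProcessingActivity:
    """One line of a GDPR record of processing activities (Article 30),
    reduced to a few illustrative fields."""
    name: str
    purpose: str
    legal_basis: str
    data_categories: list[str]
    retention: str
    recipients: list[str] = field(default_factory=list)
    uses_ai_system: bool = False   # handy flag to cross-check AI Act scope

register = [
    ProcessingActivity(
        name="Loan application scoring",
        purpose="Assess creditworthiness of applicants",
        legal_basis="Contract (GDPR Art. 6(1)(b))",
        data_categories=["identity", "financial history"],
        retention="5 years after contract end",
        recipients=["credit bureau"],
        uses_ai_system=True,
    ),
]
print([asdict(a) for a in register])
```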

Conclusion: towards responsible and secure AI

The articulation between GDPR, DORA, and the AI Act outlines a European digital future where innovation goes hand in hand with security and ethics.

Importance of a comprehensive compliance approach

The era of siloed compliance is over. Businesses must adopt a holistic approach, integrating data protection and security principles from project inception, transforming regulatory obligation into strategic advantage.

Need for collaboration between public and private actors

The complexity of these technologies requires ongoing dialogue among legislators, regulators, and companies. This collaboration is essential to create a framework that is both protective and conducive to innovation.

Outlook for regulatory framework evolution

2025 is a pivotal year, with DORA now fully applicable and the AI Act’s obligations being phased in. Supervisory authorities will intensify their actions. Companies anticipating these milestones with an integrated compliance strategy will be best positioned to thrive, making trust and security cornerstones of their development.

Would you like more information about our services?