Governance and the AI Act: How to reconcile legal constraints, effective governance, and innovation?
The AI Act, which came into force in August 2024, is not limited to technical or legal requirements: it demands a genuine transformation of how organisations are governed. Artificial intelligence systems classified as high-risk will need to be governed by supervision, documentation, and data quality mechanisms.
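To make the idea of data quality mechanisms concrete, here is a minimal Python sketch of the kind of automated check such governance might rely on. The fields, checks, and 5% threshold below are illustrative assumptions, not values prescribed by the AI Act.

```python
# Illustrative sketch only: a simple automated data quality check of the kind
# a team might run on training data for a high-risk AI system. The fields,
# checks, and threshold are assumptions, not requirements from the AI Act.
def check_data_quality(records: list[dict], required_fields: list[str]) -> dict:
    """Return the missing-value rate for each required field."""
    total = len(records)
    report = {}
    for field in required_fields:
        missing = sum(1 for r in records if r.get(field) in (None, ""))
        report[field] = missing / total if total else 1.0
    return report

# Usage: flag fields whose missing-value rate exceeds an assumed 5% threshold.
dataset = [
    {"age": 34, "income": 52000, "postcode": "75001"},
    {"age": None, "income": 61000, "postcode": ""},
]
report = check_data_quality(dataset, ["age", "income", "postcode"])
flagged = {f: rate for f, rate in report.items() if rate > 0.05}
print(flagged)  # {'age': 0.5, 'postcode': 0.5}
```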
Yet compliance cannot rest on a single department. Legal, IT, data, business units, and innovation departments must collaborate to build transversal governance. Only under this condition can the AI Act be respected effectively, without hindering innovation.
The CNIL reminds us that the regulation does not create an isolated framework but aligns with existing texts such as the GDPR. This means that the processing of personal data within AI systems remains subject to the GDPR, in addition to the new obligations of the AI Act. An impact assessment under the AI Act may also be coordinated with the data protection impact assessment provided for by the GDPR.
If governance is fragmented, the risk is twofold: a legal reading without technical vision, or conversely, a technical implementation without regard for fundamental rights. In both cases, the organisation exposes itself to regulatory blind spots.
In many companies, AI initiatives originate with the innovation or IT department, while compliance is handled by legal teams and risk managers. This siloed approach quickly shows its limits.
The result: friction, delays, and sometimes suspended projects. The AI Act requires rethinking this organisation so that compliance does not become a brake on innovation.
Successful governance rests on four complementary pillars. When it comes to structuring it, three main models stand out:
| Governance model | Description | Advantage | Limitation |
|---|---|---|---|
| Centralised | A dedicated team manages compliance for all AI projects. | Consistency. | Risk of bureaucratic overhead. |
| Federated | Each business entity or subsidiary has its own AI compliance representatives, with central coordination. | Close to actual use cases. | Risk of heterogeneity. |
| Integrated into DevOps/MLOps | AI Act requirements are built directly into model development and deployment processes (see the sketch below). | Operational efficiency. | Requires technical maturity. |
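To illustrate the integrated DevOps/MLOps model, here is a minimal, hypothetical sketch of a pre-deployment compliance gate. The `ComplianceManifest` fields and the gating logic are assumptions for illustration, not terms or checks defined by the AI Act.

```python
# Illustrative sketch only: a pre-deployment "compliance gate" of the kind the
# integrated DevOps/MLOps model could add to a CI/CD pipeline. The manifest
# fields and checks are hypothetical placeholders, not terms from the AI Act.
from dataclasses import dataclass

@dataclass
class ComplianceManifest:
    risk_classification: str          # e.g. "high-risk", per the team's own triage
    technical_documentation_url: str  # link to the system's documentation
    data_quality_report_url: str      # evidence that data quality checks were run
    human_oversight_owner: str        # named owner responsible for supervision
    dpia_reference: str = ""          # cross-reference to the GDPR impact assessment

def compliance_gate(manifest: ComplianceManifest) -> list[str]:
    """Return blocking issues; an empty list lets the deployment proceed."""
    issues = []
    if manifest.risk_classification == "high-risk":
        if not manifest.technical_documentation_url:
            issues.append("missing technical documentation")
        if not manifest.data_quality_report_url:
            issues.append("missing data quality report")
        if not manifest.human_oversight_owner:
            issues.append("no human oversight owner designated")
        if not manifest.dpia_reference:
            issues.append("no GDPR impact assessment cross-referenced")
    return issues

# Usage: a CI step could fail the build whenever the gate reports issues.
manifest = ComplianceManifest(
    risk_classification="high-risk",
    technical_documentation_url="https://docs.example.com/model-v2",
    data_quality_report_url="",
    human_oversight_owner="Risk Management team",
)
for issue in compliance_gate(manifest):
    print(f"BLOCKED: {issue}")
```

The value of this model is that compliance failures surface at the same point in the workflow as failing tests, rather than in a separate legal review weeks later.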
Beyond compliance, well-thought-out governance brings tangible benefits.
As the ACPR points out, the AI Act should not be seen merely as a compliance text, but as an opportunity to strengthen the sector's resilience.
Anticipating implementation is essential: obligations for high-risk systems take effect as early as August 2026, and organisations can start preparing now.
The AI Act introduces a new way of working, in which compliance becomes everyone's responsibility. By connecting legal, data, and business functions, organisations are not just meeting regulatory requirements: they are equipping themselves with a robust framework for developing trustworthy and sustainable AI.
Building good governance today is about transforming compliance into a lever for innovation.
Why is governance essential under the AI Act?
Because compliance is not the responsibility of a single department. The regulation imposes technical, legal, and organisational obligations, and only cross-functional governance can cover all of these dimensions.
The following roles should be involved in governance:
Lawyers and risk managers, IT and CDOs, business functions and HR, as well as innovation teams. Each brings indispensable expertise.
Which governance model to choose?
This depends on the size and culture of the organisation. The three main options are: centralised, federated, or integrated into DevOps/MLOps.