News
27/01/2026

The European Commission Opens Proceedings Against Grok: A Turning Point for AI Accountability

On 26 January 2026, the European Commission formally opened proceedings against Grok, the generative AI system developed by xAI and integrated into the X platform. The investigation, launched under the Digital Services Act (DSA), marks a significant step in the EU’s strategy to impose concrete accountability on large-scale AI systems whose deployment may generate systemic risks.

Beyond the specific case of Grok, the decision reflects a broader regulatory shift: the transition from reactive content moderation to proactive governance of artificial intelligence.

The Facts: Uncontrolled Generation of Sexualised Content

The Commission’s action follows the identification of serious deficiencies in the operation of Grok’s image-generation features.

According to the information made public, the tool was capable of producing:

  • sexually explicit images generated from minimal prompts,
  • non-consensual sexualised representations,
  • and, in certain configurations, content potentially involving minors or simulating such situations.

These outputs were reportedly accessible through the X platform without sufficient technical safeguards, moderation layers, or effective content filtering mechanisms.

What is at issue is not a marginal malfunction, but the structural ability of the system to generate unlawful or harmful content under foreseeable conditions of use.

The Legal Basis: Systemic Obligations Under the Digital Services Act

The Commission’s investigation is grounded in the core obligations imposed by the DSA on very large online platforms and services:

  • the duty to identify and assess systemic risks arising from their services,
  • the obligation to implement proportionate and effective mitigation measures,
  • and the requirement to protect users, particularly vulnerable groups, from foreseeable harm.

In the Commission’s view, Grok was deployed without adequate risk assessment and without safeguards commensurate with the nature of the content it was capable of producing. This, in itself, constitutes a potential breach of the DSA, irrespective of whether individual outputs are later removed.

The legal reasoning is clear: responsibility arises at the level of design and governance, not merely at the level of moderation.

A Structural Shift in European Digital Regulation

What makes this case particularly significant is the regulatory philosophy it reflects.

The Commission is no longer focusing solely on isolated illegal content. Instead, it is examining:

  • the architecture of the system,
  • the foreseeability of misuse,
  • and the adequacy of internal risk-management mechanisms.

This approach aligns closely with the logic of the forthcoming AI Act: AI systems, especially those deployed at scale, must be conceived with built-in safeguards, traceability and accountability.

In other words, technical power now carries a legal duty of anticipation.

Conclusion

The Grok investigation marks a decisive moment in European digital regulation.

It signals that generative AI systems will be assessed not only on what they do, but on how responsibly they are designed.

For developers and platforms alike, the message is unambiguous:

innovation remains welcome, but only where it is accompanied by governance, foresight and control.

In the European Union, artificial intelligence is no longer judged solely by its performance, but by its capacity to respect the legal and ethical boundaries of the digital public space.

Vincent FAUCHOUX