


On 3 February 2026, the Paris Public Prosecutor’s Office carried out a judicial raid on the French offices of the social media platform X. In the same proceedings, Elon Musk was summoned for a voluntary interview (audition libre), in his capacity as a de jure or de facto executive of the platform.
These procedural steps form part of a criminal investigation opened in January 2025 at the initiative of the French prosecution authorities, which is still ongoing.
The core issue is immediate and clear: French prosecutors are seeking to determine whether X, through its technical, algorithmic and organisational design, enabled, facilitated or failed to prevent the dissemination of criminally unlawful content, and whether such failures may give rise to criminal liability for the company and its senior executives.
On 3 February 2026, specialised cybercrime investigators, acting under the authority of the Paris Public Prosecutor, searched X’s French premises.
The purpose of the raid was to secure and seize evidence, in particular material relating to the platform's technical, algorithmic and organisational design referred to above.
Following the raid, Elon Musk was summoned for a voluntary interview, along with other current or former senior executives of the platform.
Under French criminal procedure, a voluntary interview allows investigators to question an individual without coercive measures, while still addressing facts that may potentially give rise to criminal charges.
The investigation covers several distinct factual areas, all connected to the operation and governance of the X platform.
The investigation was initially triggered by reports challenging the functioning of X's algorithms, suspected of facilitating the dissemination of criminally unlawful content.
French prosecutors are examining whether there was a faulty or negligent use of automated data-processing systems, where such systems may have directly contributed to the commission or propagation of criminal offences.
The scope of the investigation was subsequently expanded to include several categories of serious unlawful content, each of which is criminally sanctioned under French law.
Investigators are examining whether the platform allowed the storage or dissemination of child sexual abuse material, in breach of the strict provisions of the French Criminal Code governing such material.
Another focus concerns the creation and dissemination of synthetic pornographic content, in particular non-consensual sexual deepfakes, sometimes generated or amplified through AI-based tools.
Such content may constitute criminal offences under French law, in particular where the person depicted has not consented.
The investigation also addresses the circulation of content denying or contesting crimes against humanity, an offence specifically criminalised under French law.
Here again, prosecutors are assessing whether X’s moderation mechanisms were inadequate or insufficiently enforced.
The Grok generative AI system, developed by xAI and deployed on the X platform, is a central element of the investigation.
Prosecutors are seeking to determine whether this tool generated or contributed to the dissemination of criminally prohibited content without appropriate safeguards.
The underlying legal issue is that of criminal liability arising from the deployment of an AI system whose outputs may include criminally prohibited content.
Elon Musk’s summons reflects an inquiry into executive-level responsibility.
The investigation focuses in particular on the decisions, omissions and organisational choices made at executive level in relation to the platform's design, moderation and governance.
Under French criminal law, an executive may incur personal liability where offences result from decisions, omissions or organisational failures attributable to their authority or control.
At this stage, the proceedings remain at the preliminary investigation stage: no judicial investigation has been opened and no charges have been brought. Possible outcomes range from dismissal of the case to the opening of a judicial investigation, or criminal prosecutions against X and/or individual executives.
This case should be followed closely, as the procedural developments in this matter could mark a significant turning point in the criminal treatment of digital platforms and executive responsibility in France.

