X removes Grok undressing feature following backlash

X says it has a zero-tolerance approach to child sexual exploitation and non-consensual nudity.

Ofcom launched an investigation into the Elon Musk-owned platform over the use of Grok to create sexualised imagery

X has confirmed updates to its global safety approach, which include preventing all users from editing photos of real people to depict them in revealing clothing.

The social network, formerly known as Twitter, has come under heavy scrutiny since late last year, after users of its Grok AI application created non-consensual images of real people in revealing clothing, as well as child sexual abuse material.

Last week, reports suggested X had restricted the ability of non-paying users to edit photos of real people into revealing clothing. In fact, X confirmed to Campaign that the changes restrict its photo-editing service to paying users, and that even those users cannot edit photos of people into revealing clothing.

These changes apply to the Grok feature, @Grok, within the X platform, and not to the standalone Grok app.

However, in a further update this week, X said that in countries where editing images of people to put them in revealing clothing is illegal (including the UK, if the planned law goes ahead), the ability to do so will be geo-blocked across all Grok products: the public @Grok account on X, Grok within X and the standalone Grok app.

Jonathan Lewis, the UK managing director of X, said: “The X platform has been restricted to no longer allow the editing of images of real people in revealing clothing. So, for example, the issue of some users choosing to put people in bikinis.

“On top of this, image creation and the ability to edit images by the @Grok account on X is now only available to paid subscribers. But just to be clear, the first principle still maintains, so all this is doing is adding an extra layer of protection by linking a feature to identifiable paid subscribers who also can be held accountable.”

Paid subscribers to X will be able to edit photographs, but only to change the colour of people’s clothing or hair, for example, not to put them into revealing clothes.

Lewis, an experienced media executive who left Channel 4 in 2024, said: “We remain committed to making X a safe platform for everyone and continue to have zero tolerance, simply in terms of any forms of child sexual exploitation and also non-consensual nudity, which is really, really important. 

“We're taking action to remove high-priority violative content, including CSAM – child sexual abuse material – and also non-consensual sexual nudity, and we're doing that by permanently suspending accounts that violate X rules and use prompts to generate and share illegal content. We also report any of these accounts to the relevant law enforcement authorities, as necessary.”

The issue of image manipulation on X has been a big political story in recent weeks, and at Prime Minister's Questions today, Keir Starmer said he had been informed X was “acting to ensure full compliance with UK law”.

Lewis said: “I think it is an important point for us to make that X always complies with the UK law.”

Grok is not the only large language model that allows users to edit people’s clothing to make it more revealing; OpenAI’s ChatGPT and Google’s Gemini have previously been reported as able to do this.

On January 12, Ofcom launched an inquiry into X, which is owned by Elon Musk, to determine whether it has failed to comply with its obligations under the Online Safety Act to protect people from illegal content.

Also on Monday, Liz Kendall, the technology secretary, told fellow MPs that a law making the creation of non-consensual intimate images illegal would be brought into force this week.
