X To Stop Grok AI From ‘Undressing’ People in Images After California Probe


Elon Musk’s X has bowed to pressure and will stop the Grok image tool from editing photos of real people to depict them in revealing clothing, following backlash around the world.

X last night said that it has “implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis.” “This restriction applies to all users, including paid subscribers,” it added.

The move comes after California’s top prosecutor said the state is probing the spread of sexualised AI deepfakes generated by Grok, including images of children. French and UK prosecutors and regulators have also been investigating the tool.

Following the initial backlash, X had restricted the Grok Imagine undress feature to paying subscribers only; the ban now applies to all users. “Additionally, image creation and the ability to edit images via the Grok account on the X platform are now only available to paid subscribers,” it added. “This adds an extra layer of protection by helping to ensure that individuals who attempt to abuse the Grok account to violate the law or our policies can be held accountable.”

X has also geoblocked “the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in those jurisdictions where it’s illegal.”

In the UK, where Grok has attracted vehement criticism, including from Prime Minister Keir Starmer, regulator Ofcom said it is still investigating X under the new Online Safety Act. If found to have broken the law, Ofcom could issue X with a fine of up to 10% of its worldwide revenue or £18M ($24.2M), whichever is greater.
