Grok AI used for ‘abhorrent’ creation of non-consensual, sexualised imagery

Thousands of non-consensual, sexualised images of identifiable people have been generated by users on X using its built-in AI assistant Grok. The vast majority of the images were of women, and several news outlets found instances where images of children in minimal clothing had been produced.

Earlier this month, Grok posted that X had “identified lapses in safeguards and are urgently fixing them … CSAM (Child Sexual Abuse Material) is illegal and prohibited”.

Prime Minister Anthony Albanese said the use of Grok's image creation function in this way was "just completely abhorrent". Australia's eSafety Commissioner is investigating several complaints relating to non-consensual sexualised images generated by Grok.

xAI, the Elon Musk-owned tech company behind Grok, has now restricted the use of Grok’s edit image function to paid subscribers. Musk has said users who generate illegal content with Grok would face the same consequences as those who upload it directly.

