
Australia blocks AI “nudify” sites linked to child exploitation images

Source: AP Photo

 

Australian authorities have blocked access to three websites that used artificial intelligence to create child sexual exploitation material, after the country’s internet regulator stepped in with enforcement action.

The three so-called “nudify” sites pulled out of Australia following an official warning, according to eSafety Commissioner Julie Inman Grant, who said the platforms had been drawing around 100,000 visits a month from Australian users and had featured in high-profile cases involving school students.

Inman Grant described the impact of these services as “devastating” for Australian schools, noting how easily they were being used to target real children.

“We took enforcement action in September because this provider failed to put in safeguards to prevent its services being used to create child sexual exploitation material and were even marketing features like undressing ‘any girl,’ and with options for ‘schoolgirl’ image generation and features such as ‘sex mode,’” Inman Grant said.

Her office had issued a formal warning to the UK-based company behind the sites, threatening civil penalties of up to 49.5 million Australian dollars ($32.2m) if it did not introduce protections against image-based abuse.

In parallel, AI model-hosting platform Hugging Face has moved to comply with Australian law, updating its terms of service to require users to take active steps to reduce the risk of its tools being misused to cause harm.

Australia has positioned itself at the forefront of tackling online harm to children, from banning social media for under-16s to targeting apps used for stalking and deepfake creation. The regulator warns that the rapid spread of AI tools capable of producing photo-realistic imagery has significantly lowered the barrier to abuse.

A survey by US advocacy group Thorn highlights the scale of the problem: 10 percent of young people aged 13–20 said they knew someone who had been targeted with deepfake nude imagery, while 6 percent reported being directly victimised.

 

Wyoming Star Staff