Pornography

Open source AI is being used by paedophiles to create child sex abuse content.


Safety watchdog the Internet Watch Foundation (IWF) has revealed that freely available AI software is being used to manipulate images that are then shared on the dark web.

The IWF has discovered online forums where offenders discuss which subjects to use – including images of celebrity children, other publicly available pictures of children, and even known abuse victims – and exactly how to use the software to create the content.

The offenders start with a general image-generation model that has been trained on billions of tagged images, and then fine-tune it against existing abuse material to create new content.

Dan Sexton, Chief Technology Officer at the IWF, says: “There’s a technical community within the offender space, particularly dark web forums, where they are discussing this technology. They are sharing imagery, they’re sharing [AI] models. They’re sharing guides and tips.”

He added: “The content that we’ve seen, we believe is actually being generated using open-source software, which has been downloaded and run locally on people’s computers and then modified. And that is a much harder problem to fix.

“It’s been taught what child sexual abuse material is, and it’s been taught how to create it.”

Andrew Rogoyski of the Institute for People-Centred AI at the University of Surrey has said that open-source technology is important for “democratising” AI, so that it is not controlled by a handful of very large corporations.

Yet he acknowledges that “the downside of making AI software freely available is that there are people who will misuse the technology.”

According to a UK government spokesperson, AI-generated child sexual abuse content will be covered by the Online Safety Bill.

CARE welcomes this announcement and will pay close attention to further developments in this area.
