Man who used AI to create child abuse images jailed for 18 years in UK

A man who used artificial intelligence technology to create child sexual abuse imagery was sentenced to 18 years in prison on Monday, in a landmark prosecution over deepfakes in the UK.

Hugh Nelson, 27, from Bolton, pleaded guilty to a total of 16 child sexual abuse offences, including transforming everyday photographs of real children into sexual abuse material using AI tools from US software provider Daz 3D. He also admitted encouraging others to commit sexual offences against children.

At Bolton Crown Court, Judge Martin Walsh imposed an extended sentence on Nelson, saying he posed a “significant risk” of causing harm to the public. That means Nelson will not be eligible for parole until he has completed two-thirds of his sentence.

Advances in AI mean fake images have become more realistic and easier to create, prompting experts to warn about a rise in computer-generated indecent images of children.

Jeanette Smith, a prosecutor from the Crown Prosecution Service’s Organised Child Sexual Abuse Unit, said Nelson’s case set a new precedent for how computer-generated images and indecent and explicit deepfakes could be prosecuted.

“This case is one of the first of its kind but we do expect to see more as the technology evolves,” said Smith.

Greater Manchester Police found both real images of children and computer-generated images of child sexual abuse on Nelson’s devices, which were seized last June. 

The computer-generated images did not look exactly like real photographs but could be classified as “indecent photographs”, rather than “prohibited images”, which generally carry a lesser sentence. This was possible, Smith said, because investigators were able to demonstrate they were derived from images of real children sent to Nelson.

Nelson in August admitted to creating and selling bespoke images of child sexual abuse tailored to customers’ specific requests. He generated digital models of the children using real photographs that his customers had submitted. Police also said he further distributed the images he had created online, both for free and for payment.

It comes as both the tech industry and regulators are grappling with the far-reaching social impacts of generative AI. Companies such as Google, Meta and X have been scrambling to tackle deepfakes on their platforms.

Graeme Biggar, director-general of the UK’s National Crime Agency, last year warned it had begun seeing hyper-realistic images and videos of child sexual abuse generated by AI.

He added that viewing this kind of material, whether real or computer-generated, “materially increases the risk of offenders moving on to sexually abusing children themselves”.

Greater Manchester Police’s specialist online child abuse investigation team said computer-generated images had become a common feature of their investigations.

“This case has been a real test of the legislation, as using computer programmes in this particular way is so new to this type of offending and isn’t specifically mentioned within current UK law,” detective constable Carly Baines said when Nelson pleaded guilty in August.

The UK’s Online Safety Act, which passed last October, makes it illegal to disseminate non-consensual pornographic deepfakes. Nelson, however, was prosecuted under existing child abuse law.

Smith said that as AI image generation improved, it would become increasingly challenging to differentiate between different types of images. “That line between whether it’s a photograph or whether it’s a computer-generated image will blur,” she said.

Daz 3D, the company that created the software used by Nelson, said that its user licence agreement “prohibits its use for the creation of images that violate child pornography or child sexual exploitation laws, or are otherwise harmful to minors” and said it was “committed to continuously improving” its ability to prevent the use of its software for such purposes.
