Paedophiles using AI to turn celebrities into children

Paedophiles are using artificial intelligence (AI) to create images of singers and film stars as children.

By Joe Tidy, cyber correspondent

The Internet Watch Foundation (IWF) said images of a female celebrity, reworked to make her look like a child, are being circulated among paedophiles.

Images of child actors are also being manipulated on one dark web forum, the charity said.

Bespoke image generators are also being used to create hundreds of new photos of real victims of child sexual abuse.

These details come from the IWF's latest report, which aims to draw attention to the growing danger posed by AI systems that turn simple text prompts into images.

Ever since these powerful image-generation systems became publicly available, researchers have warned that they could be misused to create explicit images.

In May, UK Home Secretary Suella Braverman and US Homeland Security Secretary Alejandro Mayorkas issued a joint statement warning against "despicable" AI-generated images that show adults engaging in sexual acts with minors.


In its report, the IWF describes how researchers spent a month logging all the AI imagery on a single dark web site and identified more than 2,800 synthetic images that broke UK law.

According to analysts, one growing trend is predators taking a single photo of a known child abuse victim and using it as the basis for generating many more images of that child in different sexual abuse settings.

One folder contained 501 images of a girl who was roughly nine or ten years old when she was sexually abused. In the same folder, predators also shared a fine-tuned AI model file so that others could generate further images of her.

Some of the imagery is so realistic that even adults could mistake it for genuine photographs of child sexual abuse, including the images of celebrities depicted as children.

Analysts saw photos, mainly of younger-looking female singers and actresses, that had been altered with imaging software to make them appear as children.

The report does not name any of the individuals depicted.

The organisation said it was publishing the research to get the problem onto the agenda of the UK's AI Safety Summit at Bletchley Park next week.

In one month, the IWF investigated 11,108 AI-generated images that had been shared on a dark web child abuse forum.

Of those, 2,978 were confirmed as meeting the definition of images that break UK law, meaning they depicted child sexual abuse.
More than 20% of these were rated Category A, the most serious kind of content.
More than half (1,372, about 54%) depicted primary school-aged children, aged seven to ten.
There were also 143 images of children aged three to six, and two images of babies under two years old.

The fears the IWF voiced in June have now become reality: predators are already exploiting AI to create obscene images of children.

Susie Hargreaves, the IWF's chief executive, said that "our worst nightmares have become reality".

Three months ago, the charity warned that AI imagery could soon become indistinguishable from genuine photographs of children being abused, and that such images would start to appear in large numbers. That point has now been passed.

The IWF report makes clear that AI-based images still cause real harm. Although no child is abused directly in their production, they legitimise and normalise predatory behaviour, and they waste police resources on searches for children who do not exist.

New forms of offence are also emerging in some cases, creating fresh complications for law enforcement agencies.

In one case, the IWF found many images of at least two girls, both under ten years old, whose photos from an innocent photoshoot at a non-nude modelling agency had been manipulated to place them in Category A sexual abuse scenes.

In effect, they are now victims of Category A offences that never actually took place.

