Artificial intelligence is opening the door to a disturbing trend of people creating realistic sexual images of children, which experts warn could increase the number of sex crimes against real children.

AI platforms that can mimic human conversation and create realistic images have exploded in popularity since the release of the chatbot ChatGPT in late 2022, which marked a turning point in the use of artificial intelligence. While the technology has piqued the curiosity of people around the world for use in work and school, others are using the platforms for more nefarious purposes.

Britain’s lead agency for fighting organized crime, the National Crime Agency (NCA), warned this week that the proliferation of machine-generated explicit images of children is having a “radicalizing” effect, “normalizing” pedophilia and disturbing behavior toward children.

In a recent report, NCA Director General Graeme Biggar said: “Viewing these images, whether real or AI-generated, significantly increases the risk that offenders themselves will go on to sexually abuse children.”


National Crime Agency (NCA) Director General Graeme Biggar during a Northern Ireland Policing Board meeting at James House, Belfast, on Thursday, June 1, 2023. (Liam McBurney/PA Images via Getty Images)

The agency estimates that up to 830,000 adults, or 1.6% of the adult population in the UK, pose some type of sexual risk to children. Biggar said that estimate is ten times the UK prison population.

Biggar said the majority of child sexual abuse cases involve viewing explicit images, and that creating and viewing sexual images with the help of AI could “normalize” child abuse in the real world.


“[The estimated figures] reflect, in part, a better understanding of a historically underestimated threat, and a real increase driven in part by the radicalizing effect of the internet, where videos and images of children being abused and raped became widely available, and groups shared and discussed the images, making such practices the norm,” Biggar said.

In this photo illustration from July 18, 2023, an artificial intelligence program is displayed on a laptop against a background of books. (Jaap Arriens/NurPhoto via Getty Images)

Similar cases of AI being used to create sexual images of children have surged in the United States.

“Children’s images, including the content of known victims, are being repurposed for this really evil output,” Rebecca Portnoff, director of data science at Thorn, a nonprofit that works to protect children, told The Washington Post last month.


“Victim identification is already a needle-in-a-haystack problem, with law enforcement trying to find children in harm’s way,” she said. “The ease of using these tools is a significant shift, as is their realism. It just makes everything more of a challenge.”

Popular AI sites that let users create images from simple text prompts often have community guidelines that prohibit the creation of offensive photos.

A teenage girl in a dark room. (Getty Images)

Such platforms are trained on millions of images collected from across the internet, which serve as building blocks for AI that can create convincing depictions of people and places that don’t actually exist.


For example, Midjourney’s guidelines call for PG-13 content that avoids “nudity, genitals, preoccupation with bare breasts, people in showers or toilets, sexual images, and fetishes.” OpenAI’s image creation platform, DALL-E, allows only G-rated content, banning images that show “nudity, sexual acts, sexual services, or content intended to arouse sexual excitement.” However, according to various reports on AI and sex crimes, workarounds for creating disturbing images are discussed on dark web forums where bad actors gather.

Police car with 911 sign. (Getty Images)

Biggar also pointed out that AI-generated images of children force police and law enforcement into the maze-like task of distinguishing fake images from those of real victims who need assistance.

“The use of AI in child sexual abuse will make it more difficult to identify real children in need of protection, further normalizing abuse,” the NCA director general said.

AI-generated images could also be used in sextortion scams, a crime the FBI issued a warning about last month.

Deepfakes, which often use deep learning AI to alter videos and photos of people so they appear to be someone else, have been used to harass victims, including children, and extort money from them.


“Malicious actors use content manipulation techniques and services to exploit photos and videos (usually captured from personal social media accounts, the open internet, or requested from victims) to create lifelike, sexually themed images that resemble victims, then distribute them on social media, public forums, or pornographic websites,” the FBI said in June.


“Many victims, including minors, are unaware that their images have been copied, manipulated or disseminated until someone else tells them.”
