Sexual predators are using a powerful new tool to exploit children: AI image generators. A user on a single dark web forum shared nearly 3,000 AI-generated images of child sexual abuse in just one month, according to a recent report from the UK-based Internet Watch Foundation.
Unfortunately, current child sexual abuse laws are outdated. They do not adequately account for the unique risks posed by AI and other emerging technologies. Lawmakers must act quickly to put legal protections in place.
The CyberTipline, the national reporting system for suspected online child exploitation, received a staggering 32 million reports in 2022, up from 21 million just two years earlier. This already alarming number is sure to grow with the rise of image-generating AI platforms.
AI platforms are “trained” on existing visual material. The sources used to create images of abuse may include real children’s faces taken from social media and photos of real exploitation. With tens of millions of abusive images already online, there is an almost inexhaustible supply of source material from which AI can generate even more harmful images.
Cutting-edge AI-generated images are now nearly indistinguishable from real photographs. Investigators say they have found new images of past victims, “de-aged” images of celebrities depicted as children in abuse scenarios, and “denuded” images created from benign photos of clothed children.
The scope of the problem expands every day. Text-to-image software makes it easy to create images of child abuse based on whatever the perpetrator wants to see. Worse, much of this technology is downloadable, so criminals can generate images offline without fear of detection.
Using AI to create images of child sexual abuse is not a victimless crime. Behind every AI image are real children. Survivors of past exploitation are victimized anew when fresh depictions are created using their likenesses. Research also shows that the vast majority of people who possess or distribute child sexual abuse material also commit hands-on abuse.
Adults can also use text-generating AI platforms like ChatGPT to update old tactics for grooming children. Criminals have long used fake online identities to meet young people in games or on social media, gain their trust, and manipulate them into sending explicit images, then engage in “sextortion,” demanding money, more photos, or sexual acts.
But ChatGPT makes it incredibly easy to impersonate a child or teenager with convincing youthful language. Today’s criminals can use AI platforms to generate realistic messages designed to manipulate young people into interacting online with someone they believe is their own age. Scarier still, many modern AI tools can quickly “learn,” and thus teach abusers, which grooming techniques are most effective.
President Biden recently signed an executive order aimed at managing AI risks, including protecting Americans’ privacy and personal data. But we also need lawmakers’ help to tackle AI-enabled online child abuse.
First, the federal legal definition of child sexual abuse material must be updated to include AI-generated depictions. Under current law, prosecutors must prove harm to an actual child, a requirement that no longer fits today’s technology. Even though AI-generated images are typically drawn from source material that victimizes real children, defense attorneys could argue that such images are harmless because no real child is depicted.
Second, we need policies that require technology companies to continuously monitor for and report exploitative content. Some companies actively scan for such images, but they are not required to do so. Just three companies, Facebook, Google, and Snapchat, were responsible for 98% of all CyberTipline reports in 2020 and 2021.
Child sex abuse laws in many states designate “mandated reporters,” professionals such as teachers and doctors who are legally required to report suspected abuse. In an age when so much of our lives happens online, employees of social media and other technology companies should bear similar legally mandated reporting responsibilities.
Finally, we should rethink the use of end-to-end encryption, in which only the sender and recipient can access the contents of a message or file. While it has valuable applications, such as protecting banking and medical records, end-to-end encryption can also help abusers store and share images of child abuse undetected. To see how many abusers may be going undetected, consider that of the 29 million tips the CyberTipline received in 2021, only 160 came from Apple, which maintains end-to-end encryption for iMessage and iCloud.
Even when law enforcement has a warrant to access a perpetrator’s files, technology companies that use end-to-end encryption can argue they are unable to help because they themselves cannot access those files. Surely an industry built on innovation can develop solutions to protect children, if it makes doing so a priority.
AI technology and social media are evolving every day. If lawmakers act now, we can prevent large-scale harm to children.
Teresa Huizar is CEO of the National Children’s Alliance, America’s largest network of care centers for child abuse victims.