AI could help normalize child sexual abuse as graphic images explode online: Experts

News

July 21, 2023 | 21:14

AI is opening the door for a disturbing trend of people creating realistic images of children in sexual contexts, which could increase the number of real-life cases of sex crimes against children, experts warn.

AI platforms that can mimic human conversation or create realistic images exploded in popularity late last year through 2023 following the release of the ChatGPT chatbot, which served as a watershed moment for the use of AI.

While people around the world have been intrigued by the technology for work or school assignments, others have embraced the platforms for more nefarious purposes.

The National Crime Agency (NCA), which is the UK’s lead agency fighting organized crime, warned this week that the proliferation of explicit machine-generated images of children is having a radicalizing effect, normalizing pedophilia and disturbing behaviors towards children.

“We estimate that viewing these images, whether real or artificially generated, materially increases the risk of criminals turning to sexually abusing children themselves,” NCA director general Graeme Biggar said in a recent report.

The agency estimates that there are up to 830,000 adults, or 1.6% of the adult population in the UK, who pose some type of sexual danger to children.


The estimated figure is 10 times greater than the UK prison population, according to Biggar.

Most child sexual abuse cases involve the viewing of explicit images, according to Biggar, and with the help of artificial intelligence, creating and displaying sexual images could normalize child abuse in the real world.

“[The estimated figures] reflect partly a better understanding of a threat that has historically been underestimated, and partly a real increase caused by the radicalizing effect of the internet, where the widespread availability of videos and images of abused and raped children, and groups sharing and discussing the images, have normalized such behavior,” Biggar said.

In the United States, there is a similar explosion in the use of artificial intelligence to create sexual images of children.

“Images of children, including content from known victims, are being repurposed for this truly evil production,” Rebecca Portnoff, director of data science at Thorn, a nonprofit that works to keep children safe, told the Washington Post last month.


“Victim identification is already a needle-in-a-haystack problem, where law enforcement agencies are trying to find a child in danger,” she said.

“Ease of use of these tools is a significant change, as is realism. It makes everything more of a challenge.”

Popular AI sites that can create images based on simple prompts often have community guidelines that prevent disturbing images from being created.

Such platforms are trained on millions of images from the internet that serve as the building blocks for artificial intelligence to create compelling representations of people or places that don’t actually exist.

Midjourney, for example, requires “PG-13” content that avoids “nudity, sex organs, fixation on bare breasts, people in showers or toilets, sexual imagery, fetishes.”

DALL-E, OpenAI’s image-creation platform, allows only G-rated content, prohibiting images that show “nudity, sexual acts, sexual services, or content otherwise intended to elicit sexual arousal.”

However, malicious actors on dark web forums discuss workarounds for creating disturbing images, according to various reports on AI and sex crimes.

Biggar noted that the AI-generated images of children also throw police and law enforcement into a maze of deciphering fake images from those of real victims in need of assistance.

“Using AI for this purpose will make it harder to identify real children in need of protection and will further normalize child sexual abuse among offenders and those on the periphery of crime. We also assess that viewing these images, real or AI-generated, increases the risk that some offenders will turn to sexually abusing children in real life,” Biggar said in commentary provided to Fox News Digital.

“In collaboration with our international law enforcement partners, we are pooling our expertise and technical capabilities to understand the threat and ensure we have the right tools to tackle AI-generated material and protect children from sexual abuse,” he added.

AI-generated images may also be used in sextortion scams, with the FBI issuing a crime alert last month.

Deepfakes often involve using deep-learning AI to edit videos or photos of people so they appear to be someone else, and they have been used to harass victims, including children, or to extort money from them.

“Malicious actors use content manipulation technologies and services to exploit photos and videos, typically captured from an individual’s social media account, the open internet, or requested from the victim, into sexually themed images that appear true to a victim’s likeness, then circulate them on social media, public forums, or pornographic websites,” the FBI said in June.

“Many victims, including minors, are unaware that their images have been copied, manipulated and circulated until someone else has brought them to their attention.”

Source: https://nypost.com/2023/07/21/ai-could-normalize-child-sex-abuse-as-graphic-images-erupt-online-experts/
