<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=192888919167017&amp;ev=PageView&amp;noscript=1">
Tuesday,  December 3 , 2024

Linkedin Pinterest
Check Out Our Newsletters envelope icon
Get the latest news that you care about most in your inbox every week by signing up for our newsletters.
News / Nation & World

Fast rise in AI nudes of teens has unprepared schools, legal system scrambling for solutions

By Josh Cain and Mona Darwish, The Orange County Register
Published: April 14, 2024, 6:05am

LOS ANGELES — The trouble began, she said, when the boy asked to follow her on TikTok. The 13-year-old girl, an 8th grader at Aliso Viejo Middle School, said the request popped up on her profile in February.

Her account was private, and she’d kept her followers to a small circle of friends and family. But she was familiar with the boy from several classes they had together. She hit accept.

A few weeks later, the girl said, she learned that not only had the boy taken a screenshot of her from her account, but he had also used artificial intelligence software to put her face on a photo of a nude body that wasn’t hers. And now, a small group of boys was sharing the manipulated photos of her among themselves.

Her stepmother reported the photos to the Capistrano Unified School District. Administrators removed one of the boys who shared the photos from the girl’s classes, but he continued to harass her outside of class, she said. The boy who created the AI-generated images remained in her class.

“I feel uncomfortable,” the girl said. “I don’t want him near me.”

The family said they later found out the boy had created fake nude photos of at least two other girls and shared them.

Ryan Burris, a spokesman for Capistrano Unified, said the school district is investigating what happened. The district has refused to say how many students are being investigated. They would not say how many were targeted with phony nude photos. And they would not say whether the students involved would be disciplined.

“In general, disciplinary actions may include suspension and potentially expulsion depending on the circumstances of the case,” Burris said in an email.

The Southern California News Group is not identifying the girl or her stepmother.

‘Behind the curve’

What happened at Aliso Viejo Middle School has played out several times at other schools this year. In April, the principal at nearby Laguna Beach High School told parents in an email that several students were being investigated for allegedly using online AI tools to create nude photos of their classmates. In March, five students were expelled from a Beverly Hills middle school after girls there said they were targeted in the same way.


Whether or not most school administrators across the country know it, the same type of AI-generated sexual harassment and bullying could already be occurring on their campuses, too, experts said.

“We’re way behind the curve,” said John Pizzuro, a former police officer who once led New Jersey’s task force on internet crimes against children. “There is no regulation, policy or procedure on this.”

Pizzuro is now the CEO of Raven, a nonprofit firm lobbying Congress to strengthen laws protecting children from internet-based exploitation. He said U.S. policymakers are still trying to catch up to a technology that only recently became widely available to the public.

“With AI, you can make a child appear older. You can make a child appear naked,” Pizzuro said. “You can use AI to create (child sexual abuse material) from a photo of just one child.”

Just within the last year, powerful apps and programs using AI have exploded in popularity. Anyone with internet access can now use chatbots that simulate a conversation with a real person, or image generators that create realistic-looking photos from nothing more than a text prompt.

Amid the surge, an untold number of tools have also emerged that allow users to create “deepfakes” — videos that use AI to animate the faces of celebrities and politicians, placing them not only in satirical content, but also in nonconsensual pornography.

Along these lines, some apps offer “face-swap” technology that allows users to put an unknowing person’s face on the body of a pornographic actor in photos or videos. Other apps offer to “undress” anyone in any photo, replacing their clothed body with an AI-generated nude one.

When they first emerged, deepfake programs were crude and easy to spot, experts said. But telling the difference between a real video and a fake one will only grow more difficult as the technology improves.

“(These programs) are light-years ahead of where we could have imagined them a few years ago,” said Michael Karanicolas, executive director of the UCLA Institute for Technology, Law and Policy.

He said the ease of use of AI-generation programs means almost anyone can create realistic photos of another person.

“You don’t need to have a Ph.D. to set these things up,” he said. “Kids always tend to be on the leading edge of tech innovation. It doesn’t surprise me that you have young people with the sophistication to do this kind of stuff.”

An expert in technological abuse, Newport Beach-based psychotherapist Kristen Zaleski said she has yet to encounter a law enforcement officer or school staff member who truly understands the harms of AI and sexual violence.

“As an advocate, I feel we need to do a lot more to educate politicians and law enforcement on the extent of this problem and the psychological harm it causes,” said Zaleski, chief clinical officer at the Mental Health Collective. “I have yet to reach out to law enforcement to take a report who has taken it seriously or who has knowledge of it. I find a lot of my advocacy with law enforcement and politicians is educating them on what this is rather than them understanding how to help survivors.”

Which laws apply?

Despite their potential for harm, whether the images the students generated of their classmates would actually be considered illegal remains largely unsettled.

Congress updated the Violence Against Women Act only two years ago to criminalize revenge porn, the nonconsensual release of intimate visual depictions of a person. But legal experts said it’s not clear whether the updated law would apply to fictional depictions of a person, as opposed to real photos showing a crime being committed against them. The same ambiguity likely applies to definitions of child pornography, too.

“In most states, the definition would not include a synthesized, digital, intimate photo of someone — they’re just excluded,” said Rebecca Delfino, associate dean for Clinical Programs and Experiential Learning at Loyola Law School, and an expert on the “intersection of the law and current events and emergencies.”

She explained, “You have to have one individual, one clear individual — you see their face, you see their body. You know that is a person. You have a victim who is being abused, you took real images of them doing something. Those are genuine photos.”

Multiple experts cited the 2002 U.S. Supreme Court case, Ashcroft v. Free Speech Coalition, which struck down a provision of the Child Pornography Prevention Act that outlawed all depictions of child pornography, including computer-generated ones. The court ruled the law was overly broad and violated First Amendment protections for speech; the justices blocked the U.S. government from banning images where no crime was committed to create them.

“Can you arrest me, and charge me, in a case where the entire video is a fake child?” Delfino said. “Under that Supreme Court case, the answer is ‘no.’”

Many states have attempted to address the issue, but their efforts vary widely. At least 10 states have passed laws explicitly outlawing nonconsensual pornographic deepfakes. But only some added criminal penalties of fines and jail time; others opened perpetrators up to civil lawsuits and penalties.

That still leaves most states without any laws on the books banning deepfakes under most circumstances. That includes California.

Several bills introduced in the state legislature this year are seeking to address the issue. But for now, police and local prosecutors have few options for bringing cases just for the production of deepfake material, especially when the perpetrators are also children themselves.

Delfino said police could attempt to bring cases against deepfake creators under cyber harassment and bullying laws that already exist. But typically such laws include the requirement that the perpetrator’s actions cause a victim to reasonably fear for their safety.

That means school districts, and the parents of the children they serve, don’t have much to rely on as they navigate the fallout of widely available AI.

“If a parent called me and asked, ‘What do I do?’ the first thing you do is go to your school district,” Delfino said. “Most school districts have codes of conduct related to the behavior of their students, and they’re typically broad enough to address what would be considered harassment.”

At Aliso Viejo Middle School, the stepmother of the 13-year-old girl victimized by her classmates believes the incident has so far been handled “very poorly.”

Despite reporting the photos, the stepmother said she didn’t hear from anyone at the school district until she filed a formal complaint more than a week and a half later.

As of this week, she said, there have been no clear actions taken by the school district. She said she has not been notified of any official disciplinary measures by the school against the students involved in creating the photos.

“I feel that the school is failing to protect these girls,” she said, “or girls in the future, by not handling this swiftly.”
