Students at a middle school in Beverly Hills used artificial-intelligence technology to create fake nude photos of their classmates, according to school administrators. Now, the community is grappling with the fallout.

School officials at Beverly Vista Middle School were made aware of the “AI-generated nude photos” of students last week, the district superintendent said in a letter to parents. The superintendent told NBC News the photos contained students’ faces superimposed onto nude bodies. The district did not share how it determined the photos were produced with artificial intelligence.

“It’s very scary because people can’t feel safe to, you know, come to school,” a student at Beverly Vista Middle School who did not want to be identified told KNBC in Los Angeles. “They’re scared that people will show off, like, explicit photos.”

Lt. Andrew Myers of the Beverly Hills Police Department told NBC News that police responded to a call from the Beverly Hills Unified School District late last week and took a report about the incident. Currently, a non-criminal investigation is underway, Myers said. Because the investigation involves juveniles, Myers said no further information could be shared.

The Beverly Hills middle school case follows a series of similar incidents involving students creating and sharing AI-generated nude photos of their female classmates at high schools around the world. One New Jersey teen victim spoke about her experience in January in front of federal legislators in Washington, D.C., to advocate for a federal law criminalizing all nonconsensual sexually explicit deepfakes. No such federal law currently exists.


In a letter to parents obtained by NBC News, Beverly Hills Unified School District Superintendent Dr. Michael Bregy characterized the deepfake incident as part of “a disturbing and unethical use of AI plaguing the nation.”

“We strongly urge Congress as well as federal and state governments to take immediate and decisive action to protect our children from the potential dangers of unregulated AI technology,” Bregy wrote. “We call for the passing of legislation and enforcement of laws that not only punish perpetrators to deter future acts but also strictly regulate evolving AI technology to prevent misuse.”

Bregy told NBC News that the school district would punish the student perpetrators in accordance with the district’s policies. For now, he said those students have been removed from the school pending the results of the district’s investigation. Then, Bregy said student perpetrators will be punished with anything from suspension to expulsion, depending on their level of involvement in creating and disseminating the images. Outside the district, however, the path to recourse for student victims is less clear. 

Security guards stand outside at Beverly Vista Middle School on Feb. 26, 2024, in Beverly Hills, Calif. Jason Armond / Los Angeles Times via Getty Images

In 2020, California passed a law that allows victims of nonconsensual sexually explicit deepfakes to sue the people who created and distributed the material. A plaintiff can recover up to $150,000 in damages if the perpetrator is found to have committed the act with malice. It’s not clear if damages have ever been awarded under the law.

Mary Anne Franks, president of the Cyber Civil Rights Initiative and a professor at George Washington University Law School, said California’s laws still don’t clearly prohibit what happened at Beverly Vista Middle School, based on the information currently available about the incident. Not all nude depictions of children are legally considered pornographic, so without more information about what the photos depict, their legality is unclear.

“The civil action in California could potentially apply here, but it’s always difficult for victims to identify who the perpetrators are, get the legal assistance they need, and then actually pursue the case,” Franks said.

“It’s hard to think about what justice would be for the students,” she continued. “The problem with image-based abuse is once the material is created and out there, even if you punish the people who created them, these images could be circulating forever.”

The technology to create fake nude images has rapidly become more sophisticated and accessible over the past several years, and high-profile incidents of celebrity deepfakes — like ones of Taylor Swift that went viral in January — have brought even more attention to consumer apps that allow users to swap victims’ faces into pornographic content and “undress” photos.

In deepfake sexual abuse cases involving underage perpetrators and victims, the laws have not always been applied. 

Digital news outlet 404 Media investigated a 2023 case involving a high school in Washington state, where police documentation revealed that high school administrators did not report students making AI-generated nude photos from their classmates’ Instagram photos. The incident was a possible sex crime against children, the Washington police report said, but administrators attempted to handle the situation internally before multiple parents filed police reports. After police investigated the incident, a prosecutor declined to press charges against the perpetrator.

“My hope is that legislators start realizing that while civil penalties may be useful for certain victims, it’s only going to be a partial solution,” Franks said. “What we should be focusing on are deterrents. This is unacceptable behavior and should be punished accordingly.”

