Using an AI-powered app to create fake nude images of people without their consent is deeply unethical, especially when the subjects are children.
For now, though, it is not illegal in California. That is about to change.
Under two measures recently signed by Gov. Gavin Newsom, creating, possessing or distributing sexually explicit images of children will be illegal whether the images are made with cameras or with computers. The changes take effect Jan. 1.
California joins a growing number of states outlawing the practice at a time when students are falling prey to apps that use AI either to digitally generate a nude body from a photo of a fully clothed person (“undresser” apps) or to seamlessly graft a person’s face onto a nude body from a pornographic video.
In a recent survey by the Center for Democracy & Technology, 40% of students reported hearing about deepfake images circulating at their schools. Of those, roughly 38% said the images were intimate or sexually explicit and shared without the depicted person’s consent.
The center’s report found that few teachers said their schools had measures in place to prevent the spread of nonconsensual deepfakes. “This unfortunately leaves many students and parents in the dark and seeking answers from schools that are ill-equipped to provide them,” the report said.
According to the organization, schools typically respond by expelling or otherwise disciplining the students who create and share deepfakes. In February, for example, the Beverly Hills school district expelled five eighth-graders for sharing fake nudes of 16 other eighth-graders.
The Beverly Hills case was referred to police, but legal experts at the time raised concerns that state prosecutors lacked the authority to bring charges over computer-generated images of child sexual abuse, whether the images were created and distributed by adults or by minors.
One reason for that gap: computer-generated images were not covered by the state’s legal definition of child pornography. For material to violate California law, a state appeals court ruled in 2011, “it would appear that a real child must have been used in production and actually engaged in or simulated the sexual conduct depicted.”
Assembly Bill 1831, introduced by Assemblymember Marc Berman (D-Menlo Park), broadens the state’s ban on child pornography to cover material that “contains a digitally altered or artificial-intelligence-generated depiction [of] what appears to be a person under 18 years of age” engaging in or simulating sexual conduct. Under state law, sexual conduct also includes graphic depictions of nudity or bodily functions intended to stimulate sexual desire.
Once AB 1831 takes effect next year, AI-generated and digitally altered material will be treated much like other forms of obscene child pornography: it will be illegal to knowingly possess it, distribute it to minors or sell it to adults. It will also be illegal to noncommercially distribute or exchange such material with adults while knowing it contains child pornography, regardless of whether it is obscene.
Similarly, Senate Bill 1381, by Sen. Aisha Wahab (D-Hayward), amends state law to expressly prohibit the use of artificial intelligence to generate images of actual children engaging in sexual conduct, or the use of children as models for such images.
Kaylin Hayman, a 16-year-old former Disney actor from Ventura County who experienced the problem firsthand, was an outspoken supporter of AB 1831. According to the Ventura County district attorney’s office, a Pennsylvania man created images that composited her likeness onto sexually explicit bodies; when they were made, the images were not considered criminal under California law. The man was instead prosecuted in federal court, convicted and sentenced to 14 years in prison, the office said.
In a news release, Hayman described her advocacy for the bill as incredibly powerful and thanked the district attorney’s office and her parents for their support throughout the process. She said the law would be transformative and would deliver justice to future victims.
Dist. Atty. Erik Nasarenko said that working with Hayman, who bravely recounted her story, exposed “the real-life evils of computer-generated images of child sexual abuse.” Her tenacity and resolve in advocating for the bill were instrumental in its passage, he said, and the law will help safeguard children going forward.
Juniper Calloway is a dedicated journalist with three years of experience covering hard-hitting stories. Known for her commitment to delivering timely and accurate updates, she currently works with MikeandJon Podcast, where she focuses on critical topics such as crime, local news and national developments across the United States. Her ability to break down complex issues and keep audiences informed has established her as a trusted voice in journalism.