Parents Alerted After Alleged Deepfake AI Porn Made Of Sydney High School Students

Image: Stock image of students. Photo: George Pak / Pexels

Just weeks before the first school term of 2025, a south-west Sydney high school is facing an alleged AI deepfake pornographic incident.

It’s alleged that a Year 12 student made deepfake pornographic images of female students, using real photos of them. Fake social media profiles were also allegedly created by the Year 12 student, using the fake imagery.

Some of the deepfake porn images were also reportedly made from photos of school activities.

Media have been asked not to name the school involved, nor the Year 12 student in question, in order to protect the victims of the alleged incident.

No decision has reportedly been made on what action will be taken against the alleged creator of the deepfake porn images, but the matter has been referred to the police.

Parents informed of alleged deepfake porn at Sydney high school

Parents were alerted to the alleged incident by an email sent from the NSW Department of Education, which informed them of what was being done.

“The school has been made aware that a year 12 male student has allegedly used artificial intelligence to create a profile that resembles your daughters and others,” read the email.

“Unfortunately innocent photos from social media and school events have been used.

“We want to emphasise that your daughters have done nothing wrong, there are no inappropriate real photos of them being used.

“Please understand that your daughters have not engaged in any inappropriate posting, but have been the victims of this situation.”

Parents were asked to report any explicit content they see on social media directly to police.

They were also urged not to make contact with the student who is alleged to have created the images.

The Department of Education has referred the incident to the eSafety Commissioner, as well as the police.

“This matter has been reported to police and we are working closely to assist their investigation,” a spokesperson for the Department of Education told City Hub.

“We do not tolerate such behaviour and will take the appropriate action.

“Our highest priority is to ensure our students feel safe and any decision about this student’s future involvement in the school will be based on that.

“We are helping affected students with appropriate wellbeing support and will do so as long as required.”

Laws on sharing of non-consensual AI and deepfake porn passed in 2024

Attorney General Mark Dreyfus introduced legislation in June last year on behalf of the Albanese Labor Government to create a new criminal offence of sharing non-consensual sexually explicit images made using AI or similar technology.

The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024, which passed in August 2024, makes it illegal to share any non-consensual deepfake pornographic imagery with another person, by email or other forms of messaging, whether to an individual or to a mass audience, privately or on an open platform like social media.

“Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse,” Dreyfus told The Guardian.

“We know it overwhelmingly affects women and girls who are the target of this kind of deeply offensive and harmful behaviour. It can inflict deep, long-lasting harm on victims.”
