Criminal Code Amendment (Deepfake Sexual Material) Bill 2024

As has been said, the coalition supports the intentions behind the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024, as would, I think, every member of the House as we deal with some of the multitude of risks that we see online.

There is no doubt that there will be serious and damaging consequences for people through AI-generated sexual material; it knows almost no bounds in this space. There is a proliferation of artificial intelligence that creates just extraordinary possibilities. It is, in some ways, so useful, but it comes with enormous risks, particularly when we’re talking about young people. The risk we are dealing with here arises from that intersection: AI, with its tremendous capability to generate material that appears to be real, brings an extraordinary capacity to inflict real harm when sexual material is distributed online.

I’ve delivered hundreds of cybersafety presentations to young schoolchildren and to teenagers as well. I’ve listened to what they’ve already had to deal with: their faces—without the AI piece, just their faces—being placed on other bodies and used in a sexual way. The harm that caused them and the distress it brought to them and their families occurred even without AI. So in this space the non-consensual sharing of deepfake sexual content is extraordinarily dangerous in my view. It’s dangerous to the individual, and dangerous more broadly.

We know that this bill will replace some of the existing laws. We certainly took these laws very seriously when in government. Technology has constantly changed and will keep changing. There was a real need for our Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Act 2018. We really needed that very effective civil take-down capability. One of the most common requests from parents who ring my office—because they know of my work in this space—is to have material removed very quickly. That’s the first thing: they don’t want it shared, they don’t want it to go viral and they’re panicking. Often, if it’s a young child, that is their greatest worry—that, and the risk of the child being bullied as a result of the content being online, or being out there without the child knowing about it.

We’ve seen a lot of work in this space previously, but I am concerned about the current bill and the removal of the current definition of ‘consent’. Currently, the law says explicitly that consent means ‘free and voluntary agreement’, but the government has removed that definition. There are a few issues with that, given the real focus on the need to keep people safe and what will happen in the AI-generated world. How will the courts adjudicate the consent rule if it’s not clearly defined in the legislation? That’s what we’re going to need, and that’s what those who seek to use these laws will need: very clear definitions. The courts will need those as well.

The coalition have also announced a real plan to lift the minimum age for kids accessing social media from 13 to 16. It is a top priority for us, and it’s very important that this is the case. Members have recently heard Meta giving evidence at hearings, where a claim was made that there’s no damage being done to young people on their platforms or in this social media space. Well, from my years of doing this, let me tell you that I have seen a significant deterioration in the mental and emotional health and wellbeing of young people because of what’s happening online and on social media. So I don’t swallow that for a minute. Every one of those social media platforms creates a harmful place for young people—and for adults as well—and certainly could in the non-consensual sharing of deepfake sexual images. Ever since the inception of online social media platforms, the creators and executives in charge of these platforms decided—apparently, because of what we’ve seen since—that it was perfectly fine to expose our very young children to what I see as, and what really is, a free-for-all in an online paedophiles’ paradise. That’s what this is. That’s what they have done. Who would have thought there’d be a platform that allowed our children to be groomed online by sexual predators and exposed to extreme and violent pornography—because that’s what’s happening—and to be exposed randomly to billions of people of all ages on the internet? But that’s what the platforms allow. It is part of these platforms’ business model and is what is available to very vulnerable young people—

The DEPUTY SPEAKER: The debate is interrupted in accordance with standing order 43. The debate may be resumed at a later hour, and the member will have leave to continue speaking when the debate is resumed.


In continuation, I was talking about the inception of online social media platforms, and I wondered which one of the creators or executives in charge of these platforms decided that it was perfectly fine to expose our very young children to a free-for-all in what is an online paedophiles’ paradise. And I’ll repeat that: a paradise where our children are groomed online by sexual predators and exposed to extreme and violent pornography. That’s what’s available on their platforms, where they are exposed to billions of random people of all ages. But that is part of the platform business model and what’s available to vulnerable young people on these platforms and these apps, exposing our young people 24 hours a day, seven days a week. The harm is there for all to see and for our families to have to deal with, from the bullying to the sharing of naked and semi-naked photos.

On Snapchat, which young people were sucked into—that’s a nice way of putting it—they thought they could take provocative images and those images would disappear. Well, they didn’t. There was an app called SnapSave that automatically saved all of these photos, and people took screenshots. So many young people I’ve dealt with have been affected by this and by the image-based abuse that follows it—and then there are the suicides because of what’s going on on social media.

In my electorate, we have seen young boys subjected to image-based abuse. We have seen increased rates of self-harm over the years, as well as a dramatic increase in hospitalisations of girls in the last decade because of this and because of the group chats and sites promoting everything from eating disorders to risky behaviours. We did see, some years ago, that dreadful choking game. We’ve seen such a rise in mental health problems, which I’ve repeatedly spoken about in this House before. We now see online gaming addiction and the clinics that have had to be set up to help families deal with online addictions, as well as the risks on these apps from tracking and mapping. There are young girls presenting to GPs with internal damage from aggressive sexual experiences learned from online pornography sites.

These same platforms and their representatives still believe, according to some of the evidence we saw recently, that they’re doing no harm. I can tell them differently. I do cybersafety presentations in schools, and how dreadful it is that, through that interaction on social media and through these sites, the youngest person so far, in all the years I have been doing this, who has admitted to me that they have gone to meet someone in person whom they first met online is a year 3 girl. I’ll let that sink in—it was a year 3 girl, in all those classes and sessions that I’ve done over the years. When I do my silent survey and they put their hands up for the three critical questions, it is extremely unusual to come across a class, of any age—I’ve done from preschool through to year 12—in which there are not some children who have gone to meet people in person whom they first met online.

I’ve dealt with an extraordinary number of issues created by these online platforms, which claim that they’re doing no harm. That is entirely incorrect. The harm to these young people is done every single day and night. It isn’t okay when we see GPS or location services and geotagging turned on and embedded in the photos that young people share. Then they get tracked and mapped. It is not okay that we have four-year-olds being encouraged and conned into uploading totally inappropriate content. It’s not okay when the mum of a nine-year-old girl rings my office because her daughter has uploaded an explicit video to one of these sites. I can see the risks with AI-generated images, and particularly sexual images. This legislation is so important in that space, because we will see a rise in instances of this.

The harm to young people is constant. When I listen to these young people and ask them questions about what they do, how much time they spend online, where their devices are and how much time they can spend on them, the answer is that their time online is almost unlimited. And so their level of risk in this space is extreme. Of course, there are the gaming chats in so many of the messaging platforms—it is just ongoing, and they’re asked all sorts of questions in this space. When I look at the ages of the young people I’ve dealt with—and they’re much younger than people think—one of the things I say to these good, young people is: ‘Can you help your families? If you’ve got younger brothers and sisters, they are at risk in this space. You are in this as a family, and the whole family needs to be involved in what’s happening online to help keep your family safe.’

I’ve dealt with some terrible experiences that young people have had. I had a group of great young kids who were 15 and 16, and they came to talk to me to tell me how concerned they were about the nine-year-olds in their school who they knew were watching and live-streaming sex acts during school time. What those young people were seeing and having access to is why age verification, which we have been pushing for for some time, is so important. It’s a start. It doesn’t give all of the answers, but it is certainly a start. There have been extraordinary experiences that young people have been subjected to, and it has really harmed their whole lives.

Here is one of the other issues that I want to warn people about as well. I had a visit recently from the optometrists at Optometry Australia. We are seeing an increase in myopia, or shortsightedness, because of the amount of time young people are spending behind a screen. Not only is there physical harm and emotional harm, with young people going to meet people in person whom they first met online, but what’s happening to them through bullying and image-based abuse is constant. When I spoke to one principal and asked, ‘What is the biggest issue facing your year 12 students?’ he held up a phone and said, ‘They’re just not sleeping enough.’ They cannot cope with every other issue in their lives—their relationships, what’s going on at school, what they intend to do when they leave school—because of what’s happening online. They’re engaging with it for so much of the night that they’re simply not sleeping enough.

So I’ve been very active in this space for a lot of years. As for what the government is proposing here in relation to deepfake images, I think what we’re going to see ahead is even worse. I hope that people use these laws, and that the laws are suitable to be used far more often. The issues that these young people have to deal with around deepfakes and consent, and how the courts will interpret and act on this, are very important to our young people and to people of all ages.