The speech made by James Cartlidge, the Parliamentary Under-Secretary of State for Justice, in the House of Commons on 2 December 2021.
I congratulate my right hon. Friend the Member for Basingstoke (Mrs Miller) on bringing forward this incredibly important subject for debate. I know that she has long championed the victims of online abuse, and I would like to thank her for her efforts, which have contributed greatly to the Government’s thinking in this sensitive area and indeed to the reform of the criminal law itself. I shall try to summarise how I feel on hearing her speech and the contribution she has made by saying that she has an enduring passion for protecting society, especially women and girls, from the dark side of digital. I think that is the best way to describe what is so powerful about the way she speaks on this subject.
Turning to the specifics, deepfake is a term used broadly to describe software or processes used to splice or superimpose existing images on to source photographs or videos. My right hon. Friend has explained that this technology is now being used to create fake pornography, often without the agreement or knowledge of the victim. The images, though in themselves fake, can appear realistic, and their sharing can understandably cause deep distress. She rightly used the word “grotesque” to describe this practice, and she spoke movingly about the real-world impact that it has. She referred to the cases of constituents, and let me express my sympathy for every one of them. This must be harrowing and distressing for them, and we need to tackle it and stamp it out.
I should like to begin by assuring the House that the existing criminal law is fully equipped to deal with instances where the manipulated images depict children, who of course are the most vulnerable. These pseudo-images can cause real-life harm, as I said. They can be created from real images where a child was abused, and viewing the images creates a demand for those pictures, which leads to further abuse. I realise that my right hon. Friend did not primarily talk about children, but because of the issue of vulnerability, I think I should put this on record.
The Protection of Children Act 1978 criminalises the making, taking, sharing and distribution of indecent photographs and indecent pseudo-photographs of children. A pseudo-photograph is an image, whether made by computer graphics or otherwise, that appears to be a photograph. This offence carries a robust maximum 10-year prison sentence. Similarly, section 160 of the Criminal Justice Act 1988 captures the mere possession of such images, and that offence is subject to a five-year prison sentence. In addition, section 62 of the Coroners and Justice Act 2009 created a new offence of possessing a prohibited image of a child, making illegal the possession of non-photographic visual depictions of child sexual abuse, including cartoons and computer-generated images, with a three-year maximum prison sentence.
Although it is right that there are specific and robust provisions in relation to faked indecent images of children, I share my right hon. Friend’s concern at the distress that the non-consensual creation and sharing of deepfake images can cause to adult victims. I assure the House that a person who shares such images of adults may—I stress may—in some circumstances be committing an existing offence. For example, against a background of domestic abuse, the posting or sharing of faked images could be captured under section 76 of the Serious Crime Act 2015. That offence was created specifically to target controlling or coercive behaviour in an intimate or family relationship, including when the victim is an ex-partner. We are aware that deepfake images are being used for such disturbing and cruel purposes.
In addition, section 1 of the Malicious Communications Act 1988 prohibits the sending of an electronic communication that is indecent, grossly offensive or false, or that the sender believes to be false, if the purpose, or one of the purposes, of the sender is to cause distress or anxiety to the recipient. Furthermore, section 127 of the Communications Act 2003 makes it an offence to send or cause to be sent through
“a public electronic communications network a message or other matter that is grossly offensive or of an indecent, obscene or menacing character”.
The same section also provides that it is an offence to send or cause to be sent a false message
“for the purpose of causing annoyance, inconvenience or needless anxiety to another”.
Such behaviour may also amount to harassment, which is itself already an offence.
There has been a successful conviction in which a person was found guilty of harassment after uploading images of a colleague, fully clothed, alongside images on a porn site of women of a similar shape and build to the colleague. Additionally, those who encourage others to commit an existing communications offence may be charged with encouraging an offence under the Serious Crime Act 2007.
I stress, though, that the Government recognise the concerns, set out so eloquently and clearly by my right hon. Friend, about the existing communications offences. The Law Commission considered the specific offences I have set out as part of its “Modernising the Communications Offences” review, to understand whether they needed to be reformed to better tackle abusive and harmful behaviours online. The Commission has now published its final report and recommendations for reform, and my right hon. Friend the Secretary of State for Digital, Culture, Media and Sport has indicated that she is minded to adopt the harm-based offence, the false-communications offence and the threatening-communications offence.
Alongside the use of existing and established criminal sanctions, there is a major role for the websites that host the images. It is encouraging that sites such as Pornhub, Twitter, Reddit and several others have all announced bans on deepfake images. Such images already violate community standards on major social media platforms such as Facebook. Some sites are already beginning to turn to artificial intelligence to police the images, rather than rely on users reporting them—an example of the determination to find new and effective ways to restrict the practice. For example, Facebook uses machine learning and AI to detect near-nude images or videos shared without permission on its platforms. Bumble, a dating app, has its own “Private Detector” safety feature, which automatically blurs a nude image shared in a chat. These are important steps to protect user safety and ensure that the images are tackled head on.
I hope that my right hon. Friend is satisfied that the law can, in most scenarios, deal with this behaviour, and that non-criminal interventions are developing all the time, but it is of course crucial that the criminal law keeps pace with new technologies as they emerge. We continue to keep these issues under review and when we see a problem with the criminal law, we act.
This Government have a strong record when it comes to protecting the public from the abuse of private, intimate imagery. For example, largely as a result of my right hon. Friend’s assiduous campaigning, as she said earlier, in 2015 we created the so-called revenge porn offence at section 33 of the Criminal Justice and Courts Act 2015, and only recently, during the passage of the Domestic Abuse Act 2021, we listened to the voices of victims of image abuse and supported provisions to extend that offence to capture those who threaten to disclose private sexual images with an intent to cause distress. That change has now been implemented and I am sure that my right hon. Friend, having fought so hard for the creation of the original offence, welcomes that significant extension of the protection of victims from image-based abuse. In addition, after listening to the victims of upskirting and the excellent campaign for change headed by Ms Gina Martin, we created new criminal offences in the Voyeurism (Offences) Act 2019 specifically to address that intrusive and distressing behaviour. Offenders now face up to two years behind bars, and the most serious among them will be subject to sex offender notification requirements. We do listen and we do respond.
Mrs Miller: My hon. Friend has clearly gone through the shopping list of laws that can be used to try to guard against the misuse of intimate images, but in having a shopping list we have created a lot of gaps, too. For instance, upskirting may be unlawful but down-blousing is not. It is very difficult when we have law that is so prescriptive. Does he have sympathy with the need to have something more encompassing so that we can capture all forms of intimate image abuse and not have to play whack-a-mole by outlawing the latest devious way in which people try to abuse women and girls online?
James Cartlidge: My right hon. Friend makes an excellent point, and once again she highlights her incredible expertise on these matters. She will be aware that the way Parliament often works is that individual campaigns generate momentum and become specific offences—I would not use the phrase “ad hoc,” which is almost demeaning to those campaigns, which are incredibly important and powerful. That is the reality of how this place makes law at times, but she is right that we need to consider the broader picture. I know where her focus is, and I will be coming to the Law Commission, which will feed into that point.
My colleagues in the Department for Digital, Culture, Media and Sport are busy preparing the online safety Bill, which will include provisions to tackle illegal and legal-but-harmful content, including criminal deepfake pornography, sexual harassment and abuse that does not cross a legal threshold. Under the Bill all companies will need to take action against illegal content and ensure that children are protected from inappropriate material. Major platforms will also need to address legal-but-harmful content for adults, which will likely include online abuse. Ofcom will have a suite of enforcement powers to deal with non-compliance, including fines of up to £18 million or 10% of qualifying annual turnover.
The Joint Committee that is scrutinising the Bill is due to report before recess—by 10 December. We will table the Bill as soon as possible, subject to the parliamentary timetable, but we must not rest. I assure the House that we do not take concerns in this sensitive area lightly.
It was with those concerns in mind that the Government asked the Law Commission to review the law on the taking, making and sharing of intimate images without consent, to identify whether there are any gaps in the scope of protection already offered to victims. Importantly, the review has considered the law on manipulated images such as those created by deepfake technology and the protection that the existing law affords.
On 27 February 2021 the Law Commission published the consultation paper on its review, which put forward a number of proposals for public discussion; the consultation ended on 27 May. I understand the Law Commission is due to publish its final recommendations by spring 2022.
Although I welcome this opportunity to discuss the nature of developing technology and the production and sharing of explicit manipulated images and other offences, this is a complex area, and it is right and proper that we take time to consider the law carefully before deciding whether to add to the raft of legislation that already addresses these issues. It is important, therefore, to allow the Law Commission to finish its work and to consider in detail and with care any recommendations it produces. The Government await the Law Commission’s findings with interest.
I believe my right hon. Friend has previously met the Law Commission but, if it would be of interest, I would be more than happy to arrange for her to do so again to discuss its latest position.
Mrs Miller: I am slightly taking advantage of the fact that we have a little more time this evening. The Minister will know that the Law Commission has made its recommendations, which have gone out for consultation. That consultation finished a month or two ago, so it is not that the Law Commission will finish its deliberations in the spring; it has already finished its deliberations. Those recommendations, subject to any input from the consultation, should be available shortly. I still do not understand why he is not able to bring these recommendations forward at the same time as the online safety Bill.
James Cartlidge: My right hon. Friend makes a good point. I wish to clarify this, as a number of Law Commission reviews are taking place; there are two in this regard. The one I believe she is referring to is the one I mentioned earlier, commissioned by the Department for Digital, Culture, Media and Sport, which concerns malicious communications and the other offences to which I referred. I believe that has reported and that the Department is now considering it. The review on the taking, making and sharing of intimate images is ongoing and will report in spring next year. The point I was making to her was that if she wanted to contribute to that and meet the Law Commission—
Mrs Miller indicated assent.
My officials have noted her positive nodding of the head, and so I would be more than happy to set that meeting up, because she has great expertise. I can assure her that her concerns, and the views and issues raised by this House, will be taken fully into account when the Government consider those findings and the issue of whether reform to the criminal law is necessary.