Category: Culture

  • Liz Twist – 2022 Speech on the Online Safety Bill

    The speech made by Liz Twist, the Labour MP for Blaydon, in the House of Commons on 5 December 2022.

    I wish to address new clauses 16 and 28 to 30, and perhaps make a few passing comments on some others along the way. Many others who, like me, were in the Chamber for the start of the debate will I suspect feel like a broken record, because we keep revisiting the same issues and raising the same points again and again, and I am going to do exactly that.

    First, I will speak about new clause 16, which would create a new offence of encouraging or assisting serious self-harm. I do so as chair of the all-party parliamentary group on suicide and self-harm prevention, which has done a good deal of work on self-harm among young people over the last two years. We know that suicide is the leading cause of death in men aged under 50 and women aged under 35, with the latest available figures confirming that 5,583 people in England and Wales tragically took their own lives in 2021. We know that self-harm is a strong risk factor for future suicidal ideation, so it is really important that we tackle this issue.

    The internet can be an invaluable and very supportive place for some people, who are given the opportunity to access support, but for others it can be harmful: the content they see may encourage, maintain or exacerbate self-harm and suicidal behaviours. Detailed information about methods can also increase the likelihood of imitative and copycat suicides, with risks such as contagion effects also present in the online environment.

    Richard Burgon (Leeds East) (Lab)

    I pay tribute to my hon. Friend for the work she has done. She will be aware of the case of my constituent Joe Nihill, who took his own life at the age of 23 after accessing suicide-related material on the internet. Of course, we fully support new clause 16 and amendment 159. A lot of content about suicide is harmful but not illegal, so does my hon. Friend agree that what we really need are assurances from the Minister that, when this Bill comes back, it will include protections to ensure that adults such as Joe, and those accessing such material through smaller platforms, are fully protected?

    Liz Twist

    I thank my hon. Friend for those comments, and I most definitely agree with him. One of the points we should not lose sight of is that his constituent was 23 years of age—not a child, but still liable to be influenced by the material on the internet. That is one of the points we need to take forward.

    It is really important that we look at the new self-harm offence to make sure that this issue is addressed. That is something that the Samaritans, which I work with, has been campaigning for. The Government have said they will create a new offence, which we will discuss at a future date, but there is real concern that we need to address this issue as soon as possible through new clause 16. I ask the Minister to comment on that so that we can deal with the issue of self-harm straightaway.

    I now want to talk about internet and media literacy in relation to new clauses 29 and 30. YoungMinds, which works with young people, is supported by the Royal College of Psychiatrists, the British Psychological Society and the Mental Health Foundation in its proposals to promote the public’s media literacy for both regulated user-to-user services and search services, and to create a strategy for doing so. When YoungMinds asked young people what they thought, they said they wanted the Online Safety Bill to include a requirement for such initiatives. YoungMinds also found that young people were frustrated by very broad, generalised and outdated messages, and that they want much more nuanced information—not generalised fearmongering, but practical ways in which they can address the issue. I do hope that the Government will take that on board, because if people are to be protected, it is important that we take a more sophisticated approach to media literacy than is reflected in the broad messages we sometimes get at present.

    On new clause 28, I do believe there is a need for advocacy services to be supported by the Government to assist and support young people—not to take responsibilities away from them, but to assist and protect them. I want to make two other points. I see that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) has left the Chamber again, but he raised an interesting and important point about the size of platforms covered by the Bill. I believe the Bill needs to cover those smaller or specialised platforms that people might have been pushed on to by changes to the larger platforms. I hope the Government will address that important issue in future, together with the issue of age, so that protection does not stop just with children, and we ensure that others who may have vulnerabilities are also protected.

    I will not talk about “legal but harmful” because that is not for today, but there is a lot of concern about those provisions, which we thought were sorted out and agreed on, suddenly being changed. There is a lot of trepidation about what might come in future, and the Minister must understand that we will be looking closely at any proposed changes.

    We have been talking about this issue for many years—indeed, since I first came to the House—and during the debate I saw several former Ministers and Secretaries of State with whom I have raised these issues. It is about time that we passed the Bill. People out there, including young people, are concerned and affected by these issues. The internet and social media are not going to stop because we want to make the Bill perfect. We must ensure that we have something in place. The legislation might be capable of revision in future, but we need it now for the sake of our young people and other vulnerable people who are accessing online information.

  • John Penrose – 2022 Speech on the Online Safety Bill

    The speech made by John Penrose, the Conservative MP for Weston-super-Mare, in the House of Commons on 5 December 2022.

    It is a pleasure to follow Zach’s MP, the hon. Member for Batley and Spen (Kim Leadbeater). I particularly want to pick up on her final comments about the difficulties of platforms—not just small platforms, but larger ones—hosting extremist content, be it incels, the alt-right, the radical left or any other kind.

    I will speak to my new clauses 34 and 35, which seek to deal with both disinformation and misinformation. They are important amendments, because although the Bill has taken huge steps forward—we are led to believe that it may take a couple more in due course when the revised version comes back if the recommittal is passed—there are still whole categories of harm that it does not yet address. In particular, it focuses, rightly and understandably, on individual harms to children and illegal activities as they relate to adults, but it does not yet deal with anything to do with collective harms to our society and our democracy, which matter too.

    We have heard from former journalists in this debate. Journalists know it takes time and money to come up with a properly researched, authoritatively correct, accurate piece of journalism, but it takes a fraction of that time and cost to invent a lie. A lie will get halfway around the world before the truth has got its boots on, as the saying rightly goes. Incidentally, the hon. Member for Rotherham (Sarah Champion) said that it is wonderful that we are all learning so much. I share that sentiment; it is marvellous that we are all comparing and sharing our particular areas of expertise.

    One person who seems to have all areas of expertise under his belt is my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who chaired the Joint Committee. He rightly pointed out that this is a systems Bill, and it therefore deals with trying to prevent some things from happening—and yet it is completely silent on misinformation and disinformation, and their effect on us collectively, as a society and a democracy. New clauses 34 and 35 are an attempt to begin to address those collective harms alongside some individual harms we face. One of them deals with a duty of balance; the other deals with factual accuracy.

    The duty of balance is an attempt to address the problem as it relates to filter bubbles, because this is a systems Bill and because each of us has a tailored filter bubble, by which each of the major platforms, and some of the minor ones, works out what we are interested in and feeds us more of the same. That is fine for people who are interested in fishing tackle; that is super. But if someone is interested in incels and gets fed more and more incel content, or is vaguely left wing and gets taken down a rabbit hole into the increasingly radical left—or the alt-right, religious extremism or whatever it may be—pretty soon they are in an echo chamber, and from echo chambers they get into radicalisation, and from radicalisation they can end up in some very murky, dark and deep waters.

    There are existing rules for other old-world broadcasters; the BBC, ITV and all the other existing broadcasters have a duty of balance and undue prominence imposed on them by Ofcom. My argument is that we should consider ways to impose a similar duty of balance on the people who put together the programs that create our own individual filter bubbles, so that when someone is shown an awful lot of material about incels, or alt-right or radical left politics, somewhere in that filter bubble they will be sent something saying, “You do know that this is only part of the argument, don’t you? Do you know that there is another side to this? Here’s the alternative; here’s the balancing point.” We are not doing that at the moment, which is one of the reasons our societal and political debate is increasingly divided, and our public square as a society is becoming ever more fractious—and, in some cases, dangerous. New clause 35 would fix that particular problem.

    New clause 34 would deal with the other point—the fact that a lie will get halfway around the world before the truth has got its boots on. It tries to deal with factual accuracy. Factual accuracy is not quite the same thing as truth. Truth is an altogether larger and more philosophical concept to get one’s head around. It is how we string together accurate and correct facts to create a narrative or an explanation. Factual accuracy is an essential building block for truth. We must at least try to ensure that we can all see when someone has made something up or invented something, whether it is that bleach is a good way to cure covid or whatever. When somebody makes something up, we need to know and it needs to be clear. In many cases that is clear, but in many cases, if it is a plausible lie, a deepfake or whatever it may be, it is not clear. We need to be able to see that easily, quickly and immediately, and say, “I can discount this, because I know that the person producing it is a serial liar and tells huge great big porkies, and I shouldn’t be trusting what they are sending me, or I can see that the actual item itself is clearly made up.”

    The duty of achieving balance already exists in rules and law in other parts of our society and is tried and tested—it has stood us in good stead and done a good job for us for 40 or 50 years, since TV and radio became ubiquitous—and the same is true, although not for quite such a long time, of factual accuracy. There are increasingly good methods of checking the factual accuracy of individual pieces of content and, if necessary, in some cases of doing so in real time. For example, Adobe is leading a very large global grouping producing something called the Content Authenticity Initiative, which can tell whether something is a deepfake, because it keeps an audit trail of where the image or item came from and how it has been updated, modified or changed during the course of its life.

    Dean Russell

    On that point, I want to raise the work that my hon. Friend the Member for Bosworth (Dr Evans), who is not in the Chamber at the moment, has done on body image. When images are photoshopped and changed to present an idea of beauty that is very different from what is possible in the real world, that very much bears on the idea of truth. What are my hon. Friend’s thoughts on that point?

    John Penrose

    Addressing that is absolutely essential. The same goes for any of the deepfake examples we have heard about, including from my right hon. Friend the Member for Basingstoke (Dame Maria Miller): if we know that something has been changed—and the whole point about a deepfake is that it is otherwise hard to tell—we can easily say, “I know that is not right, I know that is not true, I know that is false, and I can aim away from it and treat it accordingly”.

    Just to make sure that everybody understands, this is not some piece of new tech magic; it is already established. Adobe, as I have said, is doing it with the Content Authenticity Initiative, which is widely backed by other very serious tech firms. Others in the journalism world are doing the same thing, with the Journalism Trust Initiative. There is NewsGuard, which produces trust ratings; the Trust Project, which produces trust indicators; and we of course have our own press regulators in this country, the Independent Press Standards Organisation and IMPRESS.

    I urge the Government and all here present not to be satisfied with where this Bill stands now. We have all heard how it can be improved. We have all heard that this is a new, groundbreaking and difficult area in which many other countries have not even got as far as we have, but we should not be in any way satisfied with where we are now. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said earlier that we need to approach this Bill in a spirit of humility, and this is an area in which humility is absolutely essential. I hope all of us realise how much further we have to go, and I hope the Minister will say how he proposes, in due course, to address these important issues that the Bill does not yet cover.

  • Kim Leadbeater – 2022 Speech on the Online Safety Bill

    The speech made by Kim Leadbeater, the Labour MP for Batley and Spen, in the House of Commons on 5 December 2022.

    I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.

    I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.

    On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.

    I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms that I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.

    Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.

    We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.

    Let me turn to media literacy—a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that they are there and know exactly how to use them, and unless they are being used, their existence is pointless.

    Ofcom recently found that more than a third of children aged eight to 17 said they had seen something “worrying or nasty” online in the past 12 months, but only a third of children knew how to use online reporting or flagging functions. Among adults, a third of internet users were unaware of the potential for inaccurate or biased information online, and just over a third made no appropriate checks before registering their personal details online. Clearly, far more needs to be done to ensure that internet users of all ages are aware of online dangers and of the tools available to keep them safe.

    Although programmes such as Google’s “Be Internet Legends” assemblies are a great resource in schools—I was pleased to visit one at Park Road Junior Infant and Nursery School in Batley recently—we cannot rely on platforms to do this themselves. We have had public information campaigns on the importance of wearing seatbelts, and on the dangers of drink-driving and smoking, and the digital world is now one of the largest dangers most people face in their daily lives. The public sector clearly has a role to warn of the dangers and promote healthy digital habits.

    Let me give one example from the territory of legal but harmful content, which Members have described as opaque, challenging and thorny. I agree with all those comments, but if platforms have a tool that switches off legal but harmful content, it strikes me as incredibly important that users know what that tool does—that is, they know what content they may be subjected to if it is left switched on, and they know exactly how to turn it off. Yet I have heard nothing from the Government since their announcement last week to suggest that they will take steps to ensure that this tool is easily accessible to users of all ages and digital abilities, and that is exactly why there is a need for a proper digital media literacy strategy.

    I therefore support new clauses 29 and 30, tabled by my colleagues in the SNP, which would empower Ofcom to publish a strategy at least every three years that sets out the measures it is taking to promote media literacy among the public, including through educational initiatives and by ensuring that platforms take the steps needed to make their users aware of online safety tools.

    Finally, I turn to the categorisation of platforms under part 7 of the Bill. I feel extremely strongly about this subject and agree with many comments made by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). The categorisation system listed in the Bill is not fit for purpose. I appreciate that categorisation is largely covered in part 3 and schedule 10, but amendment 159, which we will be discussing in Committee, and new clause 1, which we are discussing today, are important steps towards addressing the Government’s implausible position—that the size of a platform equates to the level of risk. As a number of witnesses stated in Committee, that is simply not the case.

    It is completely irresponsible and narrow-minded to believe that there are no blind spots in which small, high-risk platforms can fester. I speak in particular about platforms relating to dangerous, extremist content—be it Islamist, right wing, incel or any other. These platforms, which may fall out of the scope of the Bill, will be allowed to continue to host extremist individuals and organisations, and their deeply dangerous material. I hope the Government will urgently reconsider that approach, as it risks inadvertently pushing people, including young people, towards greater harm online, whether to individuals or to society as a whole.

    Although I am pleased that the Bill is back before us today, I am disappointed that aspects have been weakened since we last considered it, and urge the Government to consider closely some proposals we will vote on this evening, which would go a considerable way to ensuring that the online world is a safer place for children and adults, works in the interests of users, and holds platforms accountable and responsible for protecting us all online.

  • Adam Afriyie – 2022 Speech on the Online Safety Bill

    The speech made by Adam Afriyie, the Conservative MP for Windsor, in the House of Commons on 5 December 2022.

    I am pleased to follow my fairly close neighbour from Berkshire, the hon. Member for Reading East (Matt Rodda). He raised the issue of legal but harmful content, which I will come to, as I address some of the amendments before us.

    I very much welcome the new shape and focus of the Bill. Our primary duty in this place has to be to protect children, above almost all else. The refocusing of the Bill certainly does that, and it is now in a position where hon. Members from all political parties recognise that it is so close to fulfilling its function that we want it to get through this place as quickly as possible with today’s amendments and those that are forthcoming in the Lords and elsewhere in future weeks.

    The emerging piece of legislation is better and more streamlined. I will come on to further points about legal but harmful content, but I am pleased to see that concept removed from the Bill for adults, and I will explain why, given the sensitive case that the hon. Member for Reading East mentioned. The information that he talked about being published online should be illegal, so it would be covered by the Bill. Illegal information should not be published and, within the framework of the Bill, would be taken down quickly. We in this place should not shirk our responsibilities; we should make illegal the things that we and our constituents believe to be deeply harmful. If we are not prepared to do that, we cannot hand the responsibility to some third party—whether a commercial company or a regulator without those specific powers—and let them begin to make the rules on our behalf.

    I welcome the shape of the Bill, but some great new clauses have been tabled. New clause 16 suggests that we should make it an offence to encourage self-harm, which is fantastic. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) has indicated that he will not press it to a vote, because the Government and all of us acknowledge that that needs to be dealt with at some point, so hopefully an amendment will be forthcoming in the near future.

    On new clause 23, it is clear that if a commercial company is perpetrating an illegal act or is causing harm, it should pay for it, and a proportion of that payment must certainly support the payments to victims of that crime or breach of the regulations. New clauses 45 to 50 have been articulately discussed by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The technology around revenge pornography and deepfakes is moving forward every day. With some of the fakes online today, it is not possible to tell that they are fakes, even if they are looked at under a microscope. Those areas need to be dealt with, but it is welcome that she will not necessarily press the new clauses to a vote, because those matters must be picked up and defined in primary legislation as criminal acts. There will then be no lack of clarity and we will not need the legal but harmful concept—that will not need to exist. Something will either be illegal, because it is harmful, or not.

    The Bill is great because it provides a framework that enables everything else that hon. Members in the House and people across the country may want to be enacted at a future date. It also enables the power to make those judgments to remain with this House—the democratically elected representatives of the people—rather than some grey bureaucratic body or commercial company whose primary interest is rightly to make vast sums of money for its shareholders. It is not for them to decide; it is for us to decide what is legal and what should be allowed to be viewed in public.

    On amendment 152, which interacts with new clause 11, I was in the IT industry for about 15 to 20 years before coming to this place, albeit with a previous generation of technology. When it comes to end-to-end encryption, I am reminded of King Canute, who said, “I’m going to pass a law so that the tide doesn’t come in.” Frankly, we cannot pass a law that bans mathematics, which is effectively what we would be trying to do if we tried to ban encryption. The nefarious types or evildoers who want to hide their criminal activity will simply use mathematics to do that, whether in mainstream social media companies or through a nefarious route. We have to be careful about getting rid of all the benefits of secure end-to-end encryption for democracy, safety and protection from domestic abuse—all the good things that we want in society—on the basis of a tiny minority of very bad people who need to be caught. We should not be seeking to ban encryption; we should be seeking to catch those criminals, and there are ways of doing so.

    I welcome the Bill; I am pleased with the new approach and I think it can pass through this House swiftly if we stick together and make the amendments that we need. I have had conversations with the Minister about what I am asking for today: I am looking for an assurance that the Government will enable further debate and table the amendments that they have suggested. I also hope that they will be humble, as my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said, and open to some minor adjustments, even to the current thinking, to make the Bill pass smoothly through the Commons and the Lords.

    I would like the Government to confirm that it is part of their vision that it will be this place, not a Minister of State, that decides every year—or perhaps every few months, because technology moves quickly—what new offences need to be identified in law. That will mean that Ofcom and the criminal justice system can get on to that quickly to ensure that the online world is a safer place for our children and a more pleasant place for all of us.

  • Matt Rodda – 2022 Speech on the Online Safety Bill

    The speech made by Matt Rodda, the Labour MP for Reading East, in the House of Commons on 5 December 2022.

    I am grateful to have the opportunity to speak in this debate. I commend the right hon. Member for Basingstoke (Dame Maria Miller) on her work in this important area. I would like to focus my remarks on legal but harmful content and its relationship to knife crime, and to mention a very harrowing and difficult constituency case of mine. As we have heard, legal but harmful content can have a truly dreadful effect. I pay tribute to the families of the children who have been lost, who have attended the debate, a number of whom are still in the Public Gallery.

    Madam Deputy Speaker (Dame Rosie Winterton)

    Just to be clear, the hon. Gentleman’s speech must relate to the amendments before us today.

    Matt Rodda

    Thank you, Madam Deputy Speaker. A boy called Olly Stephens in my constituency was just 13 years old when he was stabbed and brutally murdered in an attack linked to online bullying. He died, sadly, very near his home. His parents had little idea of the social media activity in his life. It is impossible to imagine what they have been through. Our hearts go out to them.

    Harmful but legal content played a terrible part in the attack on Olly. The two boys who attacked and stabbed him had been sharing enormous numbers of pictures and videos of knives repeatedly over a long period—often videos of teenagers playing with knives, waving them or holding them—and they circulated them on 11 different social media platforms. None of those platforms took any action to take the content down. We all need to learn more about such cases to fully understand the impact of legal but harmful content. Even at this late stage, I hope that the Government will think again about the changes they have made and will bring this area back within the Bill.

    There is a second aspect of this very difficult case that I want to mention: the fact that Olly’s murder was discussed on social media and was planned to some extent beforehand. The wider issues here underline the need for far greater regulation and moderation of social media, in particular teenagers’ use of these powerful sites. I am finding it difficult to talk about some of these matters, but I hope that the Government will take my points on board and address the issue of legal but harmful content, and that the Minister will think again about these important matters. Perhaps we will have an opportunity to discuss it in the Bill’s later stages.

  • Maria Miller – 2022 Speech on the Online Safety Bill

    The speech made by Maria Miller, the Conservative MP for Basingstoke, in the House of Commons on 5 December 2022.

    I rise to speak to the seven new clauses in my name and those of right hon. and hon. Members from across the House. The Government have kindly said publicly that they are minded to listen to six of the seven amendments that I have tabled on Report. I hope they will listen to the seventh, too, once they have heard my compelling arguments.

    First, I believe it is important that we discuss these amendments, because the Government have not yet tabled amendments. It is important that we in this place understand the Government’s true intention on implementing the Law Commission review in full before the Bill completes its consideration.

    Secondly, the law simply does not properly recognise as a criminal offence the posting online of intimate images—whether real or fake—without consent. Victims say that having a sexual image of them posted online without their consent is akin to a sexual assault. Indeed, Clare McGlynn went even further by saying that there is a big difference between a physical sexual assault and one committed online: victims are always rediscovering the online images and waiting for them to be redistributed, and cannot see when the abuse will be over. In many ways, it is even more acute.

    Just in case anybody in the Chamber is unaware of the scale of the problem after the various contributions that have been made: in the past five years, more than 12,000 people have reported almost 200,000 pieces of content falling into that category to the Revenge Porn Helpline. Indeed, since 2014 there have been 28,000 reports to the police of intimate images being distributed without consent.

    The final reason why I believe it is important that we discuss the new clauses is that Ofcom will be regulating online platforms based on their adherence to the criminal law, among other things. It is so important that the criminal law actually recognises where criminal harm is done, but at the moment, when it comes to intimate image abuse, it does not. Throughout all the stages of the Bill’s passage, successive Ministers have said very positive things to me about the need to address this issue in the criminal law, but we still have not seen pen being put to paper, so I hope the Minister will forgive me for raising this yet again so that he can respond.

    New clauses 45 to 50 simply seek to take the Law Commission’s recommendations on intimate image abuse and put them into law as far as the scope of the Bill will allow. New clause 45 would create a base offence for posting explicit images online without consent. Basing the offence on consent, or the lack of it, makes it comparable with three out of four offences already recognised in the Sexual Offences Act 2003. Subsection (10) of the new clause would make it a criminal offence to distribute fake images, deepfakes or images created using nudification software, none of which is currently covered in law at all.

    New clauses 46 and 47 recognise cases where there is a higher level of culpability on the part of the perpetrator, where they intend to cause alarm, distress or humiliation. Two in three victims report that they know the perpetrator as a current or former partner. In evidence to the Public Bill Committee, on which I was very pleased to serve, we heard from the Anjelou Centre and Imkaan that some survivors of this dreadful form of abuse are also at risk of honour-based violence. There are yet more layers of abuse.

    New clause 48 would make it a crime to threaten to share an intimate image—which can be just as psychologically destructive as actually sharing it—and to use the image to coerce, control or manipulate the victim. I pay real tribute to the team from the Law Commission, under the leadership of Penney Lewis, who did an amazing job of work over three years on their inquiry to collect this information. In the responses to the inquiry there were four mentions of suicide or contemplated suicide as a result of threats to share these sorts of images online without consent. Around one in seven young women and one in nine young men have experienced a threat to share an intimate or sexual image. One in four calls to the Revenge Porn Helpline relate to threats to share. The list of issues goes on. In 2020 almost 3,000 people, mostly men, received demands for money related to sexual images—“sextortion”, as it is called. This new clause would make it clear that such threats are criminal, that the police need to take action and that there will be proper protection for victims in law.

    New clauses 49 and 50 would go further. The Law Commission is clear that intimate image abuse is a type of sexual offending. Therefore, victims should have the same protection afforded to those of other sexual offences. That is backed up by the legal committee of the Council of His Majesty’s District Judges, which argues that it is appropriate to extend automatic lifetime anonymity protections to victims, just as they would be extended to victims of offences under the Modern Slavery Act 2015. Women’s Aid underlined that point, recognising that black and minoritised women are also at risk of being disowned, ostracised or even killed if they cannot remain anonymous. The special measures in these new clauses provide for victims in the same way as the Domestic Abuse Act 2021.

    I hope that my hon. Friend the Minister can confirm that the Government intend to introduce the Law Commission’s full recommendations into the Bill, and that those in scope will be included before the Bill reaches its next stage in the other place. I also hope that he will outline how those measures not in scope of the Bill—specifically on the taking and making of sexual images without consent, which formed part of the Law Commission’s recommendations—will be addressed in legislation swiftly. I will be happy to withdraw my new clauses if those undertakings are made today.

    Finally, new clause 23, which also stands in my name, is separate from the Law Commission’s recommendations. It would require a proportion of the fines secured by Ofcom to be used to fund victims’ services. I am sure that the Treasury thinks that it is an innovative way of handling things, although one could argue that it did something similar only a few days ago with regard to the pollution of waterways by water companies. I am sure that the Minister might want to refer to that.

    The Bill identifies as crimes many thousands more offences than are currently recognised in law. I hope that the Minister can outline how appropriate measures will be put in place to ensure support for victims, who will now, possibly for the first time, have some measures in place to assist them. I raised earlier the importance of keeping the Bill and its effectiveness under review. I hope that the House will think about how we do that materially, so that we do not end up going another five or 10 years without such a Bill and having to play catch-up in such a complex area.

  • Neale Hanvey – 2022 Speech on the Online Safety Bill

    The speech made by Neale Hanvey, the Alba MP for Kirkcaldy and Cowdenbeath, in the House of Commons on 5 December 2022.

    I approach my contribution from the perspective of the general principle—the thread that runs through all the amendments on the paper today on safety, freedom of speech, illegal content and so on. That thread is how we deal with the harm landscape and the real-world impact of issues such as cyber-bullying, revenge porn, predatory grooming, self-harm and suicide forums.

    There is a serious risk to children and young people, particularly women and girls, on which no debate has been allowed: the promulgation of gender ideology pushed by Mermaids and other so-called charities, which has created a toxic online environment that silences genuine professional concern, amplifies unquestioned affirmation and brands professional therapeutic concern, such as that of James Esses, a therapist and co-founder of Thoughtful Therapists, as transphobic. That approach—a non-therapeutic, affirmative model—has been promoted and fostered online.

    The reality is that adolescent dysphoria is a completely normal thing. It can be a response to disruption from adverse childhood experiences or trauma, it can be a feature of autism or personality disorders or it can be a response to the persistence of misogynistic social attitudes. Dysphoria can present and manifest in many different ways, not just gender. If someone’s gender dysphoria persists even after therapeutic support, I am first in the queue to defend that person and ensure their wishes are respected and protected, but it is an absolute falsity to give young people information that suggests there is a quick-fix solution.

    It is not normal to resolve dysphoria with irreversible so-called puberty blockers and cross-sex hormones, or with radical, irreversible, mutilating surgery. Gender ideology is being reinforced everywhere online and, indeed, in our public services and education system, but it is anything but progressive. It attempts to stuff dysphoric or gender non-conforming young people into antiquated, regressive boxes of what a woman is and what a man is, and it takes no account of the fact that it is fine to be a butch or feminine lesbian, a femboy or a boy next door, an old duffer like me, an elite gay sportsman or woman, or anything in between.

    Transitioning will be right for some, but accelerating young people into an affirmative model is absolutely reckless. What do those who perpetuate this myth want to achieve? What is in it for them? Those are fundamental questions that we have to ask. The reality is that the affirmative model is the true conversion therapy—trans-ing away the gay and nullifying same-sex attraction.

    I urge all right hon. and hon. Members to watch the four-part documentary “Dysphoric” on YouTube. It is so powerful and shows the growing number of young people who have been rapidly transitioned through those services, and the pain, torment and regret that they have experienced through the irreversible effects of their surgery and treatments. The detransitioners are bearing the impacts. There is no follow-up to such services; those people are just left to get on with it. Quite often, their friends in the trans community completely abandon them when they detransition.

    I pay particular tribute to Sinead Watson and Ritchie Herron, who are both detransitioners, for their courage and absolutely incredible resilience in dealing with this issue online and shining a light on this outrage. I also pay tribute to the LGB Alliance, For Women Scotland and Sex Matters, which have done a huge amount of work to bring this matter to the fore.

    Mermaids—the organisation—continues to deny that there are any harms, co-morbidities or serious iatrogenic impacts from hormone treatment or radical surgery. That is a lie; it is not true. Mermaids has promoted the illegal availability of online medicines that do lasting, irreversible damage to young people.

    I pay tribute to the Government for the Cass review, which is beginning to shine a light on the matter. I welcome the interim report, but we as legislators must make a connection between what is happening online, how it is policed in society and the message that is given out there. We must link harm to online forums and organisations, as well as to frontline services.

    I point out with real regret that I came across a document being distributed through King’s College Hospital NHS Foundation Trust from an organisation called CliniQ, which runs an NHS clinic for the trans community. The document has lots of important safety and health advice, but it normalises self-harm as sexual

    “Play that involves blood, cutting and piercing.”

    It advises that trans-identifying females can go in

    “stealth if it is possible for them”

    to private gay clubs, and gives examples of how to obtain sex by deception. It is unacceptable that such information is provided on NHS grounds.

    Speaking out about this in Scotland has been a very painful experience for many of us. We have faced doxing, threats, harassment and vilification. In 2019, I raised my concerns about safeguarding with my colleagues in Government. A paper I wrote had this simple message: women are not being listened to in the gender recognition reform debate. I approached the then Cabinet Secretary for Social Security and Older People, Shirley-Anne Somerville, whose brief included equality. She was someone I had known for years and considered a friend; she knew my professional background, my family and, of course, my children. She told me that she shared my concerns—she has children of her own—but she instructed me to be silent. She personally threatened and attempted to bully friends of mine, insisting that they abandon me. I pay great tribute to Danny Stone and the Antisemitism Policy Trust for their support in guiding me through what was an incredibly difficult period of my life. I also pay tribute to the hon. Member for Brigg and Goole (Andrew Percy).

    I can see that you are anxious for me to close, Madam Deputy Speaker, so I will—[Interruption.] I will chance my arm a bit further, then.

    I am not on my pity pot here; this is not about me. It is happening all over Scotland. Women in work are being forced out of employment. If Governments north and south of the border are to tackle online harms, we must follow through with responsible legislation. Only last week, the First Minister of Scotland, who denied any validity to the concerns I raised in 2019, eventually admitted they were true. But her response must be to halt her premature and misguided legislation, which is without any protection for the trans community, women or girls. We must make the connection from online harms all the way through to meaningful legislation at every stage.

  • Jeremy Wright – 2022 Speech on the Online Safety Bill

    The speech made by Jeremy Wright, the Conservative MP for Kenilworth and Southam, in the House of Commons on 5 December 2022.

    I rise to speak to amendments 1 to 9 and new clause 1 in my name and the names of other hon. and right hon. Members. They all relate to the process of categorisation of online services, particularly the designation of some user-to-user services as category 1 services. There is some significance in that designation. In the Bill as it stands, perhaps the greatest significance is that only category 1 services have to concern themselves with so-called “legal but harmful” content as far as adults are concerned. I recognise that the Government have advertised their intention to modify the Bill so that users are offered instead mechanisms by which they can insulate themselves from such content, but that requirement, too, would only apply to category 1 services. There are also other obligations to which only category 1 services are subject—to protect content of democratic importance and journalistic content, and extra duties to assess the impact of their policies and safety measures on rights of freedom of expression and privacy.

    Category 1 status matters. The Bill requires Ofcom to maintain a register of services that qualify as category 1 based on threshold criteria set out in regulations under schedule 11 of the Bill. As schedule 11 stands, the Secretary of State must make those regulations, specifying threshold conditions, which Ofcom must then apply to designate a service as category 1. That is based only on the number of users of the service and its functionalities, which are defined in clause 189.

    Amendments 2 to 8 would replace the word “functionalities” with the word “characteristics”. This term is defined in amendment 1 to include not only functionalities—in other words, what can be done on the platform—but other aspects of the service: its user base, its business model, and its governance and other systems and processes. Incidentally, that definition of the term “characteristics” is already in the Bill, in clause 84, dealing with risk profiles, so it is a definition that the Government have used themselves.

    Categorisation is about risk, so the amendments ask more of platforms and services where the greatest risk is concentrated; but the greatest risk will not always be concentrated in the functionality of an online service. For example, its user base and business model will also disclose a significant risk in some cases. I suggest that there should be broader criteria available to Ofcom to enable it to categorise. I also argue that the greatest risk is not always concentrated on the platforms with the most users. Amendment 9 would change schedule 11 from its current wording, which requires the meeting of both a scale and a functionality threshold for a service to be designated as category 1, to instead require only one or the other.

    Very harmful content being located on smaller platforms is an issue that has been discussed many times in consideration of the Bill. That could arise organically or deliberately, with harmful content migrating to smaller platforms to escape more onerous regulatory requirements. Amendment 9 would resolve that problem by allowing Ofcom to designate a service as category 1 based on its size or on its functionalities—or, better yet, on its broader characteristics.

    I do not want to take too many risks, but I think the Government have some sympathy with my position, based on the indicative amendments they have published for the further Committee stage they would like this Bill to have. I appreciate entirely that we are not discussing those amendments today, but I hope, Madam Deputy Speaker, you will permit me to make some brief reference to them, as some of them are on exactly the same territory as my amendments here.

    Some of the amendments that the Government have published would add the words “any other characteristics” to the schedule 11 provisions on threshold conditions for categorisation, and would define them in a very similar way to my amendment 1. The Government may ask whether that answers my concerns, and the answer is, “Nearly.” I welcome their adding other characteristics not just to the threshold criteria but to the research that Ofcom will carry out on how threshold conditions should be set in the first place, but I am afraid that they do not propose to change paragraph 1(4) of schedule 11, which requires regulations made on threshold conditions to include,

    “at least one specified condition about number of users and at least one specified condition about functionality.”

    That means that to be category 1, a service must still be big.

    I ask the Minister to consider again very carefully how we can meet the genuine concern about high harm on small platforms. The amendment that he is likely to bring forward in Committee will not yet do so comprehensively. I also observe in passing that the Government’s reference in those amendments to any other characteristics is to those that the Secretary of State considers relevant, not those that Ofcom considers relevant—but that is perhaps a conversation for another day.

    Secondly, I come on to the process of re-categorisation and new clause 1. It is broadly agreed in this debate that this is a fast-changing landscape; platforms can grow quickly, and the nature and scale of the content on them can change fast as well. If the Government are wedded to categorisation processes with an emphasis on scale, then the capacity to re-categorise a platform that is now category 2B but might become category 1 in the future will be very important.

    That process is described in clause 83 of the Bill, but no timeframes or time limits for the re-categorisation process are set out. We can surely anticipate that some category 2B platforms might be reluctant to take on the additional obligations of category 1 status, and may not readily acquiesce in re-categorisation but instead dispute it, including through an appeal to the tribunal provided for in clause 139. That would mean that re-categorisation could take some time after Ofcom has decided to commence it and has communicated it to the relevant service. New clause 1 is concerned with what happens in the meantime.

    To be clear, I would not expect the powers that new clause 1 would create to be used often, but I can envisage circumstances where they would be beneficial. Let us imagine that the general election is under way—some of us will do that with more pleasure than others. Category 1 services have a particular obligation to protect content of democratic importance, including of course by applying their systems and processes for moderating content even-handedly across all shades of political opinion. There will not be a more important time for that obligation than during an election.

    Let us assume also that a service subject to ongoing re-categorisation, because in Ofcom’s opinion it now has considerable reach, is not applying that even-handedness to the moderation of content or even to its removal. Formal re-categorisation and Ofcom powers to enforce a duty to protect democratic content could be months away, but the election will be over in weeks, and any failure to correct disinformation against a particular political viewpoint will be difficult or impossible to fully remedy by retrospective penalties at that point.

    New clause 1 would give Ofcom injunction-style powers in such a scenario to act as if the platform is a category 1 service where that is,

    “necessary to avoid or mitigate significant harm.”

    It is analogous in some ways to the powers that the Government have already given to Ofcom to require a service to address a risk that it should have identified in its risk assessment but did not because that risk assessment was inadequate, and to do so before the revised risk assessment has been done.

    Again, the Minister may say that there is an answer to that in a proposed Committee stage amendment to come, but I think the proposal that is being made is for a list of emerging category 1 services—those on a watchlist, as it were, as being borderline category 1—but that in itself will not speed up the re-categorisation process. It is the time that that process might take that gives rise to the potential problem that new clause 1 seeks to address.

    I hope that my hon. Friend the Minister will consider the amendments in the spirit they are offered. He has probably heard me say before—though perhaps not, because he is new to this, although I do not think anyone else in the room is—that the right way to approach this groundbreaking, complex and difficult Bill is with a degree of humility. That is never an easy sell in this institution, but I none the less think that if we are prepared to approach this with humility, we will all accept, whether Front Bench or Back Bench, Opposition or Government, that we will not necessarily get everything right first time.

    Therefore, these Report stages in this Bill of all Bills are particularly important to ensure that where we can offer positive improvements, we do so, and that the Government consider them in that spirit of positive improvement. We owe that to this process, but we also owe it to the families who have been present for part of this debate, who have lost far more than we can possibly imagine. We owe it to them to make sure that where we can make the Bill better, we make it better, but that we do not lose the forward momentum that I hope it will now have.

  • John McDonnell – 2022 Speech on the Online Safety Bill

    The speech made by John McDonnell, the Labour MP for Hayes and Harlington, in the House of Commons on 5 December 2022.

    The debate so far has been serious, and it has respected the views that have been expressed not only by Members from across the House, on a whole range of issues, but by the families joining us today who have suffered such a sad loss.

    I wish to address one detailed element of the Bill, and I do so in my role as secretary of the National Union of Journalists’ cross-party parliamentary group. It is an issue to which we have returned time and again when we have been debating legislation of this sort. I just want to bring it to the attention of the House; I do not intend to divide the House on this matter. I hope that the Government will take up the issue, and then, perhaps, when it goes to the other place, it will be resolved more effectively than it has been in this place. I am happy to offer the NUJ’s services in seeking to provide a way forward on this matter.

    Many investigative journalists base their stories on confidential information, disclosed often by whistleblowers. There has always been an historic commitment—in this House as well—to protect journalists’ right to protect their sources. It has been at the core of the journalists’ code of practice, promoted by the NUJ. As Members know, in some instances, journalists have even gone to prison to protect their sources, because they believe that it is a fundamental principle of journalism, and also a fundamental principle of the role of journalism in protecting our democracy.

    The growth in the use of digital technology in journalism has raised real challenges in protecting sources. In the case of traditional material, a journalist has possession of it, whereas with digital technology a journalist does not own or control the data in the same way. Whenever legislation of this nature is discussed, there has been a long-standing, cross-party campaign in the House to seek to protect this code of practice of the NUJ and to provide protection for journalists to protect their sources and their information. It goes back as far as the Police and Criminal Evidence Act 1984. If Members can remember the operation of that Act, they will know that it requires the police or the investigatory bodies to obtain a production order, and requires notice to be given to journalists of any attempt to access information. We then looked at it again in the Investigatory Powers Act 2016. Again, what we secured there were arrangements by which there should be prior approval by a judicial commissioner before an investigatory body can seek communications data likely to compromise a journalist’s sources. There has been a consistent pattern.

    To comply with Madam Deputy Speaker’s attempt to constrain the length of our speeches, let me briefly explain to Members what amendment 204 would do. It is a moderate probing amendment, which seeks to ask the Government to look again at this matter. When Ofcom is determining whether to issue a notice to intervene, or when it is issuing a notice to a tech platform to monitor user-to-user content, the amendment asks it to consider the level of risk of the specified technology accessing, retaining or disclosing the identity of any confidential journalistic source or confidential journalistic material. The amendment stands in the tradition of the other amendments that have been tabled in this House and that successive Governments have agreed to. It puts the onus on Ofcom to consider how to ensure that technologies can be limited to the purpose that was intended. It should not result in the massive data harvesting operations referred to earlier, or become a back-door way for investigating authorities to obtain journalistic data or material without proper judicial approval.

    Mr Davis

    I rise in support of the right hon. Gentleman. The production order structure, as it stands, is already being abused: I know of a case under way today. The Bill contains almost nothing on this; it should be stronger and clearer on the protection of journalists, whistleblowers and all those acting for public interest reasons.

    John McDonnell

    The right hon. Gentleman and I have some form on this matter going back a number of years. The amendment is in the tradition that this House has followed of passing legislation to protect journalists, their sources and their material. I make this offer again to the Minister: the NUJ is happy to meet and discuss how the matter can be resolved effectively through the tabling of an amendment in the other place or discussions around codes of practice. However, I emphasise to the Minister that, as we have found previously, the stronger protection is through a measure in the Bill itself.

  • David Davis – 2022 Speech on the Online Safety Bill

    David Davis – 2022 Speech on the Online Safety Bill

    The speech made by David Davis, the Conservative MP for Haltemprice and Howden, in the House of Commons on 5 December 2022.

    I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.

    To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.

    My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.

    Opposition Members will not agree, but I am grateful that the Government decided to remove the “legal but harmful” clause. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but on the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not “legal but harmful” but, “Is it harmful enough to be made illegal?” Obviously, self-harm material is harmful enough to be made illegal.

    The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.

    The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.

    One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.

    Damian Collins

    Rather than it being magic, does my right hon. Friend agree that, if we demystified the process by saying that there is an existing technology that is available and proven to work, a company could not ignore it? It would have to explain why it is not using that technology or something better.

    Mr Davis

    I will come back to that in some detail.

    The first time I used encryption, it was one-time pads and Morse, so it was a long time ago. The last time was much more recent. The issue here is that clause 104 creates pressure by requiring real-time decryption. The only ways to do that are to hold the content unencrypted on the server, to encrypt it weakly or to create a back door. I am talking not about metadata, which I will come back to in a second, but about content. In that context, if the content needs to be rapidly accessible, the requirement is bound to lead to weakened encryption.

    This is perhaps a debate for a specialist forum, but it is very dangerous in a whole series of areas. What do we use encryption for? We use it for banking, for legal and privileged conversations, and for conversations with our constituents and families. I could go on and on about the areas in which encryption matters.

    Adam Afriyie (Windsor) (Con)

    My right hon. Friend will be aware that the measure will encompass every single telephone conversation when it switches to IP. That is data, too.

    Mr Davis

    That is correct. The companies cannot easily focus the measure on malicious content alone, and that is the problem. In everything we do in enforcing the law, we have to balance the extent to which we make the job of the law enforcement agency possible—ideally, easy—against the rights we take away from innocent citizens. That is the key balance. Many bad things happen in households, but we do not require people to live in houses with glass walls. That shows the intrinsic problem we have.

    That imposition on privacy cannot sit comfortably with anybody who takes privacy rights seriously. As an aside, let me say to the House that the last thing we need, given that we want something to happen quickly, or at least effectively and soon, is to find ourselves in a Supreme Court case or a European Court case about the imposition on privacy. I do not think that is necessary. That is where I think the argument stands. If we end up in a case like that, it will not be about paedophiles or criminals; it will be about the weakening of the encryption of the data of an investigative journalist or a whistleblower. That is where it will come back to haunt us, and we have to put that test on it. That is my main opening gambit.

    I am conscious that everybody has spoken for quite a long time, so I am trying to make this short. However, the other thing I wish to say is that we have weapons, particularly in terms of metadata. If I recall correctly, Facebook takes down some 300,000 sites for paedophile content alone, and millions for other reasons, so the use of metadata is very important. Europol carried out a survey of which data arising from the internet, social media and the like was useful, and content was ranked at No. 7, after all sorts of other kinds of data. I will not labour the point, but I just worry about this. We need to get it right, and so far we have taken more of a blunderbuss approach than a rifle shot. We need to correct that, which is what my two amendments are about.

    The other thing I briefly wish to talk about is new clause 16, which a number of people have mentioned in favourable terms. It will make it an offence to encourage or assist another person to self-harm—that includes suicide. I know that the Government have difficulties getting their proposed provisions right in how they interact with other legislation—the suicide legislation and so on. I will be pressing the new clause to a vote. I urge the Government to take this new clause and to amend the Bill again in the Lords if it is not quite perfect. I want to be sure that this provision goes into the legislation. It comes back to the philosophical distinction involving “legal but harmful”, a decision put first in the hands of a Minister and then in the hands of an entirely Whip-chosen statutory instrument Committee, neither of which is a trustworthy vehicle for the protection of free speech. My approach will take it from there and put it in the hands of this Chamber and the other place. Our control, in as much as we control the internet, should be through primary legislation, with maximum scrutiny, exposure and democratic content. If we do it in that way, nobody can argue with us and we will be world leaders, because we are pretty much the only people who can do that.

    As I say, we should come back to this area time and time again, because this Bill will not be the last shot at it. People have talked about the “grey area”. How do we assess a grey area? Do I trust Whitehall to do it? No, I do not; good Minister though we have, he will not always be there and another Minister will be in place. We may have the British equivalent of Trump one day, who knows, and we do not want to leave this provision in that context. We want this House, and the public scrutiny that this Chamber gets, to be in control of it.

    Sir William Cash (Stone) (Con)

    Many years ago, in the 1970s, I was much involved in the Protection of Children Bill, which was one of the first steps in condemning and making illegal explicit imagery of children and their involvement in the making of such films. We then had the broadcasting Acts and the video Acts, and I was very much involved at that time in saying that we ought to prohibit such things in videos and so on. I got an enormous amount of flak for that. We have now moved right the way forward, and it is tremendous to see not only the Government but the Opposition co-operating on this theme. I very much sympathise not only with what my right hon. Friend has just said—I am very inclined to support his new clause for that reason—but with what the right hon. Member for Barking (Dame Margaret Hodge) said. I was deeply impressed by the way in which she presented the argument about the personal liability of directors. We cannot distinguish between a company and the people who run it, and I am interested to hear what the Government have to say in reply to that.

    Mr Davis

    I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.

    I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place that responsibility, how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.

    There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic, and algorithmic by explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators—we have to get this right.

    Damian Collins

    My right hon. Friend is right to raise this important point. The big area here is not only clickbait, but AI-generated recommendation tools, such as a news feed on Facebook or “next up” on YouTube. Mitigating the illegal content on the platforms is not just about content moderation and removal; it is about not promoting it in the first place.

    Mr Davis

    My hon. Friend is exactly right about that. I used the example of clickbait as shorthand. The simple truth is that “AI-generated” is also a misnomer, because these things are not normally AI; they are normally algorithms written specifically to recommend and to maximise returns and revenue. We are not surprised at that. Why should we be? After all, these are commercial companies we are talking about, and that is what they are going to do. Every commercial company in the world operates within a regulatory framework that prevents it from making profits out of antisocial behaviour.

    Aaron Bell (Newcastle-under-Lyme) (Con)

    On the AI point, let me say that the advances we have seen over the weekend are remarkable. I have just asked OpenAI.com to write a speech in favour of the Bill and it is not bad. That goes to show that the risks to people are not just going to come from algorithms; people are going to be increasingly scammed by AI. We need a Bill that can adapt with the times as we move forward.

    Mr Davis

    Perhaps we should run my speech against—[Laughter.] I am teasing. I am coming to the end of my comments, Madam Deputy Speaker. The simple truth is that these mechanisms—call them what you like—are controllable if we put our mind to it. It requires subtlety, testing the thing out in practice and enormous expert input, but we can get this right.