
  • Neale Hanvey – 2022 Speech on the Online Safety Bill

    The speech made by Neale Hanvey, the Alba MP for Kirkcaldy and Cowdenbeath, in the House of Commons on 5 December 2022.

    I approach my contribution from the perspective of the general principle, the thread that runs through all the amendments on the paper today on safety, freedom of speech, illegal content and so on. That thread is how we deal with the harm landscape and the real-world impact of issues such as cyber-bullying, revenge porn, predatory grooming, self-harm or indeed suicide forums.

    There is a serious risk to children and young people, particularly women and girls, on which no debate has been allowed: the promulgation of gender ideology pushed by Mermaids and other so-called charities, which has created a toxic online environment that silences genuine professional concern, amplifies unquestioned affirmation and brands professional therapeutic concern, such as that of James Esses, a therapist and co-founder of Thoughtful Therapists, as transphobic. That approach, a non-therapeutic and affirmative model, has been promoted and fostered online.

    The reality is that adolescent dysphoria is a completely normal thing. It can be a response to disruption from adverse childhood experiences or trauma, it can be a feature of autism or personality disorders or it can be a response to the persistence of misogynistic social attitudes. Dysphoria can present and manifest in many different ways, not just gender. If someone’s gender dysphoria persists even after therapeutic support, I am first in the queue to defend that person and ensure their wishes are respected and protected, but it is an absolute falsity to give young people information that suggests there is a quick-fix solution.

    It is not normal to resolve dysphoria with irreversible so-called puberty blockers and cross-sex hormones, or with radical, irreversible, mutilating surgery. Gender ideology is being reinforced everywhere online and, indeed, in our public services and education system, but it is anything but progressive. It attempts to stuff dysphoric or gender non-conforming young people into antiquated, regressive boxes of what a woman is and what a man is, and it takes no account of the fact that it is fine to be a butch or feminine lesbian, a femboy or a boy next door, an old duffer like me, an elite gay sportsman or woman, or anything in between.

    Transitioning will be right for some, but accelerating young people into an affirmative model is absolutely reckless. What do those who perpetuate this myth want to achieve? What is in it for them? Those are fundamental questions that we have to ask. The reality is that the affirmative model is the true conversion therapy—trans-ing away the gay and nullifying same-sex attraction.

    I urge all right hon. and hon. Members to watch the four-part documentary “Dysphoric” on YouTube. It is so powerful and shows the growing number of young people who have been transitioned rapidly into those services, and the pain, torment and regret that they have experienced through the irreversible effects of their surgery and treatments. The de-transitioners are bearing the impacts. There is no follow-up to such services, and those people are just left to get on with it. Quite often, their friends in the trans community completely abandon them when they detransition.

    I pay particular tribute to Sinead Watson and Ritchie Herron, who are both de-transitioners, for their courage and absolutely incredible resilience in dealing with this issue online and shining a light on this outrage. I also pay tribute to the LGB Alliance, For Women Scotland, and Sex Matters, which have done a huge amount of work to bring this matter to the fore.

    Mermaids—the organisation—continues to deny that there are any harms, co-morbidities or serious iatrogenic impacts from hormone treatment or radical surgery. That is a lie; it is not true. Mermaids has promoted the illegal availability of online medicines that do lasting, irreversible damage to young people.

    I pay tribute to the Government for the Cass review, which is beginning to shine a light on the matter. I welcome the interim report, but we as legislators must make a connection between what is happening online, how it is policed in society and the message that is given out there. We must link harm to online forums and organisations, as well as to frontline services.

    I point out with real regret that I came across a document being distributed through King’s College Hospital NHS Foundation Trust from an organisation called CliniQ, which runs an NHS clinic for the trans community. The document has lots of important safety and health advice, but it normalises self-harm as sexual

    “Play that involves blood, cutting and piercing.”

    It advises that trans-identifying females can go in

    “stealth if it is possible for them”

    to private gay clubs, and gives examples of how to obtain sex by deception. It is unacceptable that such information is provided on NHS grounds.

    Speaking out about this in Scotland has been a very painful experience for many of us. We have faced doxing, threats, harassment and vilification. In 2019, I raised my concerns about safeguarding with my colleagues in Government. A paper I wrote had this simple message: women are not being listened to in the gender recognition reform debate. I approached the then Cabinet Secretary for Social Security and Older People, Shirley-Anne Somerville, whose brief included equality. She was someone I had known for years and considered a friend; she knew my professional background, my family and, of course, my children. She told me that she shared my concerns—she has children of her own—but she instructed me to be silent. She personally threatened and attempted to bully friends of mine, insisting that they abandon me. I pay great tribute to Danny Stone and the Antisemitism Policy Trust for their support in guiding me through what was an incredibly difficult period of my life. I also pay tribute to the hon. Member for Brigg and Goole (Andrew Percy).

    I can see that you are anxious for me to close, Madam Deputy Speaker, so I will—[Interruption.] I will chance my arm a bit further, then.

    I am not on my pity pot here; this is not about me. It is happening all over Scotland. Women in work are being forced out of employment. If Governments north and south of the border are to tackle online harms, we must follow through with responsible legislation. Only last week, the First Minister of Scotland, who denied any validity to the concerns I raised in 2019, eventually admitted they were true. But her response must be to halt her premature and misguided legislation, which is without any protection for the trans community, women or girls. We must make the connection from online harms all the way through to meaningful legislation at every stage.

  • Jeremy Wright – 2022 Speech on the Online Safety Bill

    The speech made by Jeremy Wright, the Conservative MP for Kenilworth and Southam, in the House of Commons on 5 December 2022.

    I rise to speak to amendments 1 to 9 and new clause 1 in my name and the names of other hon. and right hon. Members. They all relate to the process of categorisation of online services, particularly the designation of some user-to-user services as category 1 services. There is some significance in that designation. In the Bill as it stands, perhaps the greatest significance is that only category 1 services have to concern themselves with so-called “legal but harmful” content as far as adults are concerned. I recognise that the Government have advertised their intention to modify the Bill so that users are offered instead mechanisms by which they can insulate themselves from such content, but that requirement, too, would only apply to category 1 services. There are also other obligations to which only category 1 services are subject—to protect content of democratic importance and journalistic content, and extra duties to assess the impact of their policies and safety measures on rights of freedom of expression and privacy.

    Category 1 status matters. The Bill requires Ofcom to maintain a register of services that qualify as category 1 based on threshold criteria set out in regulations under schedule 11 of the Bill. As schedule 11 stands, the Secretary of State must make those regulations, specifying threshold conditions, which Ofcom must then apply to designate a service as category 1. That is based only on the number of users of the service and its functionalities, which are defined in clause 189.

    Amendments 2 to 8 would replace the word “functionalities” with the word “characteristics”. This term is defined in amendment 1 to include not only functionalities—in other words, what can be done on the platform—but other aspects of the service: its user base; its business model; governance and other systems and processes. Incidentally, that definition of the term “characteristics” is already in the Bill in clause 84 dealing with risk profiles, so it is a definition that the Government have used themselves.

    Categorisation is about risk, so the amendments ask more of platforms and services where the greatest risk is concentrated; but the greatest risk will not always be concentrated in the functionality of an online service. For example, its user base and business model will also disclose a significant risk in some cases. I suggest that there should be broader criteria available to Ofcom to enable it to categorise. I also argue that the greatest risk is not always concentrated on the platforms with the most users. Amendment 9 would change schedule 11 from its current wording, which requires the meeting of both a scale and a functionality threshold for a service to be designated as category 1, to instead require only one or the other.

    Very harmful content being located on smaller platforms is an issue that has been discussed many times in consideration of the Bill. That could arise organically or deliberately, with harmful content migrating to smaller platforms to escape more onerous regulatory requirements. Amendment 9 would resolve that problem by allowing Ofcom to designate a service as category 1 based on its size or on its functionalities—or, better yet, on its broader characteristics.

    I do not want to take too many risks, but I think the Government have some sympathy with my position, based on the indicative amendments they have published for the further Committee stage they would like this Bill to have. I appreciate entirely that we are not discussing those amendments today, but I hope, Madam Deputy Speaker, you will permit me to make some brief reference to them, as some of them are on exactly the same territory as my amendments here.

    Some of those amendments that the Government have published would add the words “any other characteristics” to schedule 11 provisions on threshold conditions for categorisation, and define them in a very similar way to my amendment 1. They may ask whether that will answer my concerns, and the answer is, “Nearly.” I welcome the Government’s adding other characteristics to the consideration, not just of threshold criteria, but to the research Ofcom will carry out on how threshold conditions will be set in the first place, but I am afraid that they do not propose to change schedule 11, paragraph 1(4), which requires regulations made on threshold conditions to include,

    “at least one specified condition about number of users and at least one specified condition about functionality.”

    That means that to be category 1, a service must still be big.

    I ask the Minister to consider again very carefully a way in which we can meet the genuine concern about high harm on small platforms. The amendment that he is likely to bring forward in Committee will not yet do so comprehensively. I also observe in passing that the reference the Government make in those amendments to any other characteristics is to those that the Secretary of State considers relevant, not those that Ofcom considers relevant—but that is perhaps a conversation for another day.

    Secondly, I come on to the process of re-categorisation and new clause 1. It is broadly agreed in this debate that this is a fast-changing landscape; platforms can grow quickly, and the nature and scale of the content on them can change fast as well. If the Government are wedded to categorisation processes with an emphasis on scale, then the capacity to re-categorise a platform that is now category 2B but might become category 1 in the future will be very important.

    That process is described in clause 83 of the Bill, but there are no timeframes or time limits for the re-categorisation process set out. We can surely anticipate that some category 2B platforms might be reluctant to take on the additional obligations of category 1 status, and may not readily acquiesce in re-categorisation but instead dispute it, including through an appeal to the tribunal provided for in clause 139. That would mean that re-categorisation could take some time after Ofcom has decided to commence it and communicate it to the relevant service. New clause 1 is concerned with what happens in the meantime.

    To be clear, I would not expect the powers that new clause 1 would create to be used often, but I can envisage circumstances where they would be beneficial. Let us imagine that the general election is under way—some of us will do that with more pleasure than others. Category 1 services have a particular obligation to protect content of democratic importance, including of course by applying their systems and processes for moderating content even-handedly across all shades of political opinion. There will not be a more important time for that obligation than during an election.

    Let us assume also that a service subject to ongoing re-categorisation, because in Ofcom’s opinion it now has considerable reach, is not applying that even-handedness to the moderation of content or even to its removal. Formal re-categorisation and Ofcom powers to enforce a duty to protect democratic content could be months away, but the election will be over in weeks, and any failure to correct disinformation against a particular political viewpoint will be difficult or impossible to fully remedy by retrospective penalties at that point.

    New clause 1 would give Ofcom injunction-style powers in such a scenario to act as if the platform is a category 1 service where that is,

    “necessary to avoid or mitigate significant harm.”

    It is analogous in some ways to the powers that the Government have already given to Ofcom to require a service to address a risk that it should have identified in its risk assessment but did not because that risk assessment was inadequate, and to do so before the revised risk assessment has been done.

    Again, the Minister may say that there is an answer to that in a proposed Committee stage amendment to come, but I think the proposal that is being made is for a list of emerging category 1 services—those on a watchlist, as it were, as being borderline category 1—but that in itself will not speed up the re-categorisation process. It is the time that that process might take that gives rise to the potential problem that new clause 1 seeks to address.

    I hope that my hon. Friend the Minister will consider the amendments in the spirit they are offered. He has probably heard me say before—though perhaps not, because he is new to this, although I do not think anyone else in the room is—that the right way to approach this groundbreaking, complex and difficult Bill is with a degree of humility. That is never an easy sell in this institution, but I none the less think that if we are prepared to approach this with humility, we will all accept, whether Front Bench or Back Bench, Opposition or Government, that we will not necessarily get everything right first time.

    Therefore, these Report stages in this Bill of all Bills are particularly important to ensure that where we can offer positive improvements, we do so, and that the Government consider them in that spirit of positive improvement. We owe that to this process, but we also owe it to the families who have been present for part of this debate, who have lost far more than we can possibly imagine. We owe it to them to make sure that where we can make the Bill better, we make it better, but that we do not lose the forward momentum that I hope it will now have.

  • John McDonnell – 2022 Speech on the Online Safety Bill

    The speech made by John McDonnell, the Labour MP for Hayes and Harlington, in the House of Commons on 5 December 2022.

    The debate so far has been serious, and it has respected the views that have been expressed not only by Members from across the House, on a whole range of issues, but by the families joining us today who have suffered such a sad loss.

    I wish to address one detailed element of the Bill, and I do so in my role as secretary of the National Union of Journalists’ cross-party parliamentary group. It is an issue to which we have returned time and again when we have been debating legislation of this sort. I just want to bring it to the attention of the House; I do not intend to divide the House on this matter. I hope that the Government will take up the issue, and then, perhaps, when it goes to the other place, it will be resolved more effectively than it has been in this place. I am happy to offer the NUJ’s services in seeking to provide a way forward on this matter.

    Many investigative journalists base their stories on confidential information, disclosed often by whistleblowers. There has always been an historic commitment—in this House as well—to protect journalists’ right to protect their sources. It has been at the core of the journalists’ code of practice, promoted by the NUJ. As Members know, in some instances, journalists have even gone to prison to protect their sources, because they believe that it is a fundamental principle of journalism, and also a fundamental principle of the role of journalism in protecting our democracy.

    The growth in the use of digital technology in journalism has raised real challenges in protecting sources. In the case of traditional material, a journalist has possession of it, whereas with digital technology a journalist does not own or control the data in the same way. Whenever legislation of this nature is discussed, there has been a long-standing, cross-party campaign in the House to seek to protect this code of practice of the NUJ and to provide protection for journalists to protect their sources and their information. It goes back as far as the Police and Criminal Evidence Act 1984. If Members can remember the operation of that Act, they will know that it requires the police or the investigatory bodies to obtain a production order, and requires notice to be given to journalists of any attempt to access information. We then looked at it again in the Investigatory Powers Act 2016. Again, what we secured there were arrangements by which there should be prior approval by a judicial commissioner before an investigatory body can seek communications data likely to compromise a journalist’s sources. There has been a consistent pattern.

    To comply with Madam Deputy Speaker’s attempt to constrain the length of our speeches, let me briefly explain to Members what amendment 204 would do. It is a moderate probing amendment, which seeks to ask the Government to look again at this matter. When Ofcom is determining whether to issue a notice to intervene, or when it is issuing a notice to a tech platform to monitor user-to-user content, the amendment asks it to consider the level of risk of the specified technology accessing, retaining or disclosing the identity of any confidential journalistic source or confidential journalistic material. The amendment stands in the tradition of the other amendments that have been tabled in this House and that successive Governments have agreed to. It puts the onus on Ofcom to consider how to ensure that technologies can be limited to the purpose that was intended. It should not result in massive data-harvesting operations, which were referred to earlier, or become a back-door way for investigating authorities to obtain journalistic data or material without official judicial approval.

    Mr Davis

    I rise in support of the right hon. Gentleman. The production order structure, as it stands, is already being abused: I know of a case under way today. The measure should be stronger and clearer—the Bill contains almost nothing on this—on the protection of journalists, whistleblowers and all people acting for public interest reasons.

    John McDonnell

    The right hon. Gentleman and I have some form on this matter going back a number of years. The amendment is in the tradition that this House has followed of passing legislation to protect journalists, their sources and their material. I make this offer again to the Minister: the NUJ is happy to meet and discuss how the matter can be resolved effectively through the tabling of an amendment in the other place or discussions around codes of practice. However, I emphasise to the Minister that, as we have found previously, the stronger protection is through a measure in the Bill itself.

  • David Davis – 2022 Speech on the Online Safety Bill

    The speech made by David Davis, the Conservative MP for Haltemprice and Howden, in the House of Commons on 5 December 2022.

    I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.

    To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.

    My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.

    Opposition Members will not agree, but I am grateful that the Government decided to remove the legal but harmful clause. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but on the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not “legal but harmful” but the question, “Is it harmful enough to be made illegal?” Obviously, self-harm material is harmful enough to be made illegal.

    The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.

    The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.

    One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.

    Damian Collins

    Rather than it being magic, does my right hon. Friend agree that a company could not ignore it if we demystified the process? If we say that there is an existing technology that is available and proven to work, the company would have to explain why it is not using that technology or something better.

    Mr Davis

    I will come back to that in some detail.

    The first time I used encryption it was one-time pads and Morse, so it was a long time ago. The last time was much more recent. The issue here is that clause 104 causes pressure by requiring real-time decryption. The only way to do that is by either having it unencrypted on the server, having it weakly encrypted or creating a back door. I am talking not about metadata, which I will come back to in a second, but about content. In that context, if the content needs to be rapidly accessible, it is bound to lead to weakened encryption.

    This is perhaps a debate for a specialist forum, but it is very dangerous in a whole series of areas. What do we use encryption for? We use it for banking, for legal and privileged conversations, and for conversations with our constituents and families. I could go on and on about the areas in which encryption matters.

    Adam Afriyie (Windsor) (Con)

    My right hon. Friend will be aware that the measure will encompass every single telephone conversation when it switches to IP. That is data, too.

    Mr Davis

    That is correct. The companies cannot easily focus the measure on malicious content alone, and that is the problem. With everything we do in dealing with enforcing the law, we have to balance the extent to which we make the job of the law enforcement agency possible—ideally, easy—against the rights we take away from innocent citizens. That is the key balance. Many bad things happen in households but we do not require people to live in houses with glass walls. That shows the intrinsic problem we have.

    That imposition on privacy cannot sit comfortably with anybody who takes privacy rights seriously. As an aside, let me say to the House that the last thing we need, given that we want something to happen quickly, or at least effectively and soon, is to find ourselves in a Supreme Court case or a European Court case on privacy imposition. I do not think that is necessary. That is where I think the argument stands. If we end up in a case like that, it will not be about paedophiles or criminals; it will be about the weakening of the encryption of the data of an investigative journalist or a whistleblower. That is where it will come back to haunt us and we have to put that test on it. That is my main opening gambit.

    I am conscious that everybody has spoken for quite a long time, so I am trying to make this short. However, the other thing I wish to say is that we have weapons, particularly in terms of metadata. If I recall correctly, Facebook takes down some 300,000 sites for paedophile content alone and millions for other reasons; so the use of metadata is very important. Europol carried out a survey of what was useful in terms of the data arising from the internet, social media and the like, and content was put at No. 7, after all sorts of other data. I will not labour the point, but I just worry about this. We need to get it right and so far we have taken more of a blunderbuss approach than a rifle shot. We need to correct that, which is what my two amendments are about.

    The other thing I briefly wish to talk about is new clause 16, which a number of people have mentioned in favourable terms. It will make it an offence to encourage or assist another person to self-harm—that includes suicide. I know that the Government have difficulties getting their proposed provisions right in how they interact with other legislation—the suicide legislation and so on. I will be pressing the new clause to a vote. I urge the Government to take this new clause and to amend the Bill again in the Lords if it is not quite perfect. I want to be sure that this provision goes into the legislation. It comes back to the philosophical distinction involving “legal but harmful”, a decision put first in the hands of a Minister and then in the hands of an entirely Whip-chosen statutory instrument Committee, neither of which is a trustworthy vehicle for the protection of free speech. My approach will take it from there and put it in the hands of this Chamber and the other place. Our control, in as much as we control the internet, should be through primary legislation, with maximum scrutiny, exposure and democratic content. If we do it in that way, nobody can argue with us and we will be world leaders, because we are pretty much the only people who can do that.

    As I say, we should come back to this area time and time again, because this Bill will not be the last shot at it. People have talked about the “grey area”. How do we assess a grey area? Do I trust Whitehall to do it? No, I do not; good Minister though we have, he will not always be there and another Minister will be in place. We may have the British equivalent of Trump one day, who knows, and we do not want to leave this provision in that context. We want this House, and the public scrutiny that this Chamber gets, to be in control of it.

    Sir William Cash (Stone) (Con)

    Many years ago, in the 1970s, I was much involved in the Protection of Children Bill, which was one of the first steps in condemning and making illegal explicit imagery of children and their involvement in the making of such films. We then had the broadcasting Acts and the video Acts, and I was very much involved at that time in saying that we ought to prohibit such things in videos and so on. I got an enormous amount of flak for that. We have now moved right the way forward and it is tremendous to see not only the Government but the Opposition co-operating on this theme. I very much sympathise not only with what my right hon. Friend has just said—I am very inclined to support his new clause for that reason—but with what the right hon. Member for Barking (Dame Margaret Hodge) said. I was deeply impressed by the way in which she presented the argument about the personal liability of directors. We cannot distinguish between a company and the people who run it, and I am interested to hear what the Government have to say in reply to that.

    Mr Davis

    I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.

    I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place the responsibility, and how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.

    There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic and it is algorithmic with an explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators—we have to get this right.

    Damian Collins

    My right hon. Friend is right to raise this important point. The big area here is not only clickbait, but AI-generated recommendation tools, such as a news feed on Facebook or “next up” on YouTube. Mitigating the illegal content on the platforms is not just about content moderation and removal; it is about not promoting it.

    Mr Davis

    My hon. Friend is exactly right about that. I used the example of clickbait as shorthand. The simple truth is that “AI-generated” is also a misnomer, because these things are not normally AI; they are normally algorithms written specifically to recommend and to maximise returns and revenue. We are not surprised at that. Why should we be? After all, these are commercial companies we are talking about and that is what they are going to do. Every commercial company in the world operates within a regulatory framework that prevents them from making profits out of antisocial behaviour.

    Aaron Bell (Newcastle-under-Lyme) (Con)

    On the AI point, let me say that the advances we have seen over the weekend are remarkable. I have just asked OpenAI.com to write a speech in favour of the Bill and it is not bad. That goes to show that the risks to people are not just going to come from algorithms; people are going to be increasingly scammed by AI. We need a Bill that can adapt with the times as we move forward.

    Mr Davis

    Perhaps we should run my speech against—[Laughter.] I am teasing. I am coming to the end of my comments, Madam Deputy Speaker. The simple truth is that these mechanisms—call them what you like—are controllable if we put our mind to it. It requires subtlety, testing the thing out in practice and enormous expert input, but we can get this right.

  • Sarah Champion – 2022 Speech on the Online Safety Bill

    Sarah Champion – 2022 Speech on the Online Safety Bill

    The speech made by Sarah Champion, the Labour MP for Rotherham, in the House of Commons on 5 December 2022.

    I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

    The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

    Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, that is only the cases that are recorded. The number is going to be far greater than that. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide their private information, such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on TV services such as Netflix, people might be able to access something only in the US, so they could use a VPN to circumnavigate that to enable them to watch it in this country.

    During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation uses a blocking list to remove this content from internet service providers, but users with a VPN are usually not protected through the provisions they use. It also concerns me that a VPN could be used in court to circumnavigate this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

    My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

    New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogenous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which are placing their interests over children’s needs.

    I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.

    Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections to women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

    In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is instated so that a good proportion of the revenue made from the Bill’s provisions goes towards funding these vital services.

    The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

    I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

    The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

    If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—

    Madam Deputy Speaker (Dame Eleanor Laing)

    Order.

    Sarah Champion

    Thank you, Madam Deputy Speaker. The Minister has the potential to do so much with this Bill. I urge him to do it, and to do it speedily, because that is what this country really needs.

  • Damian Collins – 2022 Speech on the Online Safety Bill

    Damian Collins – 2022 Speech on the Online Safety Bill

    The speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 5 December 2022.

    As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

    The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

    It is easy to consider the Bill on Report as it is now, thinking about some areas where Members think it goes too far and other areas where Members think it does not quite go far enough, but let us not lose sight of the fact that we are establishing a world-leading regulatory system. It is not the first in the world, but it goes further than any other system in the world in the scope of offences. Companies will have to show priority activity in identifying and mitigating the harm of the unlawful activity. A regulator will be empowered to understand what is going on inside the companies, challenge them on the way that they enforce their codes and hold them to account for that. We currently have the ability to do none of those things. Creating a regulator with that statutory power and the power to fine and demand evidence and information is really important.

    The case of Molly Russell has rightly been cited as so important many times in this debate. One of the hardships was not just the tragedy that the family had to endure and the cold, hard, terrible fact—presented by the coroner—that social media platforms had contributed to the death of their daughter, but that it took years for the family and the coroner, going about his lawful duty, to get hold of the information that was required and to bring it to people’s attention. I have had conversations with social media companies about how they combat self-harm and suicide, including with TikTok about what they were doing to combat the “blackout challenge”, which has led to the death of children in this country and around the world. They reassure us that they have systems in place to deal with that and that they are doing all that they can, but we do not know the truth. We do not know what they can see and we have no legal power to readily get our hands on that information and publish it. That will change.

    This is a systems Bill—the hon. Member for Pontypridd (Alex Davies-Jones) and I have had that conversation over the Dispatch Boxes—because we are principally regulating the algorithms and artificial intelligence that drive the recommendation tools on platforms. The right hon. Member for Barking spoke about that, as have other Members. When we describe pieces of content, they are exemplars of the problem, but the biggest problem is the systems effect. If people posted individually and organically, and that sat on a Facebook page or a YouTube channel that hardly anyone saw, the amount of harm done would be very small. The fact is, however, that those companies have created systems to promote content to people by data-profiling them to keep them on their site longer and to get them coming back more frequently. That has been done for a business reason—to make money. Most of the platforms are basically advertising platforms making money out of other people’s content.

    That point touches on every issue that Members have raised so far today. The Bill squarely makes the companies fully legally liable for their business activity, what they have designed to make money for themselves and the detriment that that can cause other people. That amplification of content, giving people more of what they think they want, is seen as a net positive, and people think that it therefore must always be positive, but it can be extremely damaging and negative.

    That is why the new measures that the Government are introducing on combating self-harm and suicide are so important. Like other Members, I think that the proposal from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) is important, and I hope that the Government’s amendment will address the issue fully. We are talking not just about the existing, very high bar in the law on assisting suicide, which almost means being present and part of the act. The act of consistently, systematically promoting content that exacerbates depression, anxiety and suicidal feelings among anyone, but particularly young people, must be an offence in law and the companies must be held to account for that.

    When Ian Russell spoke about his daughter’s experience, I thought it was particularly moving when he said that police officers were not allowed to view the content on their own. They worked in shifts for short periods of time, yet that content was pushed at a vulnerable girl by a social media platform algorithm when she was on her own, probably late at night, with no one else to see it and no one to protect her. That was done in a systematic way, consistently, over a lengthy period of time. People should be held to account for that. It is outrageous—it is disgusting—that that was allowed to happen. Preventing that is one of the changes that the Bill will help us to deliver.

    Mr David Davis

    I listened with interest to the comments of the right hon. Member for Barking (Dame Margaret Hodge) about who should be held responsible. I am trying to think through how that would work in practice. Frankly, the adjudication mechanism, under Ofcom or whoever it might be, would probably take a rather different view in the case of a company: bluntly, it would go for “on the balance of probabilities”, whereas with an individual it might go for “beyond reasonable doubt”. I am struggling—really struggling—with the question of which would work best. Does my hon. Friend have a view?

    Damian Collins

    My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

    There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

    Dame Margaret Hodge rose—

    Damian Collins

    I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

    Dame Margaret Hodge

    I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

    Damian Collins

    I thank the right hon. Lady for that information.

    Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be de-encrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

    If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

    Ian Paisley (North Antrim) (DUP)

    I thank the hon. Member for jointly sponsoring my private Member’s Bill, the Digital Devices (Access for Next of Kin) Bill. Does he agree that the best way to make progress is to ensure open access for the next of kin to devices that a deceased person leaves behind?

    Damian Collins

    The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

    It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

    My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

    Like other Members who served on the Joint Committee —I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and the hon. Member for Ochil and South Perthshire (John Nicolson) raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

    Dean Russell

    May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

    Damian Collins

    I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

    There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority legal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

    We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority legal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that while such examples cannot be posted on a Facebook page, if money is put behind them and they are run as advertisements they can.

    Priti Patel

    My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.

    Damian Collins rose—

    Mrs Natalie Elphicke (Dover) (Con)

    Will my right hon. Friend give way?

    Damian Collins

    Of course.

    Mrs Elphicke

    I am grateful to my right hon. Friend for raising this and for his support in this important area that affects our constituencies so much. I will be speaking later to the details of this, which go beyond the advertising payment to the usage, showing and sharing of this. As he has mentioned schedule 7, does he agree that there is—as I have set out in my amendment—a strong case for making sure that it covers all those illegal immigration and modern slavery offences, given the incredible harm that is being caused and that we see on a day-to-day basis?

    Damian Collins

    I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

    On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way or through the online advertising review or through new legislation, this is a loophole that needs to be closed.

  • Margaret Hodge – 2022 Speech on the Online Safety Bill

    Margaret Hodge – 2022 Speech on the Online Safety Bill

    The speech made by Margaret Hodge, the Labour MP for Barking, in the House of Commons on 5 December 2022.

    I pay tribute to all the relatives and families of the victims of online abuse who have chosen to be with us today. I am sure that, for a lot of you, our debate is very dry and detached, yet we would not be here but for you. Our hearts are with you all.

    I welcome the Minister to his new role. I hope that he will guide his Bill with the same spirit set by his predecessors, the right hon. Member for Croydon South (Chris Philp) and the hon. Member for Folkestone and Hythe (Damian Collins), who is present today and has done much work on this issue. Both Ministers listened and accepted ideas suggested by Back Benchers across the House. As a result, we had a better Bill.

    We all understand that this is groundbreaking legislation, and that it therefore presents us with complex challenges as we try to legislate to achieve the best answers to the horrific, fast-changing and ever-growing problems of online abuse. Given that complexity, and given that this is our first attempt at regulating online platforms, the new Minister would do well to build on the legacy of his predecessors and approach the amendments on which there are votes tonight as wholly constructive. The policies we are proposing enjoy genuine cross-party support, and are proposed to help the Minister not to cause him problems.

    Let me express particular support for new clauses 45 to 50, in the name of the right hon. Member for Basingstoke (Dame Maria Miller), which tackle the abhorrent misogynistic problem of intimate image abuse, and amendments 1 to 14, in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), which address the issue of smaller platforms falling into category 2, which is now outside the scope of regulations. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosque in Christchurch, New Zealand is probably the most egregious example, as the individual concerned used 8chan to plan his attack.

    New clause 15, which I have tabled, seeks to place responsibility for complying with the new law unequivocally on the shoulders of individual directors of online platforms. As the Bill stands, criminal liability is enforced only when senior tech executives fail to co-operate with information requests from Ofcom. I agree that is far too limited, as the right hon. and learned Member for Kenilworth and Southam said. The Bill allows executives to choose and name the individual who Ofcom will hold to account, so that the company itself, not Ofcom, decides who is liable. That is simply not good enough.

    Let me explain the thinking behind new clause 17. The purpose of the Bill is to change behaviour. Our experience in many other spheres of life tells us that the most effective way of achieving such change is to make individuals at the top of an organisation personally responsible for the behaviour of that organisation. We need to hold the chairmen and women, directors and senior executives to account by making those individuals personally liable for the practices and actions of their organisation.

    Let us look at the construction industry, for example. Years ago, workers dying on building sites was an all too regular occurrence. Only when we reformed health and safety legislation and made the directors of construction companies personally responsible and liable for health and safety standards on their sites did we see an incredible 90% drop in deaths on building sites. Similarly, when we introduced corporate and director liability offences in the Bribery Act 2010, companies stopped trying to bribe their way into contracts.

    It is not that we want to lock up directors of construction companies or trading companies, or indeed directors of online platforms; it is that the threat of personal criminal prosecution is the most powerful and effective way of changing behaviour. It is just the sort of deterrent tool that the Bill needs if it is to protect children and adults from online harms. That is especially important in this context, because the business model that underpins the profits that platforms enjoy encourages harmful content. The platforms need to encourage traffic on their sites, because the greater the traffic, the more attractive their sites become to advertisers; and the more advertising revenue they secure, the higher the profits they enjoy.

    Harmful content attracts more traffic and so supports the platforms’ business objectives. We know that from studies such as the one by Harvard law professor Jonathan Zittrain, which showed that posts that tiptoe close to violating platforms’ terms and conditions generate far more engagement. We also know it from Mark Zuckerberg’s decisions in the lead-up to and just after the 2020 presidential elections, when he personally authorised tweaks to the Facebook algorithm to reduce the spread of election misinformation. However, after the election, despite officials at Facebook asking for the change to stay, he reinstated the previous algorithm. An internal Facebook memo revealed that the tweak preventing fake news had led to “a decrease in sessions”, which made his platform less attractive to advertisers and hit his profits. Restoring fake news helped restore his profits.

    The incentives in online platforms’ business models promote rather than prevent online harms, and we will not break those incentives by threatening to fine companies. We know from our experience elsewhere that, even at 10% of global revenue, such fines will inevitably be viewed as a cost to business, which will simply be passed on by raising advertising charges. However, we can and will break the incentives in the business model if we make Mark Zuckerberg or Elon Musk personally responsible for breaking the rules. It will not mean that we will lock them up, much as some of us might be tempted to do so. It will, however, provide that most powerful incentive that we have as legislators to change behaviour.

    Furthermore, we know that the directors of online platforms personally take decisions in relation to harmful content, so they should be personally held to account. In 2018, Facebook’s algorithm was promoting posts for users in Myanmar that incited violence against protesters. The whistleblower Frances Haugen showed evidence that Facebook was aware that its engagement-based content was fuelling the violence, but it continued to roll it out on its platforms worldwide without checks. Decisions made at the top directly resulted in ethnic violence on the ground. That same year, Zuckerberg gave a host of interviews defending his decision to keep Holocaust denial on his platform, saying he did not believe that posts should be taken down for people getting it wrong. The debate continued for two years until 2020, when, only after months of protest, he finally decided to remove that abhorrent content.

    In what world do we live where overpaid executives running around in their jeans and sneakers are allowed to make decisions on the hoof about how their platforms should be regulated without being held to account for their actions?

    Mr David Davis

    The right hon. Lady and I have co-operated to deal with international corporate villains, so I am interested in her proposal. However, a great number of these actions are taken by algorithms—I speak as someone who was taken down by a Google algorithm—so what happens then? I see no reason why we should not penalise directors, but how do we establish culpability?

    Dame Margaret Hodge

    That is for an investigation by the appropriate enforcement agency—Ofcom et al.—and if there is evidence that culpability rests with the managing director, the owner or whoever, they should be prosecuted. It is as simple as that. A case would have to be established through evidence, and that should be carried out by the enforcement agency. I do not think that this is any different from any other form of financial or other crime. In fact, it was my experience in that field that brought me to this conclusion.

    John Penrose (Weston-super-Mare) (Con)

    The right hon. Lady is making a powerful case, particularly on the effective enforcement of rules to ensure that they bite properly and that people genuinely pay attention to them. She gave the example of a senior executive talking about whether people should be stopped for getting it wrong—I think the case she mentioned was Holocaust denial—by making factually inaccurate statements or allowing factually inaccurate statements to persist on their platform. May I suggest that her measures would be even stronger if she were to support new clause 34, which I have tabled? My new clause would make factual inaccuracy a wrong, to be prevented and pursued by the kinds of regulators she is talking about. It would be a much stronger basis on which her measure could then bite.

    Dame Margaret Hodge

    Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but can I just make a more general point, because it speaks to the point about legal but harmful? What I really fear with the legal but harmful rule is that we create more and more laws to make content illegal and that, ironically, locks up more and more people, rather than creating structures and systems that will prevent the harm occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would love us to have kept to the legal but harmful route.

    We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

    The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

    “The only person that I’ve ever come across in this whole world…that thought that content”—

    the content that Molly viewed—

    “was safe was…Meta.”

    There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service, for instance a failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

    I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

  • Priti Patel – 2022 Speech on the Online Safety Bill


    The speech made by Priti Patel, the Conservative MP for Witham, in the House of Commons on 5 December 2022.

    Before I speak to specific clauses I pay tribute to all the campaigners, particularly the families who have campaigned so hard to give their loved ones a voice through this Bill and to change our laws. Having had some prior involvement in the early stages of this Bill three years ago as Home Secretary, I also pay tribute to many of the officials and Members of this House on both sides who have worked assiduously on the construction, development and advancement of this Bill. In particular, I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and the work of the Joint Committee; when I was Home Secretary we had many discussions about this important work. I also thank the Minister for the assiduous way in which he has handled interventions and actually furthered the debate with this Bill. Many Government Departments have been closely involved and engaged in this work.

    The victims must be at the heart of everything that we do now to provide safeguards and protections. Children and individuals have lost their lives because of the online space. We know there is a great deal of good in the online space, but also a great deal of harm, and that must unite us all in delivering this legislation. We have waited a long time for this Bill, but we must come together, knowing that this is foundational legislation, which will have to be improved and developed alongside the technology, and that there is much more work to do.

    I start by focusing on a couple of the new clauses, beginning with Government new clause 11 on end-to-end encryption. The House will not be surprised by my background in dealing with end-to-end encryption, particularly the harmful content, the types of individuals and the perpetrators who hide behind end-to-end encryption. We must acknowledge the individuals who harm children or who peddle terrorist content through end-to-end encryption while recognising that encryption services are important to protect privacy.

    There is great justification for encryption—business transactions, working for the Government and all sorts of areas of importance—but we must acknowledge in this House that there is more work to do, because these services are being used by those who would do harm to our country, threaten our national interest or threaten the safety of young people and children in particular. We know for a fact that there are sick-minded individuals who seek to abuse and exploit children and vulnerable adults. The Minister will know that, and I am afraid that many of us do. I speak now as a constituency Member of Parliament, and one of my first surgery cases back in 2010 was the tragic case of a mother who came to see me because her son had accessed all sorts of content. Thanks to the Bill, that content will now be ruled as harmful. There were other associated services that the family could not see and could not get access to, and encryption platforms are part of that.

    There are shocking figures, and I suspect that many of my colleagues in the House will be aware of them. Almost 100,000 reports relating to online child abuse were received by UK enforcement agencies in 2021 alone. That is shocking. The House will recognise my experience of working with the National Crime Agency, to which we must pay tribute for its work in this space, as we should to law enforcement more widely. Police officers and all sorts of individuals in law enforcement are, day in, day out, investigating these cases and looking at some of the most appalling images and content, all in the name of protecting vulnerable children, and we must pay tribute to them as well.

    It is also really shocking that that figure of 100,000 reports in 2021 alone is a 29% increase on the previous year. The amount of disturbing content is going up and up, and we are, I am afraid, looking only at the tip of the iceberg. So, I think it is absolutely right—and I will always urge the Government and whichever Secretary of State, be they in the Home Office, DCMS or the MOJ—to put the right measures and powers in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind the platforms of encryption and, importantly, bring those involved to face justice. End-to-end encryption is one thing, but we need end-to-end justice for victims and the prevention of the most heinous crimes.

    This is where we, as a House, must come together. I commend the hon. Member for Rotherham (Sarah Champion) in particular for her work relating to girls, everything to do with the grooming gangs, and the most appalling crimes against individuals, quite frankly. I will always urge colleagues to support the Bill, on which we will need to build going forward.

    I think I can speak with experience about the difficulties in drafting legislation—both more broadly and specifically in this area, which is complex and challenging. It is hard to foresee the multiplicity of circumstances. My hon. Friend the Member for Folkestone and Hythe was absolutely right to say in his comments to the SNP spokesman, the hon. Member for Ochil and South Perthshire (John Nicolson), that we have to focus on illegal content. It is difficult to get the balance right between the lawful and harmful. The illegal side is what we must focus on.

    I also know that many campaigners and individuals—they are not just campaigners, but families—have given heartbreaking and devastating accounts of their experiences of online harms. As legislators, we owe them this Bill, because although their suffering is not something that we will experience, the Bill must bring about the type of changes that we all want to see for everyone—children, adults and vulnerable individuals.

    May I ask the Minister for reassurances on the definition of “best endeavours”? As my right hon. Friend the Member for Basingstoke (Dame Maria Miller) touched on, when it comes to implementation, that will be the area where the rubber hits the road. That is where we will need to know that our collective work will be meaningful and will deliver protections—not just change, but protections. We must be honest about the many serious issues that will arise even after we pass the Bill—be it, God forbid, a major terrorist incident, or cases of child sexual exploitation—and there is a risk that, without clarity in this area, when a serious issue does arise, we may not know whether a provider undertook best endeavours. I think we owe it to everyone to ensure that we run a slide rule over every single granular detail.

    Cases and issues relating to best endeavours are debated and discussed extensively in court cases, coroners’ inquests and social services proceedings relating to child safeguarding issues, for example—all right hon. and hon. Members here will have experience of dealing with social services on behalf of their constituents in child protection cases—or, even worse, in serious case reviews or public inquiries that could come in future. I worry that in any response a provider could say, as a defence, that it did its best and had undertaken its best endeavours. That would be unacceptable. That would lead those affected to feel as if they had suffered an even greater injustice than the violations that they experienced. It is not clear whether best endeavours will be enough to change the culture, behaviour and attitudes of online platforms.

    I raise best endeavours in the context of changing attitudes and cultures because in many institutions that very issue is under live debate right now, whether in policing and attitudes towards women and girls, in how we protect other vulnerable groups, or in other services such as the fire service, which we have heard about recently. It is important that we ask those questions and have the scrutiny. We need to hear more about what constitutes best endeavours. Who will hold the providers to account? Ofcom clearly has a role. I know the Minister will do a very earnest and diligent job to provide answers, but the best endeavours principle goes wider than just the Minister on the Front Bench—it goes across the whole of Government. He knows that we will give him every backing to use his sharp elbows—perhaps I can help with my sharp elbows—to ensure that others are held to account.

    It will also be for Ofcom to give further details and guidance. As ever, the guidance will be so important. The guidance has to have teeth and statutory powers. It has to be able to put the mirror up and hold people to account. For example, would Ofcom be able, in its notices to providers, to instruct them to use specific technologies and programmes to tackle and end the exposure to exploitation, in relation to end-to-end encryption services, to protect victims? That is an open question, but one that could be put to Ofcom and could be an implementation test. There is no reason why we should not put a series of questions to Ofcom about how it would implement the Bill in practice.

    I would like to ask the Minister why vulnerable adults and victims of domestic abuse and violence against women and girls are not included. We must do everything we can in this House. This is not about being party political. When it comes to all our work on women and violence against women and girls, there should be no party politics whatsoever. We should ensure that what is right for one group is consistent and that the laws are strengthened. That will require the MOJ, as well as the Home Office, to ensure that the work is joined up in the right kind of way.

    It is right that powers are available for dealing with terrorist threats and tackling child sexual abuse thoroughly. There is some good work around terrorist content. There is excellent work in GIFCT, the Global Internet Forum to Counter Terrorism. The technology companies are doing great work. There is international co-operation in this space. The House should take some comfort in the fact that the United Kingdom leads the world in this space. We owe our gratitude to our intelligence and security agencies. I give my thanks to MI5 in particular for its work and to counter-terrorism policing, because they have led the world robustly in this work.

    Damian Collins

    My right hon. Friend makes an important point about this being a cross-Government effort. The Online Safety Bill creates a regulatory framework for the internet, but we need to make sure that we have the right offences in law clearly defined. Then it is easy to read them across into the legislation. If we do not have that, it is a job for the whole of Government.

    Priti Patel

    Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is not straightforward and easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.

    The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on CT is so important, because we have been consistent across the world with our Five Eyes partners. Again, there are good models out there that can be built upon. We will not fix all this through one Bill—we know that. This Bill is foundational, which is why we must move forward.

    On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.

    Rachel Maclean (Redditch) (Con)

    I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in removing reports and web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?

    Priti Patel

    My hon. Friend is absolutely right. I thank her for not just her intervention but her steadfast work when she was a Home Office Minister with responsibility for safeguarding. I also thank the Internet Watch Foundation; many of the statistics and figures that we have been using about child sexual abuse and exploitation content, and the take-downs, are thanks to its work. There is some important work to do there. The Minister will be familiar with its work—[Interruption.] Exactly that.

    We need the expertise of the Internet Watch Foundation, so it is about integrating that skillset. There is a great deal of expertise out there, including at the Internet Watch Foundation, at GIFCT on the CT side and, obviously, in our services and agencies. As my right hon. Friend the Member for Basingstoke said, it is crucial that we pool organisations’ expertise to implement the Bill, as we will not be able to create it all over again overnight in government.

    I thank my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) for tabling new clause 16, which would create new offences to address the challenges caused by those who promote, encourage and assist self-harm. That has been the subject of much of the debate already, which is absolutely right when we think about the victims and their families. In particular, I thank the Samaritans and others for their work to highlight this important issue. I do not need to dwell on the Samaritans’ report, because I think all hon. Members have read it.

    All hon. Members who spoke in the early stages of the Bill, which I did not because I was in government, highlighted this essential area. It is important to ensure that we do everything we can to address it in the right way. Like all right hon. and hon. Members, I pay tribute to the family of Molly Russell. There are no words for the suffering that they have endured, but their campaign of bravery, courage and fortitude aims to close every loophole to stop other young people being put at risk.

    Right hon. and hon. Members meet young people in schools every week, and we are also parents and, in some cases, grandparents. To know that this grey area leaves so many youngsters at risk is devastating, so we have almost a collective corporate duty to stand up and do the right thing. The long and short of it is that we need to be satisfied, when passing the Bill, that we are taking action to protect vulnerable people and youngsters who are susceptible to dangerous communications.

    As I have emphasised, we should also seek to punish those who cause and perpetrate this harm and do everything we can to protect those who are vulnerable, those with learning disabilities, those with mental health conditions, and those who are exposed to self-harm content. We need to protect them and we have a duty to do that, so I look forward to the Minister’s reply.

    I welcome new clauses 45 to 50, tabled by my right hon. Friend the Member for Basingstoke. I pay tribute to her for her work; she has been a strong campaigner for protecting the privacy of individuals, especially women and children, and for closing loopholes that have enabled people to be humiliated or harmed in the ways she has spoken about so consistently in the House. I am pleased that the Deputy Prime Minister, my right hon. Friend the Member for Esher and Walton (Dominic Raab), announced last month that the Government would table amendments in the other place to criminalise the sharing of intimate images, photographs and videos without consent; that is long overdue. When I was Home Secretary I heard the most appalling cases, with which my right hon. Friend the Member for Basingstoke will be familiar. I have met so many victims and survivors, and we owe it to them to do the right thing.

    It would be reassuring to hear not just from the Minister in this debate, but from other Ministers in the Departments involved in the Bill, to ensure they are consistent in giving voice to the issues and in working through their Ministries on the implementation—not just of this Bill, but of the golden thread that runs throughout the legislation. Over the last three years, we have rightly produced a lot of legislation to go after perpetrators, and support women and girls, including the Domestic Abuse Act 2021. We should use those platforms to stand up for the individuals affected by these issues.

    I want to highlight the importance of the provisions to protect women and girls, particularly the victims and survivors of domestic abuse and violence. Some abusive partners and ex-partners use intimate images in their possession; as the Minister said, that is coercive control, which means that the victim ends up living their life in fear. That is completely wrong. We have heard and experienced too many harrowing and shocking stories of women who have suffered as a result of the use of such images and videos. It must now be a priority for the criminal justice system, and the online platforms in particular, to remove such content. This is no longer a negotiation. Too many of us—including myself, when I was Home Secretary—have phoned platforms at weekends and insisted that they take down content. Quite frankly, I have then been told, “Twitter doesn’t work on a Saturday, Home Secretary” or “This is going to take time.” That is not acceptable. It is an absolute insult to the victims, and is morally reprehensible and wrong. The platforms must be held to account.

    Hon. Members will be well aware of the Home Office’s work on the tackling violence against women and girls strategy. I pay tribute to all colleagues, but particularly my hon. Friend the Member for Redditch (Rachel Maclean), who was the Minister at the time. The strategy came about after much pain, sorrow and loss of life, and it garnered an unprecedented 180,000 responses. The range of concerns raised was predominantly related to the issues we are discussing today. We can no longer stay mute and turn a blind eye. We must ensure that the safety of women in the public space offline—on the streets—and online is respected. We know how women feel about the threats. The strategy highlighted so much; I do not want to go over it again, as it is well documented and I have spoken about it in the House many times.

    It remains a cause of concern that the Bill does not include a specific VAWG code of practice. We want and need the Bill. We are not going to fix everything through it, but, having spent valued time with victims and survivors, I genuinely believe that we could move towards a code of practice. Colleagues, this is an area on which we should unite, and we should bring such a provision forward; it is vital.

    Let me say a few words in support of new clause 23, which was tabled by my right hon. Friend the Member for Basingstoke. I have always been a vocal and strong supporter of services for victims of crime, and of victims full stop. I think it was 10 years ago that I stood in this House and proposed a victims code of practice—a victims Bill is coming, and we look forward to that as well. This Government have a strong record of putting more resources into support for victims, including the £440 million over three years, but it is imperative that offenders—those responsible for the harm caused to victims—are made to pay, and it is absolutely right that they should pay more in compensation.

    Companies profiteering from online platforms where these harms are being perpetrated should be held to account. When companies fail in their duties and have been found wanting, they must make a contribution for the harm caused. There are ways in which we can do that. There has been a debate already, and I heard the hon. Member for Pontypridd (Alex Davies-Jones) speak for the Opposition about one way, but I think we should be much more specific now, particularly in individual cases. I want to see those companies pay the price for their crimes, and I expect the financial penalties issued to reflect the severity of the harm caused—we should support that—and that such money should go to supporting the victims.

    I pay tribute to the charities, advocacy groups and other groups that, day in and day out, have supported the victims of crime and of online harms. I have had an insight into that work from my former role in Government, but we should never underestimate how traumatic and harrowing it is. I say that about the support groups, but we have to magnify that multiple times for the victims. This is one area where we must ensure that more is done to provide extra resources for them. I look forward to hearing more from the Minister, but also from Ministers from other Departments in this space.

    I will conclude on new clause 28, which has already been raised, on the advocacy body for children. There is a long way to go with this—there really is. Children are harmed in just too many ways, and the harm is unspeakable. We have touched on this in earlier debates and discussions on the Bill, in relation to child users on online platforms, and there will be further harm. I gently urge the Government—if not today or through this Bill, then later—to think about how we can pull together the skills and expertise in organisations outside this House and outside Government that give voice to children who have nowhere else to go.

    This is not just about the online space; in the cases in the constituency of the hon. Member for Rotherham (Sarah Champion) and other constituencies, we have seen children being harmed out of public view. Statutory services failed them and the state failed them. It was state institutional failure that let children down in the cases in Rotherham and other child grooming cases. We could see that all over again in the online space, and I really urge the Government to make sure that that does not happen—and actually never happens again, because those cases are far too harrowing.

    There really is a lot here, and we must come together to ensure that the Bill comes to pass, but there are so many other areas where we can collectively put aside party politics and give voice to those who really need representation.

  • Julian Knight – 2022 Speech on the Online Safety Bill


    The speech made by Julian Knight, the Chair of the Culture Select Committee, in the House of Commons on 5 December 2022.

    I welcome the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), to his place. To say that he has been given a hospital pass in terms of this legislation is a slight understatement. It is very difficult to understand, and the ability he has shown at the Dispatch Box in grasping many of the major issues is to his credit. He really is a safe pair of hands and I thank him for that.

    Looking at the list of amendments, I think it is a bit of a hotchpotch, yet we are going to deal only with certain amendments today and others are not in scope. That shows exactly where we are with this legislation. We have been in this stasis now for five years. I remember that we were dealing with the issue when I joined the Digital, Culture, Media and Sport Committee, and it is almost three years since the general election when we said we would bring forward this world-leading legislation. We have to admit that is a failure of the political class in all respects, but we have to understand the problem and the realities facing my hon. Friend, other Ministers and the people from different Departments involved in drafting this legislation.

    We are dealing with companies that are more powerful than the oil barons and railway barons of the 19th century. These companies are more important than many states. The total value of Alphabet, for instance, is more than the total GDP of the Netherlands, and that is probably a low estimate of Alphabet’s global reach and power. These companies are, in many respects, almost new nation states in their power and reach, and they have been brought about by individuals having an idea in their garage. They still have that culture of having power without the consequences that flow from it.

    These companies have created wonderful things that enhance our lives in many respects through better communication and increased human knowledge, which we can barely begin to imagine, but they have done it with a skater boy approach—the idea that they are beyond the law. They had that enshrined in law in the United States, where they have effectively become nothing more than a megaphone or a noticeboard, and they have always relied on that. They are based or domiciled, in the main, in the United States, which is where they draw their legal power. They will always be in that position of power.

    We talk about 10% fines and even business interruption to ensure these companies have skin in the game, but we have to realise these businesses are so gigantic and of such importance that they could simply ignore what we do in this place. Will we really block a major social media platform? The only time something like that has been done was when a major social media platform blocked a country, if I remember rightly. We have to understand where we are coming from in that respect.

    This loose cannon, Elon Musk, is an enormously wealthy man, and he is quite strange, isn’t he? He is intrinsically imbued with the power of Silicon Valley and those new techno-masters of the universe. We are dealing with those realities, and this Bill is very imperfect.

    Mr David Davis

    My hon. Friend is giving a fascinating disquisition on this industry, but is not the implication that, in effect, these companies are modern buccaneer states and we need to do much more to legislate? I am normally a deregulator, but we need more than one Bill to do what we seek to do today.

    Julian Knight

    My right hon. Friend is correct. We spoke privately before this debate, and he said this is almost five Bills in one. There will be a patchwork of legislation, and there is a time limit. This is a carry-over Bill, and we have to get it on the statute book.

    This Bill is not perfect by any stretch of the imagination, and I take the Opposition’s genuine concerns about legal but harmful material. The shadow Minister mentioned the tragic case of Molly Russell. I heard her father being interviewed on the “Today” programme, and he spoke about how at least three quarters of the content he had seen that had prompted that young person to take her life had been legal but harmful. We have to stand up, think and try our best to ensure there is a safer space for young people. This Bill does part of that work, but only part. The work will be done in the execution of the Bill, through the wording on age verification and age assurance.

    Dame Maria Miller

    Given the complexities of the Bill, and given the Digital, Culture, Media and Sport Committee’s other responsibilities, will my hon. Friend join me in saying there should be a special Committee, potentially of both Houses, to keep this area under constant review? That review, as he says, is so badly needed.

    Julian Knight

    I thank my right hon. Friend for her question, which I have previously addressed. The problem is the precedent it would set. Any special Committee set up by a Bill would be appointed by the Whips, so we might as well forget about the Select Committee system. This is not a huge concern for the Digital, Culture, Media and Sport Committee, because the advent of any such special Committee would probably be beyond the next general election, and I am not thinking to that timeframe. I am concerned about the integrity of Parliament. The problem is that if we do that in this Bill, the next Government will come along and do it with another Bill and then another Bill. Before we know it, we will have a Select Committee system that is Whips-appointed and narrow in definition, and that cuts across something we all vote for.

    There are means by which we can have legislative scrutiny—that is the point I am making in my speech. I would very much welcome a Committee being set up after a year, temporarily, to carry out post-legislative scrutiny. My Committee has a Sub-Committee on disinformation and fake news, which could also look at this Bill going forward. So I do not accept my right hon. Friend’s point, but I appreciate completely the concerns about our needing proper scrutiny in this area. We must also not forget that any changes to Ofcom’s parameters can be put in a statutory instrument, which can be prayed against by the Opposition and thus we would have the scrutiny of the whole House in debate, which is preferable to having a Whips-appointed Committee.

    I have gone into quite a bit of my speech there, so I am grateful for that intervention in many respects. I am not going to touch on every aspect of this issue, but I urge right hon. and hon. Members in all parts of the House to think about the fact that although this is far from perfect legislation and it is a shame that we have not found a way to work through the legal but harmful material issue, we have to understand the parameters we are working in, in the real world, with these companies. We need to see that there is a patchwork of legislation, and the biggest way in which we can effectively let the social media companies know they have skin in the game in society—a liberal society that created them—is through competition legislation, across other countries and other jurisdictions. I am talking about our friends in the European Union and in the United States. We are working together closely now to come up with a suite of competition legislation. That is how we will be able to cover off some of this going forward. I will be supporting this Bill tonight and I urge everyone to do so, because, frankly, after five years I have had enough.

  • Alex Davies-Jones – 2022 Speech on the Online Safety Bill


    The speech made by Alex Davies-Jones, the Shadow Culture Minister, in the House of Commons on 5 December 2022.

    It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.

    We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.

    We must acknowledge that the situation has been made even harder by the huge changes that we have seen in the Government since the Bill was first introduced. Since its First Reading, it has been the responsibility of three different Ministers and two Secretaries of State. Remarkably, it has seen three Prime Ministers in post, too. We can all agree that legislation that will effectively keep people safe online urgently needs to be on the statute book: that is why Labour has worked hard and will continue to work hard to get the Bill over the line, despite the best efforts of this Government to kick the can down the road.

    The Government have made a genuine mess of this important legislation. Before us today are a huge number of new amendments tabled by the Government to their own Bill. We now know that the Government also plan to recommit parts of their own Bill—to send them back into Committee, where the Minister will attempt to make significant changes that are likely to damage even further the Bill’s ability to properly capture online harm.

    We need to be moving forwards, not backwards. With that in mind, I am keen to speak to a number of very important new clauses this afternoon. I will first address new clause 17, which was tabled by my right hon. Friend the Member for Barking (Dame Margaret Hodge), who has been an incredibly passionate and vocal champion for internet regulation for many years.

    As colleagues will be aware, the new clause will fix the frustrating gaps in Ofcom’s enforcement powers. As the Bill stands, it gives Ofcom the power to fine big tech companies only 10% of their turnover for compliance failures. It does not take a genius to recognise that that can be a drop in the ocean for some of the global multimillionaires and billionaires whose companies are often at the centre of the debate around online harm. That is why the new clause, which will mean individual directors, managers or other officers finally being held responsible for their compliance failures, is so important. When it comes to responsibilities over online safety, it is clear that the Bill needs to go further if the bosses in Silicon Valley are truly to sit up, take notice and make positive and meaningful changes.

    Sir Jeremy Wright (Kenilworth and Southam) (Con)

    I am afraid I cannot agree with the hon. Lady that the fines would be a drop in the ocean. These are very substantial amounts of money. In relation to individual director liability, I completely understand where the right hon. Member for Barking (Dame Margaret Hodge) is coming from, and I support a great deal of what she says. However, there are difficulties with the amendment. Does the hon. Member for Pontypridd (Alex Davies-Jones) accept that it would be very odd to end up in a position in which the only individual director liability attached to information offences, meaning that, as long as an individual director was completely honest with Ofcom about their wrongdoing, they would attract no individual liability?

    Alex Davies-Jones

    It may be a drop in the ocean to the likes of Elon Musk or Mark Zuckerberg—these multibillionaires who are taking over social media and using it as their personal plaything. They are not going to listen to fines; the only way they are going to listen, sit up and take notice is if criminal liability puts their neck on the line and makes them answer for some of the huge failures of which they are aware.

    The right hon. and learned Member mentions that he shares the sentiment of the amendment but feels it could be wrong. We have an opportunity here to put things right and put responsibility where it belongs: with the tech companies, the platforms and the managers responsible. In a similar way to what happens in the financial sector or in health and safety regulation, it is vital that people be held responsible for issues on their platforms. We feel that criminal liability will make that happen.

    Mr David Davis

    May I intervene on a point of fact? The hon. Lady says that fines are a drop in the ocean. The turnover of Google is $69 billion; 10% of that is just shy of $7 billion. That is not a drop in the ocean, even to Elon Musk.

    Alex Davies-Jones

    We are looking at putting people on the line. It needs to be something that people actually care about. Money does not matter to these people, as we have seen with the likes of Google, Elon Musk and Mark Zuckerberg; what matters to them is actually being held to account. Money may matter to Government Members, but it will be criminal liability that causes people to sit up, listen and take responsibility.

    While I am not generally in the habit of predicting the Minister’s response or indeed his motives—although my job would be a hell of a lot easier if I did—I am confident that he will try to peddle the line that it was the Government who introduced director liability for compliance failures in an earlier draft of the Bill. Let me be crystal clear in making this point, because it is important. The Bill, in its current form, makes individuals at the top of companies personally liable only when a platform fails to supply information to Ofcom, which misses the point entirely. Directors must be held personally liable when safety duties are breached. That really is quite simple, and I am confident that it would be effective in tackling harm online much more widely.

    We also support new clause 28, which seeks to establish an advocacy body to represent the interests of children online. It is intended to deal with a glaring omission from the Bill, which means that children who experience online sexual abuse will receive fewer statutory user advocacy protections than users of a post office or even passengers on a bus. The Minister must know that that is wrong and, given his Government’s so-called commitment to protecting children, I hope he will carefully consider a new clause which is supported by Members on both sides of the House as well as the brilliant National Society for the Prevention of Cruelty to Children. In rejecting new clause 28, the Government would be denying vulnerable children a strong, authoritative voice to represent them directly, so I am keen to hear the Minister’s justification for doing so, if that is indeed his plan.

    Members will have noted the bundle of amendments tabled by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) relating to Labour’s concerns about the unnecessary powers to overrule Ofcom that the Bill, as currently drafted, gives the Secretary of State of the day. During Committee evidence sessions, we heard from Will Perrin of the Carnegie UK Trust, who, as Members will know, is an incredibly knowledgeable voice when it comes to internet regulation. He expressed concern about the fact that, in comparison with other regulatory frameworks such as those in place for advertising, the Bill

    “goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 117.]

    Labour shares that concern. Ofcom must be truly independent if it is to be an effective regulator. Surely we have to trust it to undertake logical processes, rooted in evidence, to arrive at decisions once this regime is finally up and running. It is therefore hard to understand how the Government can justify direct interference, and I hope that the Minister will seriously consider amendments 23 to 30, 32, and 35 to 41.

    Before I address Labour’s main concerns about the Government’s proposed changes to the Bill, I want to record our support for new clauses 29 and 30, which seek to bring media literacy duties back into the scope of the Bill. As we all know, media literacy is the first line of defence when it comes to protecting ourselves against false information online. Prevention is always better than cure. Whether it is a question of viral conspiracy theories or Russian disinformation, Labour fears that the Government’s approach to internet regulation will create a two-tier internet, leaving some more vulnerable than others.

    However, I am sorry to say that the gaps in this Bill do not stop there. I was pleased to see that my hon. Friend the Member for Rotherham (Sarah Champion) had tabled new clause 54, which asks the Government to formally consider the impact that the use of virtual private networks will have on Ofcom’s ability to enforce its powers. This touches on the issue of future-proofing, which Labour has raised repeatedly in debates on the Bill. As we have heard from a number of Members, the tech industry is evolving rapidly, with concepts such as the metaverse changing the way in which we will all interact with the internet in the future. When the Bill was first introduced, TikTok was not even a platform. I hope the Minister can reassure us that the Bill will be flexible enough to deal with those challenges head-on; after all, we have waited far too long.

    That brings me to what Labour considers to be an incredible U-turn by the Government relating to amendment 239, which seeks to remove the new offence of harmful communications from the Bill entirely. As Members will know, the communications offence was designed by the Law Commission with the intention of introducing a criminal threshold for the most dangerous online harms. Indeed, in Committee it was welcome to hear the then Minister—the present Minister for Crime, Policing and Fire, the right hon. Member for Croydon South (Chris Philp)—being so positive about the Government’s consultation with the commission. In relation to clause 151, which concerns the communications offences, he even said:

    “The Law Commission is the expert in this kind of thing…and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do.” ––[Official Report, Online Safety Public Bill Committee, 21 June 2022; c. 558.]

    Less than six months down the line, we are seeing yet another U-turn from this Government, who are doing precisely the opposite of what was promised.

    Removing these communications offences from the Bill will have real-life consequences. It will mean that harmful online trends such as hoax bomb threats, abusive social media pile-ons and fake news such as encouraging people to drink bleach to cure covid will be allowed to spread online without any consequence.

    Christian Wakeford (Bury South) (Lab)

    No Jewish person should have to log online and see Hitler worship, but what we have seen in recent weeks from Kanye West has been nothing short of disgusting, from him saying “I love Hitler” to inciting online pile-ons against Jewish people, and this is magnified by the sheer number of his followers, with Jews actually being attacked on the streets in the US. Does my hon. Friend agree that the Government’s decision to drop the “legal but harmful” measures from the Bill will allow this deeply offensive and troubling behaviour to continue?

    Alex Davies-Jones

    I thank my hon. Friend for that important and powerful intervention. Let us be clear: everything that Kanye West said online is completely abhorrent and has no place in our society. It is not for any of us to glorify Hitler or praise him for what he did; that is absolutely abhorrent and it should never be online. Sadly, however, that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the Government’s swathes of changes to the Bill, meaning that it will be allowed to be seen by everybody. Kanye West has 30 million followers online. His followers will be able to look at, share, research and glorify that content without any consequence, because it will be freely available online.

    Dame Margaret Hodge

    Further to that point, it is not just that some of the content will be deeply offensive to the Jewish community; it could also harm wider society. Some further examples of postings that would be considered legal but harmful are likening vaccination efforts to Nazi death camps and alleging that NHS nurses should stand trial for genocide. Does my hon. Friend not agree that the changes the Government are now proposing will lead to enormous and very damaging impacts right through society?

    Alex Davies-Jones

    My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.

    We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.