Category: Culture

  • Sarah Champion – 2022 Speech on the Online Safety Bill


    The speech made by Sarah Champion, the Labour MP for Rotherham, in the House of Commons on 5 December 2022.

    I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

    The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

    Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, those are only the cases that are recorded. The number is going to be far greater than that. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide their private information, such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on TV services such as Netflix, people might be able to access something only in the US, so they could use a VPN to circumvent that restriction and watch it in this country.

    During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation uses a blocking list to remove this content from internet service providers, but users with a VPN typically bypass the protections that their providers apply. It also concerns me that a VPN could be used in court to circumvent this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

    My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

    New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogenous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which are placing their interests over children’s needs.

    I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.

    Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections to women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

    In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is instated so that a good proportion of the revenue made from the Bill’s provisions goes towards funding these vital services.

    The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

    I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

    The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

    If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—

    Madam Deputy Speaker (Dame Eleanor Laing)

    Order.

    Sarah Champion

    Thank you, Madam Deputy Speaker. The Minister has the potential to do so much with this Bill. I urge him to do it, and to do it speedily, because that is what this country really needs.

  • Damian Collins – 2022 Speech on the Online Safety Bill


    The speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 5 December 2022.

    As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

    The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

    It is easy to consider the Bill on Report as it now stands, thinking about some areas where Members think it goes too far and other areas where Members think it does not quite go far enough, but let us not lose sight of the fact that we are establishing a world-leading regulatory system. It is not the first in the world, but it goes further than any other system in the world in the scope of offences. Companies will have to show priority activity in identifying and mitigating the harm of the unlawful activity. A regulator will be empowered to understand what is going on inside the companies, challenge them on the way that they enforce their codes and hold them to account for that. We currently have the ability to do none of those things. Creating a regulator with that statutory power and the power to fine and demand evidence and information is really important.

    The case of Molly Russell has rightly been cited as so important many times in this debate. One of the hardships was not just the tragedy that the family had to endure and the cold, hard, terrible fact—presented by the coroner—that social media platforms had contributed to the death of their daughter, but that it took years for the family and the coroner, going about his lawful duty, to get hold of the information that was required and to bring it to people’s attention. I have had conversations with social media companies about how they combat self-harm and suicide, including with TikTok about what they were doing to combat the “blackout challenge”, which has led to the death of children in this country and around the world. They reassure us that they have systems in place to deal with that and that they are doing all that they can, but we do not know the truth. We do not know what they can see and we have no legal power to readily get our hands on that information and publish it. That will change.

    This is a systems Bill—the hon. Member for Pontypridd (Alex Davies-Jones) and I have had that conversation over the Dispatch Boxes—because we are principally regulating the algorithms and artificial intelligence that drive the recommendation tools on platforms. The right hon. Member for Barking spoke about that, as have other Members. When we describe pieces of content, they are exemplars of the problem, but the biggest problem is the systems effect. If people posted individually and organically, and that sat on a Facebook page or a YouTube channel that hardly anyone saw, the amount of harm done would be very small. The fact is, however, that those companies have created systems to promote content to people by data-profiling them to keep them on their site longer and to get them coming back more frequently. That has been done for a business reason—to make money. Most of the platforms are basically advertising platforms making money out of other people’s content.

    That point touches on every issue that Members have raised so far today. The Bill squarely makes the companies fully legally liable for their business activity, what they have designed to make money for themselves and the detriment that that can cause other people. That amplification of content, giving people more of what they think they want, is seen as a net positive, and people think that it therefore must always be positive, but it can be extremely damaging and negative.

    That is why the new measures that the Government are introducing on combating self-harm and suicide are so important. Like other Members, I think that the proposal from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) is important, and I hope that the Government’s amendment will address the issue fully. We are talking not just about the existing, very high bar in the law on assisting suicide, which almost means being present and part of the act. The act of consistently, systematically promoting content that exacerbates depression, anxiety and suicidal feelings among anyone, but particularly young people, must be an offence in law and the companies must be held to account for that.

    When Ian Russell spoke about his daughter’s experience, I thought it was particularly moving when he said that police officers were not allowed to view the content on their own. They worked in shifts for short periods of time, yet that content was pushed at a vulnerable girl by a social media platform algorithm when she was on her own, probably late at night, with no one else to see it and no one to protect her. That was done in a systematic way, consistently, over a lengthy period of time. People should be held to account for that. It is outrageous—it is disgusting—that that was allowed to happen. Preventing that is one of the changes that the Bill will help us to deliver.

    Mr David Davis

    I listened with interest to the comments of the right hon. Member for Barking (Dame Margaret Hodge) about who should be held responsible. I am trying to think through how that would work in practice. Frankly, the adjudication mechanism, under Ofcom or whoever it might be, would probably take a rather different view in the case of a company: bluntly, it would go for “on the balance of probabilities”, whereas with an individual it might go for “beyond reasonable doubt”. I am struggling—really struggling—with the question of which would work best. Does my hon. Friend have a view?

    Damian Collins

    My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

    There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

    Dame Margaret Hodge rose—

    Damian Collins

    I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

    Dame Margaret Hodge

    I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

    Damian Collins

    I thank the right hon. Lady for that information.

    Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be de-encrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

    If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

    Ian Paisley (North Antrim) (DUP)

    I thank the hon. Member for jointly sponsoring my private Member’s Bill, the Digital Devices (Access for Next of Kin) Bill. Does he agree that the best way to make progress is to ensure open access for the next of kin to devices that a deceased person leaves behind?

    Damian Collins

    The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

    It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

    My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

    Like other Members who served on the Joint Committee—I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and the hon. Member for Ochil and South Perthshire (John Nicolson) raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

    Dean Russell

    May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

    Damian Collins

    I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

    There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority illegal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

    We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority illegal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that content which cannot be posted organically on a Facebook page can nevertheless appear if money is put behind it and it is run as an advertisement.

    Priti Patel

    My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.

    Damian Collins rose—

    Mrs Natalie Elphicke (Dover) (Con)

    Will my right hon. Friend give way?

    Damian Collins

    Of course.

    Mrs Elphicke

    I am grateful to my right hon. Friend for raising this and for his support in this important area that affects our constituencies so much. I will be speaking later to the details of this, which go beyond the advertising payment to the usage, showing and sharing of this. As he has mentioned schedule 7, does he agree that there is—as I have set out in my amendment—a strong case for making sure that it covers all those illegal immigration and modern slavery offences, given the incredible harm that is being caused and that we see on a day-to-day basis?

    Damian Collins

    I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

    On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way or through the online advertising review or through new legislation, this is a loophole that needs to be closed.

  • Margaret Hodge – 2022 Speech on the Online Safety Bill


    The speech made by Margaret Hodge, the Labour MP for Barking, in the House of Commons on 5 December 2022.

    I pay tribute to all the relatives and families of the victims of online abuse who have chosen to be with us today. I am sure that, for a lot of you, our debate is very dry and detached, yet we would not be here but for you. Our hearts are with you all.

    I welcome the Minister to his new role. I hope that he will guide his Bill with the same spirit set by his predecessors, the right hon. Member for Croydon South (Chris Philp) and the hon. Member for Folkestone and Hythe (Damian Collins), who is present today and has done much work on this issue. Both Ministers listened and accepted ideas suggested by Back Benchers across the House. As a result, we had a better Bill.

    We all understand that this is groundbreaking legislation, and that it therefore presents us with complex challenges as we try to legislate to achieve the best answers to the horrific, fast-changing and ever-growing problems of online abuse. Given that complexity, and given that this is our first attempt at regulating online platforms, the new Minister would do well to build on the legacy of his predecessors and approach the amendments on which there are votes tonight as wholly constructive. The policies we are proposing enjoy genuine cross-party support, and are proposed to help the Minister not to cause him problems.

    Let me express particular support for new clauses 45 to 50, in the name of the right hon. Member for Basingstoke (Dame Maria Miller), which tackle the abhorrent misogynistic problem of intimate image abuse, and amendments 1 to 14, in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), which address the issue of smaller platforms falling into category 2, which is now outside the scope of regulations. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosque in Christchurch, New Zealand is probably the most egregious example, as the individual concerned used 8chan to plan his attack.

    New clause 15, which I have tabled, seeks to place responsibility for complying with the new law unequivocally on the shoulders of individual directors of online platforms. As the Bill stands, criminal liability is enforced only when senior tech executives fail to co-operate with information requests from Ofcom. I agree that is far too limited, as the right hon. and learned Member for Kenilworth and Southam said. The Bill allows executives to choose and name the individual who Ofcom will hold to account, so that the company itself, not Ofcom, decides who is liable. That is simply not good enough.

    Let me explain the thinking behind new clause 15. The purpose of the Bill is to change behaviour. Our experience in many other spheres of life tells us that the most effective way of achieving such change is to make individuals at the top of an organisation personally responsible for the behaviour of that organisation. We need to hold the chairmen and women, directors and senior executives to account by making those individuals personally liable for the practices and actions of their organisation.

    Let us look at the construction industry, for example. Years ago, building workers dying on construction sites was an all too regular occurrence. Only when we reformed health and safety legislation and made the directors of construction companies personally responsible and liable for health and safety standards on their sites did we see an incredible 90% drop in deaths on building sites. Similarly, when we introduced corporate and director liability offences in the Bribery Act 2010, companies stopped trying to bribe their way into contracts.

    It is not that we want to lock up directors of construction companies or trading companies, or indeed directors of online platforms; it is that the threat of personal criminal prosecution is the most powerful and effective way of changing behaviour. It is just the sort of deterrent tool that the Bill needs if it is to protect children and adults from online harms. That is especially important in this context, because the business model that underpins the profits that platforms enjoy encourages harmful content. The platforms need to encourage traffic on their sites, because the greater the traffic, the more attractive their sites become to advertisers; and the more advertising revenue they secure, the higher the profits they enjoy.

    Harmful content attracts more traffic and so supports the platforms’ business objectives. We know that from studies such as the one by Harvard law professor Jonathan Zittrain, which showed that posts that tiptoe close to violating platforms’ terms and conditions generate far more engagement. We also know that from Mark Zuckerberg’s decisions in the lead-up to and just after the 2020 presidential elections, when he personally authorised tweaks to the Facebook algorithm to reduce the spread of election misinformation. However, after the election, despite officials at Facebook asking for the change to stay, he ensured that the previous algorithm was reinstated. An internal Facebook memo revealed that the tweak preventing fake news had led to “a decrease in sessions”, which made the platform less attractive to advertisers and affected his profits. Restoring fake news helped restore his profits.

    The incentives in online platforms’ business models promote rather than prevent online harms, and we will not break those incentives by threatening to fine companies. We know from our experience elsewhere that, even at 10% of global revenue, such fines will inevitably be viewed as a cost to business, which will simply be passed on by raising advertising charges. However, we can and will break the incentives in the business model if we make Mark Zuckerberg or Elon Musk personally responsible for breaking the rules. It will not mean that we will lock them up, much as some of us might be tempted to do so. It will, however, provide that most powerful incentive that we have as legislators to change behaviour.

    Furthermore, we know that the directors of online platforms personally take decisions in relation to harmful content, so they should be personally held to account. In 2018, Facebook’s algorithm was promoting posts for users in Myanmar that incited violence against protesters. The whistleblower Frances Haugen showed evidence that Facebook was aware that its engagement-based content was fuelling the violence, but it continued to roll it out on its platforms worldwide without checks. Decisions made at the top resulted directly in ethnic violence on the ground. That same year, Zuckerberg gave a host of interviews defending his decision to keep Holocaust denial on his platform, saying he did not believe that posts should be taken down for people getting it wrong. The debate continued for two years until 2020, when, only after months of protest, he finally decided to remove that abhorrent content.

    In what world do we live where overpaid executives running around in their jeans and sneakers are allowed to make decisions on the hoof about how their platforms should be regulated without being held to account for their actions?

    Mr David Davis

    The right hon. Lady and I have co-operated to deal with international corporate villains, so I am interested in her proposal. However, a great number of these actions are taken by algorithms—I speak as someone who was taken down by a Google algorithm—so what happens then? I see no reason why we should not penalise directors, but how do we establish culpability?

    Dame Margaret Hodge

    That is for an investigation by the appropriate enforcement agency—Ofcom et al.—and if there is evidence that culpability rests with the managing director, the owner or whoever, they should be prosecuted. It is as simple as that. A case would have to be established through evidence, and that should be carried out by the enforcement agency. I do not think that this is any different from any other form of financial or other crime. In fact, it is from my experience in that that I came to this conclusion.

    John Penrose (Weston-super-Mare) (Con)

    The right hon. Lady is making a powerful case, particularly on the effective enforcement of rules to ensure that they bite properly and that people genuinely pay attention to them. She gave the example of a senior executive talking about whether people should be stopped for getting it wrong—I think the case she mentioned was Holocaust denial—by making factually inaccurate statements or allowing factually inaccurate statements to persist on their platform. May I suggest that her measures would be even stronger if she were to support new clause 34, which I have tabled? My new clause would require factual inaccuracy to become wrong, to be prevented and to be pursued by the kinds of regulators she is talking about. It would be a much stronger basis on which her measure could then rest.

    Dame Margaret Hodge

    Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but can I just make a more general point, because it speaks to the point about legal but harmful? What I really fear with the legal but harmful rule is that we create more and more laws to make content illegal and that, ironically, locks up more and more people, rather than creating structures and systems that will prevent the harm occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would love us to have kept to the legal but harmful route.

    We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

    The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

    “The only person that I’ve ever come across in this whole world…that thought that content”—

    the content that Molly viewed—

    “was safe was…Meta.”

    There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service, for instance failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

    I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

  • Priti Patel – 2022 Speech on the Online Safety Bill

    Priti Patel – 2022 Speech on the Online Safety Bill

    The speech made by Priti Patel, the Conservative MP for Witham, in the House of Commons on 5 December 2022.

    Before I speak to specific clauses I pay tribute to all the campaigners, particularly the families who have campaigned so hard to give their loved ones a voice through this Bill and to change our laws. Having had some prior involvement in the early stages of this Bill three years ago as Home Secretary, I also pay tribute to many of the officials and Members of this House on both sides who have worked assiduously on the construction, development and advancement of this Bill. In particular, I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and the work of the Joint Committee; when I was Home Secretary we had many discussions about this important work. I also thank the Minister for the assiduous way in which he has handled interventions and actually furthered the debate with this Bill. There are many Government Departments that have a raft of involvement and engagement.

    The victims must be at the heart of everything that we do now to provide safeguards and protections. Children and individuals have lost their lives because of the online space. We know there is a great deal of good in the online space, but also a great deal of harm, and that must unite us all in delivering this legislation. We have waited a long time for this Bill, but we must come together, knowing that this is foundational legislation, which will have to be improved and developed alongside the technology, and that there is much more work to do.

    I start by focusing on a couple of the new clauses, beginning with Government new clause 11 on end-to-end encryption. The House will not be surprised by my background in dealing with end-to-end encryption, particularly the harmful content, the types of individuals and the perpetrators who hide behind end-to-end encryption. We must acknowledge the individuals who harm children or who peddle terrorist content through end-to-end encryption while recognising that encryption services are important to protect privacy.

    There is great justification for encryption—business transactions, working for the Government and all sorts of areas of importance—but we must acknowledge in this House that there is more work to do, because these services are being used by those who would do harm to our country, threaten our national interest or threaten the safety of young people and children in particular. We know for a fact that there are sick-minded individuals who seek to abuse and exploit children and vulnerable adults. The Minister will know that, and I am afraid that many of us do. I speak now as a constituency Member of Parliament, and one of my first surgery cases back in 2010 was the sad and tragic case of a mother who came to see me because her son had accessed all sorts of content. Thanks to the Bill, that content will now be ruled as harmful. There were other services associated with access that the family could not see and could not get access to, and encryption platforms are part of that.

    There are shocking figures, and I suspect that many of my colleagues in the House will be aware of them. Almost 100,000 reports relating to online child abuse were received by UK enforcement agencies in 2021 alone. That is shocking. The House will recognise my experience of working with the National Crime Agency, to which we must pay tribute for its work in this space, as we should to law enforcement more widely. Police officers and all sorts of individuals in law enforcement are, day in, day out, investigating these cases and looking at some of the most appalling images and content, all in the name of protecting vulnerable children, and we must pay tribute to them as well.

    It is also really shocking that that figure of 100,000 reports in 2021 alone is a 29% increase on the previous year. The amount of disturbing content is going up and up, and we are, I am afraid, looking only at the tip of the iceberg. So, I think it is absolutely right—and I will always urge the Government and whichever Secretary of State, be they in the Home Office, DCMS or the MOJ—to put the right measures and powers in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind the platforms of encryption and, importantly, bring those involved to face justice. End-to-end encryption is one thing, but we need end-to-end justice for victims and the prevention of the most heinous crimes.

    This is where we, as a House, must come together. I commend the hon. Member for Rotherham (Sarah Champion) in particular for her work relating to girls, everything to do with the grooming gangs, and the most appalling crimes against individuals, quite frankly. I will always urge colleagues to support the Bill, on which we will need to build going forward.

    I think I can speak with experience about the difficulties in drafting legislation—both more broadly and specifically in this area, which is complex and challenging. It is hard to foresee the multiplicity of circumstances. My hon. Friend the Member for Folkestone and Hythe was absolutely right to say in his comments to the SNP spokesman, the hon. Member for Ochil and South Perthshire (John Nicolson), that we have to focus on illegal content. It is difficult to get the balance right between the lawful and harmful. The illegal side is what we must focus on.

    I also know that many campaigners and individuals—they are not just campaigners, but families—have given heartbreaking and devastating accounts of their experiences of online harms. As legislators, we owe them this Bill, because although their suffering is not something that we will experience, it must bring about the type of changes that we all want to see for everyone—children, adults and vulnerable individuals.

    May I ask the Minister for reassurances on the definition of “best endeavours”? As my right hon. Friend the Member for Basingstoke (Dame Maria Miller) touched on, when it comes to implementation, that will be the area where the rubber hits the road. That is where we will need to know that our collective work will be meaningful and will deliver protections—not just change, but protections. We must be honest about the many serious issues that will arise even after we pass the Bill—be it, God forbid, a major terrorist incident, or cases of child sexual exploitation—and there is a risk that, without clarity in this area, when a serious issue does arise, we may not know whether a provider undertook best endeavours. I think we owe it to everyone to ensure that we run a slide rule over this on every single granular detail.

    Cases and issues relating to best endeavours are debated and discussed extensively in court cases, coroner inquests and for social services relating to child safeguarding issues, for example—all right hon. and hon. Members here will have experience of dealing with social services on behalf of their constituents in child protection cases—or, even worse, in serious case reviews or public inquiries that could come in future. I worry that in any response a provider could say that it did its best and had undertaken its best endeavours, as a defence. That would be unacceptable. That would lead those affected to feel as if they suffered an even greater injustice than the violations that they experienced. It is not clear whether best endeavours will be enough to change the culture, behaviour and attitudes of online platforms.

    I raise best endeavours in the context of changing attitudes and cultures because in many institutions, that very issue is under live debate right now. That may be in policing, attitudes around women and girls or how we protect other vulnerable groups, even in other services such as the fire service, which we have heard about recently. It is important that we ask those questions and have the scrutiny. We need to hear more about what constitutes best endeavours. Who will hold the providers to account? Ofcom clearly has a role. I know the Minister will do a very earnest and diligent job to provide answers, but the best endeavours principle goes wider than just the Minister on the Front Bench—it goes across the whole of Government. He knows that we will give him every backing to use his sharp elbows—perhaps I can help with my sharp elbows—to ensure that others are held to account.

    It will also be for Ofcom to give further details and guidance. As ever, the guidance will be so important. The guidance has to have teeth and statutory powers. It has to be able to put the mirror up and hold people to account. For example, would Ofcom be able, in its notices to providers, to instruct them to use specific technologies and programmes to tackle and end the exposure to exploitation, in relation to end-to-end encryption services, to protect victims? That is an open question, but one that could be put to Ofcom and could be an implementation test. There is no reason why we should not put a series of questions to Ofcom about how it would implement the Bill in practice.

    I would like to ask the Minister why vulnerable adults and victims of domestic abuse and violence against women and girls are not included. We must do everything in this House. This is not about being party political. When it comes to all our work on women and violence against women and girls, there should be no party politics whatsoever. We should ensure that what is right for one group is consistent and that the laws are strengthened. That will require the MOJ, as well as the Home Office, to ensure that the work is joined up in the right kind of way.

    It is right that powers are available for dealing with terrorist threats and tackling child sexual abuse thoroughly. There is some good work around terrorist content. There is excellent work in GIFCT, the Global Internet Forum to Counter Terrorism. The technology companies are doing great work. There is international co-operation in this space. The House should take some comfort in the fact that the United Kingdom leads the world in this space. We owe our gratitude to our intelligence and security agencies. I give my thanks to MI5 in particular for its work and to counter-terrorism policing, because they have led the world robustly in this work.

    Damian Collins

    My right hon. Friend makes an important point about this being a cross-Government effort. The Online Safety Bill creates a regulatory framework for the internet, but we need to make sure that we have the right offences in law, clearly defined. Then it is easy to read them across into this legislation. If we do not have that, it is a job for the whole of Government.

    Priti Patel

    Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is not straightforward and easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.

    The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on CT is so important, because across the world with Five Eyes we have been consistent. Again, there are good models out there that can be built upon. We will not fix all this through one Bill—we know that. This Bill is foundational, which is why we must move forward.

    On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.

    Rachel Maclean (Redditch) (Con)

    I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in removing reports and web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?

    Priti Patel

    My hon. Friend is absolutely right. I thank her for not just her intervention but her steadfast work when she was a Home Office Minister with responsibility for safeguarding. I also thank the Internet Watch Foundation; many of the statistics and figures that we have been using about child sexual abuse and exploitation content, and the take-downs, are thanks to its work. There is some important work to do there. The Minister will be familiar with its work—[Interruption.] Exactly that.

    We need the expertise of the Internet Watch Foundation, so it is about integrating that skillset. There is a great deal of expertise out there, including at the Internet Watch Foundation, at GIFCT on the CT side and, obviously, in our services and agencies. As my right hon. Friend the Member for Basingstoke said, it is crucial that we pool organisations’ expertise to implement the Bill, as we will not be able to create it all over again overnight in government.

    I thank my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) for tabling new clause 16, which would create new offences to address the challenges caused by those who promote, encourage and assist self-harm. That has been the subject of much of the debate already, which is absolutely right when we think about the victims and their families. In particular, I thank the Samaritans and others for their work to highlight this important issue. I do not need to dwell on the Samaritans’ report, because I think all hon. Members have read it.

    All hon. Members who spoke in the early stages of the Bill, which I did not because I was in government, highlighted this essential area. It is important to ensure that we do everything we can to address it in the right way. Like all right hon. and hon. Members, I pay tribute to the family of Molly Russell. There are no words for the suffering that they have endured, but their campaign of bravery, courage and fortitude aims to close every loophole to stop other young people being put at risk.

    Right hon. and hon. Members meet young people in schools every week, and we are also parents and, in some cases, grandparents. To know that this grey area leaves so many youngsters at risk is devastating, so we have almost a collective corporate duty to stand up and do the right thing. The long and short of it is that we need to be satisfied, when passing the Bill, that we are taking action to protect vulnerable people and youngsters who are susceptible to dangerous communications.

    As I have emphasised, we should also seek to punish those who cause and perpetrate this harm and do everything we can to protect those who are vulnerable, those with learning disabilities, those with mental health conditions, and those who are exposed to self-harm content. We need to protect them and we have a duty to do that, so I look forward to the Minister’s reply.

    I welcome new clauses 45 to 50, tabled by my right hon. Friend the Member for Basingstoke. I pay tribute to her for her work; she has been a strong campaigner for protecting the privacy of individuals, especially women and children, and for closing loopholes that have enabled people to be humiliated or harmed in the ways she has spoken about so consistently in the House. I am pleased that the Deputy Prime Minister, my right hon. Friend the Member for Esher and Walton (Dominic Raab), announced last month that the Government would table amendments in the other place to criminalise the sharing of intimate images, photographs and videos without consent; that is long overdue. When I was Home Secretary I heard the most appalling cases, with which my right hon. Friend the Member for Basingstoke will be familiar. I have met so many victims and survivors, and we owe it to them to do the right thing.

    It would be reassuring to hear not just from the Minister in this debate, but from other Ministers in the Departments involved in the Bill, to ensure they are consistent in giving voice to the issues and in working through their Ministries on the implementation—not just of this Bill, but of the golden thread that runs throughout the legislation. Over the last three years, we have rightly produced a lot of legislation to go after perpetrators, and support women and girls, including the Domestic Abuse Act 2021. We should use those platforms to stand up for the individuals affected by these issues.

    I want to highlight the importance of the provisions to protect women and girls, particularly the victims and survivors of domestic abuse and violence. Some abusive partners and ex-partners use intimate images in their possession; as the Minister said, that is coercive control, which means that the victim ends up living their life in fear. That is completely wrong. We have heard and experienced too many harrowing and shocking stories of women who have suffered as a result of the use of such images and videos. It must now be a priority for the criminal justice system, and the online platforms in particular, to remove such content. This is no longer a negotiation. Too many of us—including myself, when I was Home Secretary—have phoned platforms at weekends and insisted that they take down content. Quite frankly, I have then been told, “Twitter doesn’t work on a Saturday, Home Secretary” or “This is going to take time.” That is not acceptable. It is an absolute insult to the victims, and is morally reprehensible and wrong. The platforms must be held to account.

    Hon. Members will be well aware of the Home Office’s work on the tackling violence against women and girls strategy. I pay tribute to all colleagues, but particularly my hon. Friend the Member for Redditch (Rachel Maclean), who was the Minister at the time. The strategy came about after much pain, sorrow and loss of life, and it garnered an unprecedented 180,000 responses. The range of concerns raised was predominantly related to the issues we are discussing today. We can no longer stay mute and turn a blind eye. We must ensure that the safety of women in the public space offline—on the streets—and online is respected. We know how women feel about the threats. The strategy highlighted so much; I do not want to go over it again, as it is well documented and I have spoken about it in the House many times.

    It remains a cause of concern that the Bill does not include a specific VAWG code of practice. We want and need the Bill. We are not going to fix everything through it, but, having spent valued time with victims and survivors, I genuinely believe that we could move towards a code of practice. Colleagues, this is an area on which we should unite, and we should bring such a provision forward; it is vital.

    Let me say a few words in support of new clause 23, which was tabled by my right hon. Friend the Member for Basingstoke. I have always been a vocal and strong supporter of services for victims of crime, and of victims full stop. I think it was 10 years ago that I stood in this House and proposed a victims code of practice—a victims Bill is coming, and we look forward to that as well. This Government have a strong record of putting more resources into support for victims, including the £440 million over three years, but it is imperative that offenders—those responsible for the harm caused to victims—are made to pay, and it is absolutely right that they should pay more in compensation.

    Companies profiteering from online platforms where these harms are being perpetrated should be held to account. When companies fail in their duties and have been found wanting, they must make a contribution for the harm caused. There are ways in which we can do that. There has been a debate already, and I heard the hon. Member for Pontypridd (Alex Davies-Jones) speak for the Opposition about one way, but I think we should be much more specific now, particularly in individual cases. I want to see those companies pay the price for their crimes, and I expect the financial penalties issued to reflect the severity of the harm caused—we should support that—and that such money should go to supporting the victims.

    I pay tribute to the charities, advocacy groups and other groups that, day in and day out, have supported the victims of crime and of online harms. I have had an insight into that work from my former role in Government, but we should never underestimate how traumatic and harrowing it is. I say that about the support groups, but we have to magnify that multiple times for the victims. This is one area where we must ensure that more is done to provide extra resources for them. I look forward to hearing more from the Minister, but also from Ministers from other Departments in this space.

    I will conclude on new clause 28, which has already been raised, on the advocacy body for children. There is a long way to go with this—there really is. Children are harmed in just too many ways, and the harm is unspeakable. We have touched on this in earlier debates and discussions on the Bill, in relation to child users on online platforms, and there will be further harm. I gently urge the Government —if not today or through this Bill, then later—to think about how we can pull together the skills and expertise in organisations outside this House and outside Government that give voice to children who have nowhere else to go.

    This is not just about the online space; in the cases in the constituency of the hon. Member for Rotherham (Sarah Champion) and other constituencies, we have seen children being harmed under cover. Statutory services failed them and the state failed them. It was state institutional failure that let children down in the cases in Rotherham and other child grooming cases. We could see that all over again in the online space, and I really urge the Government to make sure that that does not happen—and actually never happens again, because those cases are far too harrowing.

    There really is a lot here, and we must come together to ensure that the Bill comes to pass, but there are so many other areas where we can collectively put aside party politics and give voice to those who really need representation.

  • Julian Knight – 2022 Speech on the Online Safety Bill

    Julian Knight – 2022 Speech on the Online Safety Bill

    The speech made by Julian Knight, the Chair of the Culture Select Committee, in the House of Commons on 5 December 2022.

    I welcome the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), to his place. To say that he has been given a hospital pass in terms of this legislation is a slight understatement. It is very difficult to understand, and the ability he has shown at the Dispatch Box in grasping many of the major issues is to his credit. He really is a safe pair of hands and I thank him for that.

    Looking at the list of amendments, I think it is a bit of a hotchpotch, yet we are going to deal only with certain amendments today and others are not in scope. That shows exactly where we are with this legislation. We have been in this stasis now for five years. I remember that we were dealing with the issue when I joined the Digital, Culture, Media and Sport Committee, and it is almost three years since the general election when we said we would bring forward this world-leading legislation. We have to admit that is a failure of the political class in all respects, but we have to understand the problem and the realities facing my hon. Friend, other Ministers and the people from different Departments involved in drafting this legislation.

    We are dealing with companies that are more powerful than the oil barons and railway barons of the 19th century. These companies are more important than many states. The total value of Alphabet, for instance, is more than the total GDP of the Netherlands, and that is probably a low estimate of Alphabet’s global reach and power. These companies are, in many respects, almost new nation states in their power and reach, and they have been brought about by individuals having an idea in their garage. They still have that culture of having power without the consequences that flow from it.

    These companies have created wonderful things that enhance our lives in many respects through better communication and increased human knowledge, which we can barely begin to imagine, but they have done it with a skater boy approach—the idea that they are beyond the law. They had that enshrined in law in the United States, where they have effectively become nothing more than a megaphone or a noticeboard, and they have always relied on that. They are based or domiciled, in the main, in the United States, which is where they draw their legal power. They will always be in that position of power.

    We talk about 10% fines and even business interruption to ensure these companies have skin in the game, but we have to realise these businesses are so gigantic and of such importance that they could simply ignore what we do in this place. Will we really block a major social media platform? The only time something like that has been done was when a major social media platform blocked a country, if I remember rightly. We have to understand where we are coming from in that respect.

    This loose cannon, Elon Musk, is an enormously wealthy man, and he is quite strange, isn’t he? He is intrinsically imbued with the power of Silicon Valley and those new techno-masters of the universe. We are dealing with those realities, and this Bill is very imperfect.

    Mr David Davis

    My hon. Friend is giving a fascinating disquisition on this industry, but is not the implication that, in effect, these companies are modern buccaneer states and we need to do much more to legislate? I am normally a deregulator, but we need more than one Bill to do what we seek to do today.

    Julian Knight

    My right hon. Friend is correct. We spoke privately before this debate, and he said this is almost five Bills in one. There will be a patchwork of legislation, and there is a time limit. This is a carry-over Bill, and we have to get it on the statute book.

    This Bill is not perfect by any stretch of the imagination, and I take the Opposition’s genuine concerns about legal but harmful material. The shadow Minister mentioned the tragic case of Molly Russell. I heard her father being interviewed on the “Today” programme, and he spoke about how at least three quarters of the content he had seen that had prompted that young person to take her life had been legal but harmful. We have to stand up, think and try our best to ensure there is a safer space for young people. This Bill does part of that work, but only part. The work will be done in the execution of the Bill, through the wording on age verification and age assurance.

    Dame Maria Miller

    Given the complexities of the Bill, and given the Digital, Culture, Media and Sport Committee’s other responsibilities, will my hon. Friend join me in saying there should be a special Committee, potentially of both Houses, to keep this area under constant review? That review, as he says, is so badly needed.

    Julian Knight

    I thank my right hon. Friend for her question, which I have previously addressed. The problem is the precedent it would set. Any special Committee set up by a Bill would be appointed by the Whips, so we might as well forget about the Select Committee system. This is not a huge concern for the Digital, Culture, Media and Sport Committee, because the advent of any such special Committee would probably be beyond the next general election, and I am not thinking to that timeframe. I am concerned about the integrity of Parliament. The problem is that if we do that in this Bill, the next Government will come along and do it with another Bill and then another Bill. Before we know it, we will have a Select Committee system that is Whips-appointed and narrow in definition, and that cuts across something we all vote for.

    There are means by which we can have legislative scrutiny—that is the point I am making in my speech. I would very much welcome a Committee being set up after a year, temporarily, to carry out post-legislative scrutiny. My Committee has a Sub-Committee on disinformation and fake news, which could also look at this Bill going forward. So I do not accept my right hon. Friend’s point, but I appreciate completely the concerns about our needing proper scrutiny in this area. We must also not forget that any changes to Ofcom’s parameters can be put in a statutory instrument, which can be prayed against by the Opposition and thus we would have the scrutiny of the whole House in debate, which is preferable to having a Whips-appointed Committee.

    I have gone into quite a bit of my speech there, so I am grateful for that intervention in many respects. I am not going to touch on every aspect of this issue, but I urge right hon. and hon. Members in all parts of the House to think about the fact that although this is far from perfect legislation and it is a shame that we have not found a way to work through the legal but harmful material issue, we have to understand the parameters we are working in, in the real world, with these companies. We need to see that there is a patchwork of legislation, and the biggest way in which we can effectively let the social media companies know they have skin in the game in society—a liberal society that created them—is through competition legislation, across other countries and other jurisdictions. I am talking about our friends in the European Union and in the United States. We are working together closely now to come up with a suite of competition legislation. That is how we will be able to cover off some of this going forward. I will be supporting this Bill tonight and I urge everyone to do so, because, frankly, after five years I have had enough.

  • Alex Davies-Jones – 2022 Speech on the Online Safety Bill

    Alex Davies-Jones – 2022 Speech on the Online Safety Bill

    The speech made by Alex Davies-Jones, the Shadow Culture Minister, in the House of Commons on 5 December 2022.

    It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.

    We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.

    We must acknowledge that the situation has been made even harder by the huge changes that we have seen in the Government since the Bill was first introduced. Since its First Reading, it has been the responsibility of three different Ministers and two Secretaries of State. Remarkably, it has seen three Prime Ministers in post, too. We can all agree that legislation that will effectively keep people safe online urgently needs to be on the statute book: that is why Labour has worked hard and will continue to work hard to get the Bill over the line, despite the best efforts of this Government to kick the can down the road.

    The Government have made a genuine mess of this important legislation. Before us today are a huge number of new amendments tabled by the Government to their own Bill. We now know that the Government also plan to recommit parts of their own Bill—to send them back into Committee, where the Minister will attempt to make significant changes that are likely to damage even further the Bill’s ability to properly capture online harm.

    We need to be moving forwards, not backwards. With that in mind, I am keen to speak to a number of very important new clauses this afternoon. I will first address new clause 17, which was tabled by my right hon. Friend the Member for Barking (Dame Margaret Hodge), who has been an incredibly passionate and vocal champion for internet regulation for many years.

    As colleagues will be aware, the new clause will fix the frustrating gaps in Ofcom’s enforcement powers. As the Bill stands, it gives Ofcom the power to fine big tech companies only 10% of their turnover for compliance failures. It does not take a genius to recognise that that can be a drop in the ocean for some of the global multimillionaires and billionaires whose companies are often at the centre of the debate around online harm. That is why the new clause, which will mean individual directors, managers or other officers finally being held responsible for their compliance failures, is so important. When it comes to responsibilities over online safety, it is clear that the Bill needs to go further if the bosses in Silicon Valley are truly to sit up, take notice and make positive and meaningful changes.

    Sir Jeremy Wright (Kenilworth and Southam) (Con)

    I am afraid I cannot agree with the hon. Lady that the fines would be a drop in the ocean. These are very substantial amounts of money. In relation to individual director liability, I completely understand where the right hon. Member for Barking (Dame Margaret Hodge) is coming from, and I support a great deal of what she says. However, there are difficulties with the amendment. Does the hon. Member for Pontypridd (Alex Davies-Jones) accept that it would be very odd to end up in a position in which the only individual director liability attached to information offences, meaning that, as long as an individual director was completely honest with Ofcom about their wrongdoing, they would attract no individual liability?

    Alex Davies-Jones

    It may be a drop in the ocean to the likes of Elon Musk or Mark Zuckerberg—these multibillionaires who are taking over social media and using it as their personal plaything. They are not going to listen to fines; the only way they are going to listen, sit up and take notice is if criminal liability puts their neck on the line and makes them answer for some of the huge failures of which they are aware.

    The right hon. and learned Member mentions that he shares the sentiment of the amendment but feels it could be wrong. We have an opportunity here to put things right and put responsibility where it belongs: with the tech companies, the platforms and the managers responsible. In a similar way to what happens in the financial sector or in health and safety regulation, it is vital that people be held responsible for issues on their platforms. We feel that criminal liability will make that happen.

    Mr David Davis

    May I intervene on a point of fact? The hon. Lady says that fines are a drop in the ocean. The turnover of Google is $69 billion; 10% of that is just shy of $7 billion. That is not a drop in the ocean, even to Elon Musk.

    Alex Davies-Jones

    We are looking at putting people on the line. It needs to be something that people actually care about. Money does not matter to these people, as we have seen with the likes of Google, Elon Musk and Mark Zuckerberg; what matters to them is actually being held to account. Money may matter to Government Members, but it will be criminal liability that causes people to sit up, listen and take responsibility.

    While I am not generally in the habit of predicting the Minister’s response or indeed his motives—although my job would be a hell of a lot easier if I did—I am confident that he will try to peddle the line that it was the Government who introduced director liability for compliance failures in an earlier draft of the Bill. Let me be crystal clear in making this point, because it is important. The Bill, in its current form, makes individuals at the top of companies personally liable only when a platform fails to supply information to Ofcom, which misses the point entirely. Directors must be held personally liable when safety duties are breached. That really is quite simple, and I am confident that it would be effective in tackling harm online much more widely.

    We also support new clause 28, which seeks to establish an advocacy body to represent the interests of children online. It is intended to deal with a glaring omission from the Bill, which means that children who experience online sexual abuse will receive fewer statutory user advocacy protections than users of a post office or even passengers on a bus. The Minister must know that that is wrong and, given his Government’s so-called commitment to protecting children, I hope he will carefully consider a new clause which is supported by Members on both sides of the House as well as the brilliant National Society for the Prevention of Cruelty to Children. In rejecting new clause 28, the Government would be denying vulnerable children a strong, authoritative voice to represent them directly, so I am keen to hear the Minister’s justification for doing so, if that is indeed his plan.

    Members will have noted the bundle of amendments tabled by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) relating to Labour’s concerns about the unnecessary powers to overrule Ofcom that the Bill, as currently drafted, gives the Secretary of State of the day. During Committee evidence sessions, we heard from Will Perrin of the Carnegie UK Trust, who, as Members will know, is an incredibly knowledgeable voice when it comes to internet regulation. He expressed concern about the fact that, in comparison with other regulatory frameworks such as those in place for advertising, the Bill

    “goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 117.]

    Labour shares that concern. Ofcom must be truly independent if it is to be an effective regulator. Surely we have to trust it to undertake logical processes, rooted in evidence, to arrive at decisions once this regime is finally up and running. It is therefore hard to understand how the Government can justify direct interference, and I hope that the Minister will seriously consider amendments 23 to 30, 32, and 35 to 41.

    Before I address Labour’s main concerns about the Government’s proposed changes to the Bill, I want to record our support for new clauses 29 and 30, which seek to bring media literacy duties back into the scope of the Bill. As we all know, media literacy is the first line of defence when it comes to protecting ourselves against false information online. Prevention is always better than cure. Whether it is a question of viral conspiracy theories or Russian disinformation, Labour fears that the Government’s approach to internet regulation will create a two-tier internet, leaving some more vulnerable than others.

    However, I am sorry to say that the gaps in this Bill do not stop there. I was pleased to see that my hon. Friend the Member for Rotherham (Sarah Champion) had tabled new clause 54, which asks the Government to formally consider the impact that the use of virtual private networks will have on Ofcom’s ability to enforce its powers. This touches on the issue of future-proofing, which Labour has raised repeatedly in debates on the Bill. As we have heard from a number of Members, the tech industry is evolving rapidly, with concepts such as the metaverse changing the way in which we will all interact with the internet in the future. When the Bill was first introduced, TikTok was not even a platform. I hope the Minister can reassure us that the Bill will be flexible enough to deal with those challenges head-on; after all, we have waited far too long.

    That brings me to what Labour considers to be an incredible overturn by the Government relating to amendment 239, which seeks to remove the new offence of harmful communications from the Bill entirely. As Members will know, the communications offence was designed by the Law Commission with the intention of introducing a criminal threshold for the most dangerous online harms. Indeed, in Committee it was welcome to hear the then Minister—the present Minister for Crime, Policing and Fire, the right hon. Member for Croydon South (Chris Philp)—being so positive about the Government’s consultation with the commission. In relation to clause 151, which concerns the communications offences, he even said:

    “The Law Commission is the expert in this kind of thing…and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do.” ––[Official Report, Online Safety Public Bill Committee, 21 June 2022; c. 558.]

    Less than six months down the line, we are seeing yet another U-turn from this Government, who are doing precisely the opposite of what was promised.

    Removing these communications offences from the Bill will have real-life consequences. It will mean that harmful online trends such as hoax bomb threats, abusive social media pile-ons and fake news such as encouraging people to drink bleach to cure covid will be allowed to spread online without any consequence.

    Christian Wakeford (Bury South) (Lab)

    No Jewish person should have to log online and see Hitler worship, but what we have seen in recent weeks from Kanye West has been nothing short of disgusting, from him saying “I love Hitler” to inciting online pile-ons against Jewish people, and this is magnified by the sheer number of his followers, with Jews actually being attacked on the streets in the US. Does my hon. Friend agree that the Government’s decision to drop the “legal but harmful” measures from the Bill will allow this deeply offensive and troubling behaviour to continue?

    Alex Davies-Jones

    I thank my hon. Friend for that important and powerful intervention. Let us be clear: everything that Kanye West said online is completely abhorrent and has no place in our society. It is not for any of us to glorify Hitler and his comments or praise him for the work he did; that is absolutely abhorrent and it should never be online. Sadly, however, that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the Government’s swathes of changes to the Bill, meaning that that would be allowed to be seen by everybody. Kanye West has 30 million followers online. His followers will be able to look at, share, research and glorify that content without any consequence to that content being freely available online.

    Dame Margaret Hodge

    Further to that point, it is not just that some of the content will be deeply offensive to the Jewish community; it could also harm wider society. Some further examples of postings that would be considered legal but harmful are likening vaccination efforts to Nazi death camps and alleging that NHS nurses should stand trial for genocide. Does my hon. Friend not agree that the changes the Government are now proposing will lead to enormous and very damaging impacts right through society?

    Alex Davies-Jones

    My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.

    We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.

  • Paul Scully – 2022 Statement on the Online Safety Bill

    Paul Scully – 2022 Statement on the Online Safety Bill

    The statement made by Paul Scully, the Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, in the House of Commons on 5 December 2022.

    I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.

    The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.

    Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.

    Priti Patel (Witham) (Con)

    I am listening to my hon. Friend with great interest on this aspect of child sexual abuse and exploitation, which is a heinous crime. Will he go on to speak about how the Ofcom role will interact with law enforcement, in particular the National Crime Agency, when dealing with these awful crimes?

    Paul Scully

    It is important that we tackle this in a number of ways. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and I spoke earlier, and I will come to some of what he will outline. It is important that Ofcom recognises the technologies that are available and—with the Children’s Commissioner as one of the statutory consultees—liaises with the social media platforms, and the agencies, to ensure that there are codes of practice that work, and that we get this absolutely right. It is about enforcing the terms and conditions of the companies and being able to produce the evidence and track the exchanges, as I will outline later, for the agency to use for enforcement.

    With the rapid developments in technology, on occasions there will be no existing accredited technology available that will satisfactorily mitigate the risks. Similarly, tech companies might be able to better design solutions that integrate more easily with their services than those that are already accredited. The new regulatory framework must incentivise tech companies to ensure that their safety measures keep pace with the evolving threat, and that they design their services to be safe from the outset. It is for these reasons that the Government have tabled the amendments that we are discussing.

    New clauses 11 and 12 establish options for Ofcom when deploying its powers under notices to deal with terrorism content and CSEA content. These notices will empower Ofcom to require companies to use accredited technology to identify and remove illegal terrorism and CSEA content or to prevent users from encountering that content or, crucially, to use their best endeavours to develop or to source technology to tackle CSEA. That strikes the right balance of supporting the adoption of new technology, while ensuring that it does not come at the expense of children’s physical safety.

    Rehman Chishti (Gillingham and Rainham) (Con)

    Terrorism is often linked to non-violent extremism, which feeds into violent extremism and terrorism. How does the Bill define extremism? Previous Governments failed to define it, although it is often linked to terrorism.

    Paul Scully

    This Bill links with other legislation, and obviously the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.

    Charlotte Nichols (Warrington North) (Lab)

    Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?

    Paul Scully

    Because if those companies do not, they will get a fine of up to £18 million or 10% of their global turnover, whichever is higher. As we are finding with Twitter, there is also a commercial impetus, because advertisers are fleeing that platform as they see the uncertainty being caused by those changes. A lot of things are moving here to ensure that safety is paramount; it is not just for the Government to act in this area. All we are doing is making sure that those companies enforce their own terms and conditions.

    Priti Patel

    This point is important: we are speaking about terrorism and counter-terrorism and the state’s role in preventing terrorist activity. For clarity, will the Minister update the House later on the work that takes place between his Department and the platforms and, importantly, between the Home Office and the security services? In particular, some specialist work takes place with the Global Internet Forum to Counter Terrorism, which looks at online terrorist and extremist content. That work can ensure that crimes are prevented and that the right kinds of interventions take place.

    Paul Scully

    My right hon. Friend talks with experience from her time at the Home Office. She is absolutely right that the Bill sets a framework to adhere to the terms and conditions of the platforms. It also sets out the ability for the services to look at things such as terrorism and CSEA, which I have been talking about—for example, through the evidence of photos being exchanged. The Bill is not re-examining and re-prosecuting the interaction between all the agencies, however, because that is apparent for all to see.

    New clauses 11 and 12 bring those powers in line with the wider safety duties by making it clear that the tools may seek to proactively prevent CSEA content from appearing on a service, rather than focusing only on identification and removal after the fact. That will ensure the best possible protection for children, including on services that offer livestreaming.

    The safeguards around those powers remain as strong as before to protect user privacy. Any tools that are developed will be accredited using a rigorous assessment process to ensure that they are highly accurate before the company is asked to use them. That will avoid any unnecessary intrusions into user privacy by minimising the risk that the tools identify false positives.

    Crucially, the powers do not represent a ban on or seek to undermine any specific type of technology or design, such as end-to-end encryption. They align with the UK Government’s view that online privacy and cyber-security must be protected, but that technological changes should not be implemented in a way that diminishes public safety.

    Kit Malthouse (North West Hampshire) (Con)

    Can the Minister expand on the notion of “accredited technology”? The definition in the Bill is pretty scant as to where it will emerge from. Is he essentially saying that he is relying on the same industry that has thus far presided over the problem to produce the technology that will police it for us? Within that equation, which seems a little self-defeating, is it the case that if the technology does not emerge for one reason or another—commercial or otherwise—the Government will step in and devise, fund or otherwise create the technology required to be implemented?

    Paul Scully

    I thank my right hon. Friend. It is the technology sector that develops technology—it is a simple, circular definition—not the Government. We are looking to make sure that it has that technology in place, but if we prescribed it in the Bill, it would undoubtedly be out of date within months, never mind years. That is why it is better for us to have a rounded approach, working with the technology sector, to ensure that it is robust enough.

    Kit Malthouse

    I may not have been clear in my original intervention: my concern is that the legislation relies on the same sector that has thus far failed to regulate itself and failed to invent the technology that is required, even though it is probably perfectly capable of doing so, to produce the technology that we will then accredit to be used. My worry is that the sector, for one reason or another—the same reason that it has not moved with alacrity already to deal with these problems in the 15 years or so that it has existed—may not move at the speed that the Minister or the rest of us require to produce the technology for accreditation. What happens if it does not?

    Paul Scully

    Clearly, the Government can choose to step in. We are setting up a framework to ensure that we get the right balance and are not being prescriptive. I take issue with the idea that a lot of this stuff has not been invented, because there is some pretty robust work on age assurance and verification, and other measures to identify harmful and illegal material, although my right hon. Friend is right that it is not being used as robustly as it could be. That is exactly what we are addressing in the Bill.

    Mr David Davis (Haltemprice and Howden) (Con)

    My intervention is on the same point as that raised by my right hon. Friend the Member for North West Hampshire (Kit Malthouse), but from the opposite direction, in effect. What if it turns out that, as many security specialists and British leaders in security believe—not just the companies, but professors of security at Cambridge and that sort of thing—it is not possible to implement such measures without weakening encryption? What will the Minister’s Bill do then?

    Paul Scully

    The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.

    Damian Collins (Folkestone and Hythe) (Con)

    I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?

    Paul Scully

    My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will be a criminal liability under the Bill.

    Dr Luke Evans (Bosworth) (Con)

    To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation, but is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.

    Mr Speaker

    Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.

    Paul Scully

    Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.

    The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.

    Sajid Javid (Bromsgrove) (Con)

    My hon. Friend may know that there are third-party technology companies—developers of this accredited technology, as he calls it—that do not have access to all the data that might be necessary to develop technology to block the kind of content we are discussing. They need to be given the right to access that data from the larger platforms. Will Ofcom be able to instruct large platforms that have users’ data to make it available to third-party developers of technology that can help to block such content?

    Paul Scully

    Ofcom will be working with the platforms over the next few months—in the lead-up to the commencement of the Bill and afterwards—to ensure that the provisions are operational, so that we get them up and running as soon as practicably possible. My right hon. Friend is right to raise the point.

    Jim Shannon (Strangford) (DUP)

    In Northern Ireland we face the specific issue of the glorification of terrorism. Glorifying terrorism encourages terrorism. Is it possible that the Bill will stop that type of glorification, and therefore stop the terrorism that comes off the back of it?

    Paul Scully

    I will try to cover the hon. Member’s comments a little bit later, if I may, when I talk about some of the changes coming up later in the process.

    Moving away from CSEA, I am pleased to say that new clause 53 fulfils a commitment given by my predecessor in Committee to bring forward reforms to address epilepsy trolling. It creates the two specific offences of sending and showing flashing images to an individual with epilepsy with the intention of causing them harm. Those offences will apply in England, Wales and Northern Ireland, providing people with epilepsy with specific protection from this appalling abuse. I would like to place on record our thanks to the Epilepsy Society for working with the Ministry of Justice to develop the new clause.

    The offence of sending flashing images captures situations in which an individual sends a communication in a scatter-gun manner—for example, by sharing a flashing image on social media—and the more targeted sending of flashing images to a person who the sender knows or suspects is a person with epilepsy. It can be committed by a person who forwards or shares such an electronic communication as well as by the person sending it. The separate offence of showing flashing images will apply if a person shows flashing images to someone they know or suspect to have epilepsy by means of an electronic communications device—for example, on a mobile phone or a TV screen.

    The Government have listened to parliamentarians and stakeholders about the impact and consequences of this reprehensible behaviour, and my thanks go to my hon. Friends the Members for Watford (Dean Russell), for Stourbridge (Suzanne Webb), for Blackpool North and Cleveleys (Paul Maynard) and for Ipswich (Tom Hunt) for their work and campaigning. [Interruption.] Indeed, and the hon. Member for Batley and Spen (Kim Leadbeater), who I am sure will be speaking on this later.

    New clause 53 creates offences that are legally robust and enforceable so that those seeking to cause harm to people with epilepsy will face appropriate criminal sanctions. I hope that will reassure the House that the deeply pernicious activity of epilepsy trolling will be punishable by law.

    Suzanne Webb (Stourbridge) (Con)

    The Minister is thanking lots of hon. Members, but should not the biggest thanks go, first, to the Government for the inclusion of this amendment; and secondly, to Zach Eagling, the inspirational now 11-year-old who was the victim of a series of trolling incidents when flashing images were pushed his way after a charity walk? We have a huge amount to thank Zach Eagling for, and of course the amazing Epilepsy Society too.

    Paul Scully

    A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.

    Dean Russell (Watford) (Con)

    I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.

    Paul Scully

    I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.

    We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish or notify their users of enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details or notify its UK users about enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users about breaches of the duty in the Bill. It will help users make much more informed decisions about the services they use, and act as an additional deterrent factor for service providers.

    Dr Luke Evans

    It is fantastic to have the data released. Does the Minister have any idea how many of these notifications are likely to be put out there when the Bill comes in? Has any work been done on that? Clearly, having thousands of these come out would be very difficult for the public to understand, but half a dozen over a year might be very useful to understand which companies are struggling.

    Paul Scully

    I think this is why Ofcom has discretion, so that it can determine that. The most egregious examples are the ones people can learn from, and it is about doing this in proportion. My hon. Friend is absolutely right that if we are swamped with small notifications, this will be hidden in plain sight. That would not be useful, particularly for parents, to best understand what is going on. It is all about making more informed decisions.

    The House will be aware that we recently announced our intention to make a number of other changes to the Bill. We are making those changes because we believe it is vital that people can continue to express themselves freely and engage in pluralistic debate online. That is why the Bill will be amended to strengthen its provisions relating to children and to ensure that the Bill’s protections for adults strike the right balance with its protections for free speech.

    Dame Margaret Hodge (Barking) (Lab)

    The Minister is alluding, I assume, to the legal but harmful provision, but what does he think about this as an example? People are clever; they do not use illegal language. They will not say, “I want to kill all Jews”, but they may well—and do—say, “I want to harm all globalists.” What is the Minister’s view of that?

    Paul Scully

    The right hon. Lady and I have had a detailed chat about some of the abuse that she and many others have been suffering, and there were some particularly egregious examples. This Bill is not, and never will be, a silver bullet. This has to be worked through, with the Government acting with media platforms and social media platforms, and parents also have a role. This will evolve, but we first need to get back to the fundamental point that social media platforms are not geared up to enforce their own terms and conditions. That is ridiculous, a quarter of a century after the world wide web kicked in, and when social media platforms have been around for the best part of 20 years. We are shutting the stable door afterwards, and trying to come up with legislation two decades later.

    Mr Speaker

    Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.

    Paul Scully

    Thank you, Mr Speaker; I will try to keep my remarks very much in scope.

    The harmful communications offence in clause 151 was a reform to communication offences proposed in the Bill. Since the Bill has been made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and ensure that protections for free speech are robust.

    Jim Shannon

    This is about the protection of young people, and we are all here for the same reason, including the Minister. We welcome the changes that he is putting forward, but the Royal College of Psychiatrists has expressed a real concern about the mental health of children, and particularly about how screen time affects them. NHS Digital has referred to one in eight 11 to 16-year-olds being bullied. I am not sure whether we see in the Bill an opportunity to protect them, so perhaps the Minister can tell me the right way to do that.

    Paul Scully

    The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—

    Alex Davies-Jones (Pontypridd) (Lab)

    It is not in the Bill.

    Paul Scully

    That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.

    I talked about harmful communications, recognising that we could leave a potential gap in the criminal law. The Government have also decided not to repeal existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Along with planned changes to the harmful communications offence, we are making a number of additional changes to the Bill—that will come later, Mr Speaker, and I will not tread too much into that, as it includes the removal of the adult safety duties, often referred to as the legal but harmful provision. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.

    The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make sure that we expect platforms to use age assurance technology when identifying the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access to those below their minimum age, and enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.

    Alex Davies-Jones

    Which one?

    Paul Scully

    That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.

    Mike Amesbury (Weaver Vale) (Lab)

On age assurance, does the Minister not see a weakness? Lots of children and young people are far more sophisticated than many of us in the Chamber and will easily find a workaround, as they do now. The onus is being put on the children, so the Bill is not increasing regulation or the safety of those children.

    Paul Scully

    As I said, the social media platforms will have to put in place robust age assurance and age verification for material in an accredited form that is acceptable to Ofcom, which will look at that.

    Tackling violence against women and girls is a key priority for the Government. It is unacceptable that women and girls suffer disproportionately from abuse online, and it is right that we go further to address that through the Bill. That is why we will name the commissioner for victims and witnesses and the Domestic Abuse Commissioner as statutory consultees for the code of practice and list “coercive or controlling behaviour” as a priority offence. That offence disproportionately affects women and girls, and that measure will mean that companies will have to take proactive measures to tackle such content.

    Finally, we are making a number of criminal law reforms, and I thank the Law Commission for the great deal of important work that it has done to assess the law in these areas.

    Ruth Edwards (Rushcliffe) (Con)

    I strongly welcome some of the ways in which the Bill has been strengthened to protect women and girls, particularly by criminalising cyber-flashing, for example. Does the Minister agree that it is vital that our laws keep pace with the changes in how technology is being used? Will he therefore assure me that the Government will look to introduce measures along the lines set out in new clauses 45 to 50, standing in the name of my right hon. Friend the Member for Basingstoke (Dame Maria Miller), who is leading fantastic work in this area, so that we can build on the Government’s record in outlawing revenge porn and threats to share it?

    Paul Scully

    I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.

    The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.

    We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will use the Bill to criminalise that activity, providing users with protections from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.

    Charlotte Nichols

    Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.

    Paul Scully

    We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.

    Vicky Ford (Chelmsford) (Con)

    On images that promote self-harm, does the Minister agree that images that promote or glamourise eating disorders should be treated just as seriously as any other content promoting self-harm?

    Paul Scully

    I thank my right hon. Friend, who spoke incredibly powerfully at Digital, Culture, Media and Sport questions, and on a number of other occasions, about her particular experience. That is always incredibly difficult. Absolutely that area will be tackled, especially for children, but it is really important—as we will see from further changes in the Bill—that, with the removal of the legal but harmful protections, there are other protections for adults.

    Sajid Javid

    I think last year over 6,000 people died from suicide in the UK. Much of that, sadly, was encouraged by online content, as we saw from the recent coroner’s report into the tragic death of Molly Russell. On new clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), will the Minister confirm that the Government agree with the objectives of new clause 16 and will table an amendment to this Bill—to no other parliamentary vehicle, but specifically to this Bill—to introduce such a criminal offence? Will the Government amendment he referred to be published before year end?

    Paul Scully

    On self-harm, I do not think there is any doubt that we are absolutely aligned. On suicide, I have some concerns about how new clause 16 is drafted—it amends the Suicide Act 1961, which is not the right place to introduce measures on self-harm—but I will work to ensure we get this measure absolutely right as the Bill goes through the other place.

    Dame Caroline Dinenage (Gosport) (Con)

    Will my hon. Friend give way?

    Priti Patel

    Will my hon. Friend give way?

    Paul Scully

    I will give way first to one of my predecessors.

    Dame Caroline Dinenage

    I thank my hon. Friend for giving way. He is almost being given stereo questions from across the House, but I think they might be slightly different. I am very grateful to him for setting out his commitment to tackling suicide and self-harm content, and for his commitment to my right hon. Friend the Member for Chelmsford (Vicky Ford) on eating disorder content. My concern is that there is a really opaque place in the online world between what is legal and illegal, which potentially could have been tackled by the legal but harmful restrictions. Can he set out a little more clearly—not necessarily now, but as we move forward—how we really are going to begin to tackle the opaque world between legal and illegal content?

    Paul Scully

    If my hon. Friend will bear with me—I need to make some progress—I think that will be teased out today and in Committee, should the Bill be recommitted, as we amend the clauses relating directly to what she is talking about, and then as the Bill goes through the other place.

    Priti Patel

    Will the Minister give way?

    Paul Scully

    I will give way a final time before I finish.

    Priti Patel

    I am grateful to the Minister, who has taken a number of interventions. I fully agree with my hon. Friend the Member for Gosport (Dame Caroline Dinenage). This is a grey area and has consistently been so—many Members have given their views on that in previous stages of the Bill. Will the Minister come back in the later stages on tackling violence against women and girls, and show how the Bill will incorporate key aspects of the Domestic Abuse Act 2021, and tie up with the criminal justice system and the work of the forthcoming victims Bill? We cannot look at these issues in isolation—I see that the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is also on the Front Bench. Rather, they all have to be put together in a golden thread of protecting victims, making sure that people do not become victims, and ensuring that we go after the perpetrators—we must not forget that at all. The Minister will not be able to answer that now, but I would ask him to please do so in the latter stages.

    Paul Scully

    I talked about the fact that the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner will be statutory consultees, because it is really important that their voice is heard in the implementation of the Bill. We are also bringing in coercive control as one of the areas. That is so important when it comes to domestic abuse. Domestic abuse does not start with a slap, a hit, a punch; it starts with emotional abuse—manipulation, coercion and so on. That is why coercive abuse is an important point not just for domestic abuse, but for bullying, harassment and the wider concerns that the Bill seeks to tackle.

    Jamie Stone (Caithness, Sutherland and Easter Ross) (LD) rose—

    Paul Scully

    I will give way and then finish up.

    Jamie Stone

    I am one of three Scottish Members present, and the Scottish context concerns me. If time permits me in my contribution later, I will touch on a particularly harrowing case. The school involved has been approached but has done nothing. Education is devolved, so the Minister may want to think about that. It would be too bad if the Bill failed in its good intentions because of a lack of communication in relation to a function delivered by the Scottish Government. Can I take it that there will be the closest possible co-operation with the Scottish Government because of their educational responsibilities?

    Paul Scully

    There simply has to be. These are global companies and we want to make the Bill work for the whole of the UK. This is not an England-only Bill, so the changes must happen for every user, whether they are in Scotland, Northern Ireland, Wales or England.

    Debbie Abrahams (Oldham East and Saddleworth) (Lab)

    Will the Minister give way?

    Paul Scully

    I will make a bit of progress, because I am testing Mr Speaker’s patience.

    We are making a number of technical amendments to ensure that the new communications offences are targeted and effective. New clause 52 seeks to narrow the exemptions for broadcast and wireless telegraphy licence holders and providers of on-demand programme services, so that the licence holder is exempt only to the extent that communication is within the course of a licensed activity. A separate group of technical amendments ensure that the definition of sending false and threatening communications will capture all circumstances—that is far wider than we have at the moment.

    We propose a number of consequential amendments to relevant existing legislation to ensure that new offences operate consistently with the existing criminal law. We are also making a number of wider technical changes to strengthen the enforcement provisions and ensure consistency with other regulatory frameworks. New clause 42 ensures that Ofcom has the power to issue an enforcement notice to a former service provider, guarding against service providers simply shutting down their business and reappearing in a slightly different guise to avoid regulatory sanction. A package of Government amendments will set out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework.

    Finally, new clause 40 will enable the CMA to share information with Ofcom for the purpose of facilitating Ofcom’s online safety functions. That will help to ensure effective co-operation between Ofcom and the CMA.

    Dame Maria Miller (Basingstoke) (Con)

    I thank my hon. Friend for giving way. In the past 40 minutes or so, he has demonstrated the complexity of the changes that are being proposed for the Bill, and he has done a very good job in setting that out. However, will he join me and many other right hon. and hon. Members who feel strongly that a Standing Committee should look at the Bill’s implementation, because of the complexities that he has so clearly demonstrated? I know that is a matter for the House rather than our consideration of the Bill, but I hope that other right hon. and hon. Members will join me in looking for ways to put that right. We need to be able to scrutinise the measures on an ongoing basis.

    Paul Scully

    Indeed, there will be, and are, review points in the Bill. I have no doubt that my right hon. Friend will raise that on other occasions as well.

    I want to ensure that there is plenty of time for Members to debate the Bill at this important stage, and I have spoken for long enough. I appreciate the constructive and collaborative approach that colleagues have taken throughout the Bill’s passage.

    Debbie Abrahams rose—

    Paul Scully

    I will give way a final time.

    Debbie Abrahams

    I am grateful to the Minister. Does he support Baroness Kidron’s amendment asking for swift, humane access to data where there is a suspicion that online information may have contributed to a child’s suicide? That has not happened in previous instances; does he support that important amendment?

    Paul Scully

    I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.

    Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.

  • Dawn Bowden – 2022 Statement on a Culture Strategy for Wales

    The statement made by Dawn Bowden, the Welsh Deputy Minister for Arts & Sport and Chief Whip, on 15 November 2022.

    Developing a Culture Strategy for Wales is a key Programme for Government and Co-operation Agreement commitment within my portfolio.

As Wales begins to recover from the impacts of the Covid-19 pandemic, and at a time when people’s wellbeing and resilience are being adversely affected by rising costs of living and difficult financial forecasts, we must maintain a focus on those areas that make a positive difference to people’s everyday lives. We know that cultural and creative experiences are valued by the public, and that our arts, culture and heritage sectors contribute to personal wellbeing and community cohesion. I am pleased therefore to be able to share a short progress update on the development of a new culture strategy for Wales.

    Working with Plaid Cymru designated members, we have agreed that the scope of the strategy will include arts, museums, libraries, archives, and the historic environment, and it will look at how we can best support and develop these sectors in Wales. The strategy should consider, but not be limited to, the role of culture and the arts in promoting positive health and wellbeing, equalities, lifelong learning and skills, supporting digital developments in Wales, the visitor economy, and the Welsh language, together with resilience building to enable effective recovery from the pandemic and delivery on the requirements of the Future Generations Act.

    The Strategy will focus on how we can protect, conserve, and promote the arts, culture and historic assets and collections both now and for future generations. It will develop an inclusive, holistic approach to supporting our sectors and will have a focus on improving equitable access to and participation in all aspects of cultural life in Wales. It will also seek to enhance the close inter-operability of the arts, culture and heritage sectors, so they can collaborate more effectively, across sectors and in partnership with community groups and other stakeholders.

    Following a recent procurement exercise, a lead partner has been appointed to work collaboratively with Welsh Government to produce a new strategy for publication in 2023.

    Over the next few months, the contractor will undertake intensive research and engagement activity. This will involve working closely with partners across the arts, culture and heritage sectors, including but not limited to the four cultural sponsored bodies, Cadw, local sector organisations and people who work in these sectors on the ground. The contractor will also seek input from communities across Wales, especially those that are traditionally excluded or under-served.

    The development of the strategy will be supported by an Overarching Steering Group, which will scrutinise and critically evaluate progress on the development of the strategy, providing conceptual thinking and informed challenge to Welsh Government as required.

    My focus is on ensuring that the new strategy is innovative, ambitious and fit for purpose, and that it is a strategy that will be welcomed by the culture and heritage sectors and by the people of Wales. I will keep the Senedd informed of significant milestones as the work progresses.

  • Sarah Green – 2022 Parliamentary Question on Arts Council Funding

    Sarah Green – 2022 Parliamentary Question on Arts Council Funding

    The parliamentary question asked by Sarah Green, the Liberal Democrat MP for Chesham and Amersham, in the House of Commons on 1 December 2022.

    Sarah Green (Chesham and Amersham) (LD)

    What assessment she has made of the potential impact of Arts Council England funding decisions on leading cultural institutions.

    The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Stuart Andrew)

    Decisions about which organisations to fund and at what level were taken by the Arts Council, an arm’s length body from Government. The Arts Council remains committed to supporting the core cultural institutions. For example, three institutions that receive the most funding in the portfolio are the Royal Opera House, the National Theatre and the Southbank Centre. Many high-profile, established organisations such as the Royal Shakespeare Company and Opera North will continue to receive funding.

    Sarah Green

    I thank the Minister for his answer. Arts Council England’s decision to stop funding English National Opera in London and to effectively demand that it relocates to Manchester will leave hundreds of talented artists and professionals either out of work or forced to uproot their lives. Some of them live in my constituency and are understandably devastated by the decision that they now face, but they also feel blindsided, given that they had very little warning. Will the Minister tell me whether the artists directly impacted by the removal of ENO funding were consulted in advance of the decision? If not, why not?

    Stuart Andrew

    I know that the Arts Council has taken a considerable amount of time to look at the unprecedented number of applications—more than 1,700—that were received and that it has assessed them very carefully. It is making sure that £12.6 million is available in transition funding for those that will be leaving. The time has been increased from three months to seven months, so that there is support for them for up to 12 months. We would certainly encourage the Arts Council and the English National Opera to continue the dialogue that they are having.

    Mr Speaker

    I call the Chair of the Digital, Culture, Media and Sport Committee.

    Julian Knight (Solihull) (Con)

    On a similar theme, levelling up is undoubtedly a noble ambition, and the Arts Council funding has been too London-centric for too long, partly due to the subsidies to the Royal Opera House, which, if the Minister ever visits there, he will see is a bit like the Starship Enterprise, in terms of facilities. In correcting the imbalance, however, does he agree that the Arts Council needs to be careful about not potentially wrecking established institutions such as English National Opera, which was given very little notice of funding cuts? As a result, it is threatening legal action. A soft landing is needed. Does he agree that he needs to speak to the Arts Council to ensure that, when it makes such decisions in future, it has a plan in place to ensure that those institutions are at least protected and have a way in which to cope with the decision?

    Stuart Andrew

    I reiterate that the Arts Council is an arm’s length organisation. We have had several meetings to hear about the long processes that it has undertaken to consider each of the awards that it has made. We pushed it to increase the transition period of funding, recognising the difficulty that that may present to other people. We hope that both Arts Council England and English National Opera will work together—we certainly encourage them to—on the possibilities for the future of the organisation.

    Mr Speaker

    I call the shadow Minister.

    Barbara Keeley (Worsley and Eccles South) (Lab)

    We all support the fairer distribution of arts funding and the principle that communities outside London should get a fairer share so that everybody everywhere can enjoy the arts, but levelling up should not be about pitting arts organisations against one another. What we have seen is an attempt to address regional disparity by shifting some funding to the regions, but doing so from a funding pot that has been shrinking since 2010. Does the Minister agree that these very short timeframes and the lack of consultation on these cuts to funding could have a very damaging impact on the ecosystem of the arts?

    Stuart Andrew

    Well, I have to say that London will still be getting the lion’s share of funding from the Arts Council. I make no apology for what we are seeing in areas such as Blackburn, which had never received any funding: four projects there are now receiving funding. Why cannot talented artists in Blackburn get the same access to those opportunities as artists in London? I do not understand the problem.

  • Rupa Huq – 2022 Parliamentary Question on Arts Council Funding

    Rupa Huq – 2022 Parliamentary Question on Arts Council Funding

    The parliamentary question asked by Rupa Huq, the Independent MP for Ealing Central and Acton, in the House of Commons on 1 December 2022.

    Dr Rupa Huq (Ealing Central and Acton) (Ind)

    What steps she is taking to support the tourism sector and visitor economy.

    The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Stuart Andrew)

    The UK was one of the first countries to remove the barriers to both domestic and international visitors, and set out a post-covid tourism recovery plan in summer 2021. An inter-ministerial group for the visitor economy was formed this year, and will meet again in December to discuss cross-departmental policy priorities in support of this important sector.

    Dr Huq

    With the axe looming over the English National Opera and the Donmar Warehouse—both national attractions that have helped the tourism the Minister has described to be a multibillion-pound industry for so many years—and local newbies such as the Ealing Project venue and ActOne cinema facing a tough environment with the post-covid footfall downturn and looming bills, could the Government, now that they are in reset mode, reconsider the impact of Arts Council cuts on London so that we can get tourism flowing through our capital again, from centre to suburb?

    Stuart Andrew

    The Arts Council is an arm’s length body; it makes the decisions and has done so very carefully. It is working with various organisations that will be leaving the funding. However, it is right that we share the funding around the rest of the country; I make no apology for that. I want people not just to come to London to visit our wonderful facilities here, but to go around the whole country and experience what a great country we have to offer for tourism.

    Mrs Pauline Latham (Mid Derbyshire) (Con)

    I welcome what the Minister said about spreading the money around the country. I invite him to come to the Derwent valley mills world heritage site, which is key to the whole of the spine that goes through Derbyshire. It is in disrepair and we need to get tourism back on track for Belper in particular. I would also like him to come to adjacent sites where we have “the clusters”, which are very ancient roads, to see how he can help with some funding.

    Stuart Andrew

    It would be great to go from Qatar to Derbyshire and I would be more than happy to accept my hon. Friend’s invitation. She is right to talk about the many opportunities that we need to look at, including, particularly, the offer in the rest of the country for tourism from not just this country, but around the globe. One of my priorities is to get more people to come to London, of course, but then to visit other great counties such as Yorkshire, as I am sure you would agree, Mr Speaker.