  • Carla Lockhart – 2022 Speech on the Online Safety Bill

    The speech made by Carla Lockhart, the DUP MP for Upper Bann, in the House of Commons on 5 December 2022.

    I welcome the fact that we are here today to discuss the Bill. It has been a long haul, and we were often dubious as to whether we would see it progressing. The Government have done the right thing by progressing it, because ultimately, as each day passes, harm is being caused by the lack of regulation and enforcement. While some concerns have been addressed, many have not. To that end, this must be not the end but the beginning of a legislative framework that is fit for purpose; one that is agile and keeps up with the speed at which technology changes. For me, probably the biggest challenge for the House and the Government is not how we start but how we end on these issues.

    Like many Members, I am quite conflicted when it comes to legal but harmful content. I know that is a debate for another day, but I will make one short point. I am aware of the concerns about free speech. As someone of faith, I am cognisant of the outrageous recent statement from the Crown Prosecution Service that it is “no longer appropriate” to quote certain parts of the Bible in public. I would have serious concerns about similar diktats and censorship being imposed by social media platforms on what are perfectly legitimate texts, and beliefs based on those texts. Of course, that is just one example, but it is a good example of why, because of the ongoing warfare of some on certain beliefs and opinions, it would be unwise to bestow such policing powers on social media outlets.

    When the Bill was first introduced, I made it very clear that it needed to be robust in its protection of children. In the time remaining, I wish to address some of the amendments that would strengthen the Bill in that regard, as well as the enforcement provisions.

    New clause 16 is a very important amendment. None of us would wish to endure the pain of a child or loved one self-harming. Sadly, we have all been moved by the very personal accounts from victims’ families of the pain inflicted by self-harm. We cannot fathom what is in the mind of those who place such content on the internet. The right hon. Member for Haltemprice and Howden (Mr Davis) and those co-signing the new clause have produced a very considered and comprehensive text, dealing with all the issues in terms of intent, degree of harm and so on, so I fully endorse and welcome new clause 16.

Likewise, new clauses 45 and 46 would further strengthen the legislation by protecting children from the sharing of an intimate image without consent. Unfortunately, I have sat face to face—as I am sure many in this House have—with those who have been impacted by such cruel use of social media. The pain and humiliation it imposes on the victim are significant. It can cause scars that last a lifetime. While the content can be removed, the impact cannot be removed from the mind of the victim.

    Finally, I make mention of new clause 53. Over recent months I have engaged with campaigners who champion the rights and welfare of those with epilepsy. Those with this condition need to be safe on the internet from the very specific and callous motivation of those who target them because of their condition. We make this change knowing that such legislative protection will increase online protection. Special mention must once again go to young Zach, who has been the star in making this change. What an amazing campaign, one that says to society that no matter how young or old you are, you can bring about change in this House.

This is a milestone Bill. I believe it brings great progress in offering protections from online harm. I believe it can be further strengthened in areas such as pornography. We only have to consider that the British Board of Film Classification found that children as young as seven are coming across pornography online, with 51% of 11 to 13-year-olds having seen pornography at some point. That is damaging people’s mental health and their perception of what a healthy relationship should look and feel like. Ultimately, the Bill does not go far enough on that issue. It will be interesting to see how the other place deals with the Bill and makes changes to it. The day of the internet being the wild west, lawless for young and old, must end. I commend the Bill to the House.

  • Natalie Elphicke – 2022 Speech on the Online Safety Bill

    The speech made by Natalie Elphicke, the Conservative MP for Dover, in the House of Commons on 5 December 2022.

    I rise to speak to new clause 55, which stands in my name. I am grateful to my many right hon. and hon. Friends who have supported it, both by putting their name to it and otherwise. I welcome the Minister and his engagement with the new clause and hope to hear from him further as we move through the debate.

    The new clause seeks to create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration. Members may have wondered how so many people—more than 44,000 this year alone—know who to contact to cross the channel, how to go about it and how much it will cost. Like any business, people smuggling relies on word of mouth, a shopfront or digital location on the internet, and advertising. As I will set out, in this context advertising is done not through an advert in the local paper but by posting a video and photos online.

People who use the channel crossing routes come from an astonishing array of countries—from Eritrea and Vietnam to Iraq and Iran—but they all end up arriving on boats that leave from France. Since May 2022, there has been a massive increase in the number of Albanians crossing the channel in small boats. From May to September this year, Albanian nationals comprised 42% of small boat crossings, with more than 11,000 Albanians arriving by small boats, compared with 815 the entire previous year. It is little wonder that it is easy to find criminal gangs posting in Albanian on TikTok with videos showing cheery migrants with thumbs up scooting across the channel on dinghies and motoring into Britain with ease. Those videos have comments, which have been roughly translated as:

    “At 8 o’clock the next departure, hurry to catch the road”;

    “They passed again today! Get in touch today”;

    “Get on the road today, serious escape within a day, not…a month in the forest like most”;

    “The trips continue, contact us, we are the best and the fastest”;

    and

    “Every month, safe passage, hurry up”.

    However, far from being safe, the small boat crossings are harmful, dangerous and connected with serious crime here in the UK, including modern slavery, the drugs trade and people trafficking.

    With regard to the journey, there have been a number of deaths at sea. The Minister for Immigration recently stated that many people in processing centres

    “present with severe burns that they have received through the combination of salty water and diesel fuel in the dinghies.”—[Official Report, 28 November 2022; Vol. 723, c. 683.]

    That, of course, underlines why prevention, detection and interception of illegal entry is so important on our sea border. It also speaks to the harm and prevention of harm that my new clause seeks to address: to identify and disrupt the ability of those gangs to post on social media and put up photographs, thereby attracting new business, and communicate in relation to their illegal activity.

    The National Crime Agency has identified links with the criminal drugs trade, modern slavery and other serious and violent crime. That is because illegal immigration and modern slavery offences do not just happen abroad. A criminal enterprise of this scale has a number of operators both here in the UK and abroad. That includes people here in the UK who pay for the transit of another. When they do, they do not generally have the good fortune of that other individual in mind. There are particular concerns about young people and unaccompanied children as well as people who find themselves in debt bondage in modern slavery.

    That also includes people here in the UK who provide information, such as those TikTok videos, to a friend or contacts in a home country so that other people can make their own arrangements to travel. It includes people here in the UK who take photos of arrivals and post or message them to trigger success fees. Those fees are the evidence-based method of transacting in this illegal enterprise and are thought to be responsible for some of the most terrifying experiences of people making the crossing, including even a pregnant woman and others being forced into boats at gunpoint and knifepoint in poor weather when they did not want to go, and parents separated from their children at the water’s edge, with their children taken and threatened to coerce them into complying.

    Last year, 27 people died in the channel in a single day, in the worst small boat incident to date. A newspaper report about those deaths contains comment about a young man who died whose name was Pirot. His friend said of the arrangements for the journey:

    “Typically…the smugglers made deals with families at home. Sometimes they turned up at the camp in masks. The crossing costs about £3,000 per person, with cash demanded in full once their loved one had made it to Dover. One of the Iraqi Kurdish smugglers who arranged Pirot’s crossing has since deleted his Facebook page and WhatsApp account”.

    TikTok, WhatsApp and Facebook have all been identified as platforms actively used by the people smugglers. Action is needed in the Bill’s remit to protect people from people smugglers and save lives in the channel. The new offence would ensure that people here in the UK who promote illegal immigration and modern slavery face a stronger deterrent and, for the first time, real criminal penalties for their misdeeds. It would make it harder for the people smugglers to sell their wares. It would help to protect people who would be exploited and put at risk by those criminal gangs. The risk to life and injury, the risk of modern slavery, and the risks of being swept into further crime, both abroad and here in the UK, are very real.

    The new offence would be another in the toolbox to tackle illegal immigration and prevent modern slavery. I hope that when the Minister makes his remarks, he may consider further expansion of other provisions currently in the Bill but outside the scope of our discussions, such as the schedule 7 priority offences. New clause 55 would tackle the TikTok traffickers and help prevent people from risking their lives by taking these journeys across the English channel.

  • Jamie Stone – 2022 Speech on the Online Safety Bill

    The speech made by Jamie Stone, the Liberal Democrat MP for Caithness, Sutherland and Easter Ross, in the House of Commons on 5 December 2022.

    Clearly I am on my feet now because I am the Liberal Democrat DCMS spokesman, but many is the time when, in this place, I have probably erred on the side of painting a rosy picture of my part of the world—the highlands—where children can play among the heather and enjoy themselves, and life is safe and easy. This week just gone I was pulled up short by two mothers I know who knew all about today. They asked whether I would be speaking. They told me of their deep concern for a youngster who is being bullied right now, to the point where she was overheard saying among her family that she doubted she would ever make the age of 21. I hope to God that that young person, who I cannot name, is reached out to before we reach the tragic level of what we have heard about already today. Something like that doesn’t half put a shadow in front of the sun, and a cold hand on one’s heart. That is why we are here today: we are all singing off the same sheet.

    The Liberal Democrats back new clause 17 in the name of the right hon. Member for Barking (Dame Margaret Hodge). Fundamental to being British is a sense of fair play, and a notion that the boss or bosses should carry the can at the end of the day. It should not be beyond the wit of man to do exactly what the right hon. Lady suggested, and nobble those who ultimately hold responsibility for some of this. We are pretty strong on that point.

    Having said all that, there is good stuff in the Bill. Obviously, it has been held up by the Government—or Governments, plural—which is regrettable, but it is easy to be clever after the fact. There is much in the Bill, and hopefully the delay is behind us. It has been chaotic, but we are pleased with the direction in which we are heading at the moment.

I have three or four specific points. My party welcomes the move to expand existing offences on sharing intimate images of someone to include those that are created digitally, known as deepfakes. We also warmly welcome the move to create a new criminal offence of assisting or encouraging self-harm online, although I ask the Government for more detail on that as soon as possible. Thirdly, as others have mentioned, the proposed implementation of Zach’s law will make it illegal to send flashing images that target people with epilepsy.

    If the pandemic taught me one thing, it was that “media-savvy” is not me. Without my young staff who helped me during that period, it would have been completely beyond my capability to Zoom three times in one week. Not everyone out there has the assistance of able young people, which I had, and I am very grateful for that. One point that I have made before is that we would like to see specific objectives—perhaps delivered by Ofcom as a specific duty—on getting more media savvy out there. I extol to the House the virtue of new clause 37, tabled by my hon. Friend the Member for Twickenham (Munira Wilson). The more online savvy we can get through training, the better.

    At the end of the day, the Bill is well intentioned and, as we have heard, it is essential that it makes a real impact. In the case of the young person I mentioned who is in a dark place right now, we must get it going pretty dashed quick.

  • Suzanne Webb – 2022 Speech on the Online Safety Bill

    The speech made by Suzanne Webb, the Conservative MP for Stourbridge, in the House of Commons on 5 December 2022.

    This is the first time I have been able to speak in the Chamber for some time, due to a certain role I had that prevented me from speaking in here. It is an absolute honour and privilege, on my first outing in some time, to have the opportunity to speak specifically to new clause 53, which is Zach’s law. I am delighted and thrilled that the Government are supporting Zach’s law. I have supported it for more than two years, together with my hon. Friend the Member for Watford (Dean Russell). We heard during the Joint Committee on the Draft Online Safety Bill how those who suffer from epilepsy were sent flashing images on social media by vile trolls. Zach Eagling, whom the law is named after, also has cerebral palsy, and he was one of those people. He was sent flashing images after he took part in a charity walk around his garden. He was only nine years of age.

Zach is inspirational. He is selflessly making a massive difference, and the new clause is world-leading. It is down to Zach, his mum, the UK Epilepsy Society, and of course the Government, that I am able to stand here to talk about new clause 53. I believe that the UK Epilepsy Society is the only charity in the world to have changed the law in this policy area, through new clause 53, which is pretty ground-breaking. I say thank you to Zach and the Epilepsy Society, who ensured that I and my hon. Friend the Member for Watford stepped up and played our part in that.

Being on the Joint Committee on the Draft Online Safety Bill was an absolute privilege, with the excellent chairmanship of my hon. Friend the Member for Folkestone and Hythe (Damian Collins). People have been talking about the Bill’s accompanying Committee, which is an incredibly good thing. In the Joint Committee we talked about this: we should follow the Bill through all its stages, and also once it is on the statute books, to ensure that it keeps up with those tech companies. The Joint Committee was put together with a focus on bringing together the right skills. I am a technological luddite, but I brought my skills and understanding of audit and governance. My hon. Friend the Member for Watford brought technology and all his experience from his previous day job. As a result we had a better Bill by having a mix of experience and sharing our expertise.

This Bill is truly world leading. New clause 53 is one small part of that, but it will make a huge difference to thousands of lives including, I believe, the 600,000 people who suffer from epilepsy. The simple reality is that the big tech companies can do better and need to step up. I have always said that we do not actually need the Bill or these amendments; we need the tech companies to do what they are supposed to do, and go out and regulate their consumer product. I have always strongly believed that.

    During my time on the Committee I learned that we must follow the money—that is what it is all about for the tech companies. We have been listening to horrific stories from grieving parents, some of whom I met briefly, and from those who suffered at the hands of racism, abuse, threats—the list is endless. The tech companies could stop that now. They do not need the Bill to do it and they should do the right thing. We should not have to get the Bill on to the statute books to enforce what those companies should be doing in the first place. We keep saying that this issue has been going on for five years. The tech companies know that this has been talked about for five years, so why are they not doing something? For me the Bill is for all those grieving families who have lost their beautiful children, those who have been at the mercy of keyboard warriors, and those who have received harm or lost their lives because the tech companies have not, but could have, done better. This is about accountability. Where are the tech companies?

I wish to touch briefly on bereaved parents whose children have been at the mercy of technology and content. Many families have spent years and years still unable to understand their child’s death. We must consider imposing transparency on the tech companies. Those families cannot get their children back, but they are working hard to ensure that others do not lose theirs. Data should be given to coroners in the event of the death of a child to understand the circumstances. This is important to ensure there is a swift and humane process for the coroner to access information where there is reason to suspect that online content has impacted on a child’s death.

    In conclusion, a huge hurrah that we have new clause 53, and I thank the Government for this ground-breaking Bill. An even bigger hurrah to Zach, Zach’s mum, and the brilliant Epilepsy Society, and, of course, to Zach’s law, which is new clause 53.

  • Liz Twist – 2022 Speech on the Online Safety Bill

    The speech made by Liz Twist, the Labour MP for Blaydon, in the House of Commons on 5 December 2022.

    I wish to address new clauses 16 and 28 to 30, and perhaps make a few passing comments on some others along the way. Many others who, like me, were in the Chamber for the start of the debate will I suspect feel like a broken record, because we keep revisiting the same issues and raising the same points again and again, and I am going to do exactly that.

First, I will speak about new clause 16, which would create a new offence of encouraging or assisting serious self-harm. I am going to do so because I am the chair of the all-party parliamentary group on suicide and self-harm prevention, and we have done a good deal of work on looking at the issue of self-harm and young people in the last two years. We know that suicide is the leading cause of death in men aged under 50 and women aged under 35, with the latest available figures confirming that 5,583 people in England and Wales tragically took their own lives in 2021. We know that self-harm is a strong risk factor for future suicidal ideation, so it is really important that we tackle this issue.

    The internet can be an invaluable and very supportive place for some people who are given the opportunity to access support, but for other people it is difficult. The information they see may provide access to content that acts to encourage, maintain or exacerbate self-harm and suicidal behaviours. Detailed information about methods can also increase the likelihood of imitative and copycat suicide, with risks such as contagion effects also present in the online environment.

    Richard Burgon (Leeds East) (Lab)

    I pay tribute to my hon. Friend for the work she has done. She will be aware of the case of my constituent Joe Nihill, who at the age of 23 took his own life after accessing suicide-related material on the internet. Of course, we fully support new clause 16 and amendment 159. A lot of content about suicide is harmful, but not illegal, so does my hon. Friend agree that what we really need is assurances from the Minister that, when this Bill comes back, it will include protections to ensure that adults such as Joe, who was aged 23, and adults accessing these materials through smaller platforms are fully protected and get the protection they really need?

    Liz Twist

    I thank my hon. Friend for those comments, and I most definitely agree with him. One of the points we should not lose sight of is that his constituent was 23 years of age—not a child, but still liable to be influenced by the material on the internet. That is one of the points we need to take forward.

    It is really important that we look at the new self-harm offence to make sure that this issue is addressed. That is something that the Samaritans, which I work with, has been campaigning for. The Government have said they will create a new offence, which we will discuss at a future date, but there is real concern that we need to address this issue as soon as possible through new clause 16. I ask the Minister to comment on that so that we can deal with the issue of self-harm straightaway.

    I now want to talk about internet and media literacy in relation to new clauses 29 and 30. YoungMinds, which works with young people, is supported by the Royal College of Psychiatrists, the British Psychological Society and the Mental Health Foundation in its proposals to promote the public’s media literacy for both regulated user-to-user services and search services, and to create a strategy to do this. Young people, when asked by YoungMinds what they thought, said they wanted the Online Safety Bill to include a requirement for such initiatives. YoungMinds also found that young people were frustrated by very broad, generalised and outdated messages, and that they want much more nuanced information—not generalised fearmongering, but practical ways in which they can address the issue. I do hope that the Government will take that on board, because if people are to be protected, it is important that we have a more sophisticated media literacy than is reflected in the broad messages we sometimes get at present.

    On new clause 28, I do believe there is a need for advocacy services to be supported by the Government to assist and support young people—not to take responsibilities away from them, but to assist and protect them. I want to make two other points. I see that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) has left the Chamber again, but he raised an interesting and important point about the size of platforms covered by the Bill. I believe the Bill needs to cover those smaller or specialised platforms that people might have been pushed on to by changes to the larger platforms. I hope the Government will address that important issue in future, together with the issue of age, so that protection does not stop just with children, and we ensure that others who may have vulnerabilities are also protected.

    I will not talk about “legal but harmful” because that is not for today, but there is a lot of concern about those provisions, which we thought were sorted out and agreed on, suddenly being changed. There is a lot of trepidation about what might come in future, and the Minister must understand that we will be looking closely at any proposed changes.

    We have been talking about this issue for many years—indeed, since I first came to the House—and during the debate I saw several former Ministers and Secretaries of State with whom I have raised these issues. It is about time that we passed the Bill. People out there, including young people, are concerned and affected by these issues. The internet and social media are not going to stop because we want to make the Bill perfect. We must ensure that we have something in place. The legislation might be capable of revision in future, but we need it now for the sake of our young people and other vulnerable people who are accessing online information.

  • John Penrose – 2022 Speech on the Online Safety Bill

    The speech made by John Penrose, the Conservative MP for Weston-super-Mare, in the House of Commons on 5 December 2022.

    It is a pleasure to follow Zach’s MP, the hon. Member for Batley and Spen (Kim Leadbeater). I particularly want to pick up on her final comments about the difficulties of platforms—not just small platforms, but larger ones—hosting extremist content, be it incels, the alt-right, the radical left or any other kind.

    I will speak to my new clauses 34 and 35, which seek to deal with both disinformation and misinformation. They are important amendments, because although the Bill has taken huge steps forward—we are led to believe that it may take a couple more in due course when the revised version comes back if the recommittal is passed—there are still whole categories of harm that it does not yet address. In particular, it focuses, rightly and understandably, on individual harms to children and illegal activities as they relate to adults, but it does not yet deal with anything to do with collective harms to our society and our democracy, which matter too.

    We have heard from former journalists in this debate. Journalists know it takes time and money to come up with a properly researched, authoritatively correct, accurate piece of journalism, but it takes a fraction of that time and cost to invent a lie. A lie will get halfway around the world before the truth has got its boots on, as the saying rightly goes. Incidentally, the hon. Member for Rotherham (Sarah Champion) said that it is wonderful that we are all learning so much. I share that sentiment; it is marvellous that we are all comparing and sharing our particular areas of expertise.

    One person who seems to have all areas of expertise under his belt is my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who chaired the Joint Committee. He rightly pointed out that this is a systems Bill, and it therefore deals with trying to prevent some things from happening—and yet it is completely silent on misinformation and disinformation, and their effect on us collectively, as a society and a democracy. New clauses 34 and 35 are an attempt to begin to address those collective harms alongside some individual harms we face. One of them deals with a duty of balance; the other deals with factual accuracy.

    The duty of balance is an attempt to address the problem as it relates to filter bubbles, because this is a systems Bill and because each of us has a tailored filter bubble, by which each of the major platforms, and some of the minor ones, work out what we are interested in and feed us more of the same. That is fine for people who are interested in fishing tackle; that is super. But if someone is interested in incels and they get fed more and more incel stuff, or they are vaguely left wing and get taken down a rabbit hole into the increasingly radical left—or alternatively alt-right, religious extremism or whatever it may be—pretty soon they get into echo chambers, and from echo chambers they get into radicalisation, and from radicalisation they can pretty soon end up in some very murky, dark and deep waters.

There are existing rules for other old-world broadcasters; the BBC, ITV and all the other existing broadcasters have a duty of balance and undue prominence imposed on them by Ofcom. My argument is that we should consider ways to impose a similar duty of balance on the people who put together the programs that create our own individual filter bubbles, so that when someone is shown an awful lot of stuff about incels, or alt-right or radical left politics, somewhere in that filter bubble they will be sent something saying, “You do know that this is only part of the argument, don’t you? Do you know that there is another side to this? Here’s the alternative; here’s the balancing point.” We are not doing that at the moment, which is one of the reasons we have an increasingly divided societal and political debate, and why our public square as a society is becoming increasingly fractious—and dangerous, in some cases. New clause 35 would fix that particular problem.

    New clause 34 would deal with the other point—the fact that a lie will get halfway around the world before the truth has got its boots on. It tries to deal with factual accuracy. Factual accuracy is not quite the same thing as truth. Truth is an altogether larger and more philosophical concept to get one’s head around. It is how we string together accurate and correct facts to create a narrative or an explanation. Factual accuracy is an essential building block for truth. We must at least try to ensure that we can all see when someone has made something up or invented something, whether it is that bleach is a good way to cure covid or whatever. When somebody makes something up, we need to know and it needs to be clear. In many cases that is clear, but in many cases, if it is a plausible lie, a deepfake or whatever it may be, it is not clear. We need to be able to see that easily, quickly and immediately, and say, “I can discount this, because I know that the person producing it is a serial liar and tells huge great big porkies, and I shouldn’t be trusting what they are sending me, or I can see that the actual item itself is clearly made up.”

    The duty of achieving balance already exists in rules and law in other parts of our society and is tried and tested—it has stood us very well and done a good job for us for 40 or 50 years, since TV and radio became ubiquitous—and the same is true, although not for quite such a long time, for factual accuracy. There are increasingly good methods of checking the factual accuracy of individual bits of content, and if necessary, in some cases of doing so in real time, too. For example, Adobe is leading a very large global grouping producing something called the Content Authenticity Initiative, which can tell if something is a deepfake, because it has an audit trail of where the image, the item or whatever it may be came from and how it has been updated, modified or changed during the course of its life.

    Dean Russell

    On that point, I want to raise the work that my hon. Friend the Member for Bosworth (Dr Evans), who is not in the Chamber at the moment, has done on body image. When images are photoshopped and changed to give an idea of beauty that is very different from what is possible in the real world, that very much falls into the idea of truth. What are my hon. Friend’s thoughts on that point?

    John Penrose

    Addressing that is absolutely essential. That goes for any of the deepfake examples we have heard about, including from my right hon. Friend the Member for Basingstoke (Dame Maria Miller): the whole point about deepfakes is that they are hard to spot, but if we can see that something has been changed, we can say, “I know that is not right, I know that is not true, I know that is false, and I can aim away from it and treat it accordingly”.

    Just to make sure that everybody understands, this is not some piece of new tech magic; it is already established. Adobe, as I have said, is doing it with the Content Authenticity Initiative, which is widely backed by other very serious tech firms. Others in the journalism world are doing the same thing, with the Journalism Trust Initiative. There is NewsGuard, which produces trust ratings; the Trust Project, which produces trust indicators; and we of course have our own press regulators in this country, the Independent Press Standards Organisation and IMPRESS.

    I urge the Government and all here present not to be satisfied with where this Bill stands now. We have all heard how it can be improved. We have all heard that this is a new, groundbreaking and difficult area in which many other countries have not even got as far as we have, but we should not be in any way satisfied with where we are now. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said earlier that we need to approach this Bill in a spirit of being humble, and this is an area in which humility is absolutely essential. I hope all of us realise how much further we have to go, and I hope the Minister will say how he proposes to address these important and so far uncovered issues in due course.

  • Kim Leadbeater – 2022 Speech on the Online Safety Bill

    The speech made by Kim Leadbeater, the Labour MP for Batley and Spen, in the House of Commons on 5 December 2022.

    I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.

    I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.

    On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.

    I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms, which I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.

    Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.

    We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.

    Let me turn to media literacy—a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that they are there and know exactly how to use them, and unless they are being used, their existence is pointless.

    Ofcom recently found that more than a third of children aged eight to 17 said they had seen something “worrying or nasty” online in the past 12 months, but only a third of children knew how to use online reporting or flagging functions. Among adults, a third of internet users were unaware of the potential for inaccurate or biased information online, and just over a third made no appropriate checks before registering their personal details online. Clearly, far more needs to be done to ensure that internet users of all ages are aware of online dangers and of the tools available to keep them safe.

    Although programmes such as Google’s “Be Internet Legends” assemblies are a great resource in schools—I was pleased to visit one at Park Road Junior Infant and Nursery School in Batley recently—we cannot rely on platforms to do this themselves. We have had public information campaigns on the importance of wearing seatbelts, and on the dangers of drink-driving and smoking, and the digital world is now one of the largest dangers most people face in their daily lives. The public sector clearly has a role to warn of the dangers and promote healthy digital habits.

    Let me give one example from the territory of legal but harmful content, which Members have described as opaque, challenging and thorny. I agree with all those comments, but if platforms have a tool within them that switches off legal but harmful content, it strikes me as incredibly important that users know what that tool does—that is, they know what information they may be subjected to if it is switched on, and they know exactly how to turn it off. Yet I have heard nothing from the Government since their announcement last week that suggests they will be taking steps to ensure that this tool is easily accessible to users of all ages and digital abilities, and that is exactly why there is a need for a proper digital media literacy strategy.

    I therefore support new clauses 29 and 30, tabled by my colleagues in the SNP, which would empower Ofcom to publish a strategy at least every three years that sets out the measures it is taking to promote media literacy among the public, including through educational initiatives and by ensuring that platforms take the steps needed to make their users aware of online safety tools.

    Finally, I turn to the categorisation of platforms under part 7 of the Bill. I feel extremely strongly about this subject and agree with many comments made by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). The categorisation system listed in the Bill is not fit for purpose. I appreciate that categorisation is largely covered in part 3 and schedule 10, but amendment 159, which we will be discussing in Committee, and new clause 1, which we are discussing today, are important steps towards addressing the Government’s implausible position—that the size of a platform equates to the level of risk. As a number of witnesses stated in Committee, that is simply not the case.

    It is completely irresponsible and narrow-minded to believe that there are no blind spots in which small, high-risk platforms can fester. I speak in particular about platforms relating to dangerous, extremist content—be it Islamist, right-wing, incel or any other. These platforms, which may fall out of the scope of the Bill, will be allowed to continue to host extremist individuals and organisations, and their deeply dangerous material. I hope the Government will urgently reconsider that approach, as it risks inadvertently pushing people, including young people, towards greater harm online, whether to individuals or to society as a whole.

    Although I am pleased that the Bill is back before us today, I am disappointed that aspects have been weakened since we last considered it, and urge the Government to consider closely some proposals we will vote on this evening, which would go a considerable way to ensuring that the online world is a safer place for children and adults, works in the interests of users, and holds platforms accountable and responsible for protecting us all online.

  • Adam Afriyie – 2022 Speech on the Online Safety Bill

    The speech made by Adam Afriyie, the Conservative MP for Windsor, in the House of Commons on 5 December 2022.

    I am pleased to follow my fairly close neighbour from Berkshire, the hon. Member for Reading East (Matt Rodda). He raised the issue of legal but harmful content, which I will come to, as I address some of the amendments before us.

    I very much welcome the new shape and focus of the Bill. Our primary duty in this place has to be to protect children, above almost all else. The refocusing of the Bill certainly does that, and it is now in a position where hon. Members from all political parties recognise that it is so close to fulfilling its function that we want it to get through this place as quickly as possible with today’s amendments and those that are forthcoming in the Lords and elsewhere in future weeks.

    The emerging piece of legislation is better and more streamlined. I will come on to further points about legal but harmful content, but I am pleased to see that removed from the Bill for adults and I will explain why, given the sensitive case that the hon. Member for Reading East mentioned. The information that he talked about being published online should be illegal, so it would be covered by the Bill. Illegal information should not be published and, within the framework of the Bill, would be taken down quickly. We in this place should not shirk our responsibilities; we should make illegal the things that we and our constituents believe to be deeply harmful. If we are not prepared to do that, we cannot wash our hands of the matter and leave some third party, whether a commercial company or a regulator without those specific powers, to begin making the rules on our behalf.

    I welcome the shape of the Bill, but some great new clauses have been tabled. New clause 16 suggests that we should make it an offence to encourage self-harm, which is fantastic. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) has indicated that he will not press it to a vote, because the Government and all of us acknowledge that that needs to be dealt with at some point, so hopefully an amendment will be forthcoming in the near future.

    On new clause 23, it is clear that if a commercial company is perpetrating an illegal act or is causing harm, it should pay for it, and a proportion of that payment must certainly support the payments to victims of that crime or breach of the regulations. New clauses 45 to 50 have been articulately discussed by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The technology around revenge pornography and deepfakes is moving forward every day. With some of the fakes online today, it is not possible to tell that they are fakes, even if they are looked at under a microscope. Those areas need to be dealt with, but it is welcome that she will not necessarily press the new clauses to a vote, because those matters must be picked up and defined in primary legislation as criminal acts. There will then be no lack of clarity and we will not need the legal but harmful concept—that will not need to exist. Something will either be illegal, because it is harmful, or not.

    The Bill is great because it provides a framework that enables everything else that hon. Members in the House and people across the country may want to be enacted at a future date. It also enables the power to make those judgments to remain with this House—the democratically elected representatives of the people—rather than some grey bureaucratic body or commercial company whose primary interest is rightly to make vast sums of money for its shareholders. It is not for them to decide; it is for us to decide what is legal and what should be allowed to be viewed in public.

    On amendment 152, which interacts with new clause 11, I was in the IT industry for about 15 to 20 years before coming to this place, albeit with a previous generation of technology. When it comes to end-to-end encryption, I am reminded of King Canute, who said, “I’m going to pass a law so that the tide doesn’t come in.” Frankly, we cannot pass a law that bans mathematics, which is effectively what we would be trying to do if we tried to ban encryption. The nefarious types or evildoers who want to hide their criminal activity will simply use mathematics to do that, whether in mainstream social media companies or through a nefarious route. We have to be careful about getting rid of all the benefits of secure end-to-end encryption for democracy, safety and protection from domestic abuse—all the good things that we want in society—on the basis of a tiny minority of very bad people who need to be caught. We should not be seeking to ban encryption; we should be seeking to catch those criminals, and there are ways of doing so.

    I welcome the Bill; I am pleased with the new approach and I think it can pass through this House swiftly if we stick together and make the amendments that we need. I have had conversations with the Minister about what I am asking for today: I am looking for an assurance that the Government will enable further debate and table the amendments that they have suggested. I also hope that they will be humble, as my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said, and open to some minor adjustments, even to the current thinking, to make the Bill pass smoothly through the Commons and the Lords.

    I would like the Government to confirm that it is part of their vision that it will be this place, not a Minister of State, that decides every year—or perhaps every few months, because technology moves quickly—what new offences need to be identified in law. That will mean that Ofcom and the criminal justice system can get on to that quickly to ensure that the online world is a safer place for our children and a more pleasant place for all of us.

  • Matt Rodda – 2022 Speech on the Online Safety Bill

    The speech made by Matt Rodda, the Labour MP for Reading East, in the House of Commons on 5 December 2022.

    I am grateful to have the opportunity to speak in this debate. I commend the right hon. Member for Basingstoke (Dame Maria Miller) on her work in this important area. I would like to focus my remarks on legal but harmful content and its relationship to knife crime, and to mention a very harrowing and difficult constituency case of mine. As we have heard, legal but harmful content can have a truly dreadful effect. I pay tribute to the families of the children who have been lost, who have attended the debate, a number of whom are still in the Public Gallery.

    Madam Deputy Speaker (Dame Rosie Winterton)

    Just to be clear, the hon. Gentleman’s speech must relate to the amendments before us today.

    Matt Rodda

    Thank you, Madam Deputy Speaker. A boy called Olly Stephens in my constituency was just 13 years old when he was stabbed and brutally murdered in an attack linked to online bullying. He died, sadly, very near his home. His parents had little idea of the social media activity in his life. It is impossible to imagine what they have been through. Our hearts go out to them.

    Harmful but legal content played a terrible part in the attack on Olly. The two boys who attacked and stabbed him had been sharing enormous numbers of pictures and videos of knives, repeatedly, over a long period of time. There were often videos of teenagers playing with knives, waving them or holding them. They circulated them on 11 different social media platforms. None of those platforms took any action to take the content down. We all need to learn more about such cases to fully understand the impact of legal but harmful content. Even at this late stage, I hope that the Government will think again about the changes they have made to the Bill and include this area again in the Bill.

    There is a second aspect of this very difficult case that I want to mention: the fact that Olly’s murder was discussed on social media and was planned to some extent beforehand. The wider issues here underline the need for far greater regulation and moderation of social media, in particular teenagers’ use of these powerful sites. I am finding it difficult to talk about some of these matters, but I hope that the Government will take my points on board and address the issue of legal but harmful content, and that the Minister will think again about these important matters. Perhaps we will have an opportunity to discuss it in the Bill’s later stages.

  • Maria Miller – 2022 Speech on the Online Safety Bill

    The speech made by Maria Miller, the Conservative MP for Basingstoke, in the House of Commons on 5 December 2022.

    I rise to speak to the seven new clauses in my name and those of right hon. and hon. Members from across the House. The Government have kindly said publicly that they are minded to listen to six of the seven amendments that I have tabled on Report. I hope they will listen to the seventh, too, once they have heard my compelling arguments.

    First, I believe it is important that we discuss these amendments, because the Government have not yet tabled amendments. It is important that we in this place understand the Government’s true intention on implementing the Law Commission review in full before the Bill completes its consideration.

    Secondly, the law simply does not properly recognise as a criminal offence the posting online of intimate images—whether real or fake—without consent. Victims say that having a sexual image of them posted online without their consent is akin to a sexual assault. Indeed, Clare McGlynn went even further by saying that there is a big difference between a physical sexual assault and one committed online: victims are always rediscovering the online images and waiting for them to be redistributed, and cannot see when the abuse will be over. In many ways, it is even more acute.

    Just in case anybody in the Chamber is unaware of the scale of the problem after the various contributions that have been made, in the past five years more than 12,000 people reported to the Revenge Porn Helpline almost 200,000 pieces of content that fall into that category. Indeed, since 2014 there have been 28,000 reports to the police of intimate images being distributed without consent.

    The final reason why I believe it is important that we discuss the new clauses is that Ofcom will be regulating online platforms based on their adherence to the criminal law, among other things. It is so important that the criminal law actually recognises where criminal harm is done, but at the moment, when it comes to intimate image abuse, it does not. Throughout all the stages of the Bill’s passage, successive Ministers have said very positive things to me about the need to address this issue in the criminal law, but we still have not seen pen being put to paper, so I hope the Minister will forgive me for raising this yet again so that he can respond.

    New clauses 45 to 50 simply seek to take the Law Commission’s recommendations on intimate image abuse and put them into law as far as the scope of the Bill will allow. New clause 45 would create a base offence for posting explicit images online without consent. Basing the offence on consent, or the lack of it, makes it comparable with three out of four offences already recognised in the Sexual Offences Act 2003. Subsection (10) of the new clause would make it a criminal offence to distribute fake images, deepfakes or images created using nudification software, which are currently not covered in law at all.

    New clauses 46 and 47 recognise cases where there is a higher level of culpability for the perpetrator, where they intend to cause alarm, distress or humiliation. Two in three victims report that they know the perpetrators, as a current or former partner. In evidence to the Public Bill Committee, on which I was very pleased to serve, we heard from the Anjelou Centre and Imkaan that some survivors of this dreadful form of abuse are also at risk of honour-based violence. There are yet more layers of abuse.

    New clause 48 would make it a crime to threaten to share an intimate image—this can be just as psychologically destructive as actually sharing it—and using the image to coerce, control or manipulate the victim. I pay real tribute to the team from the Law Commission, under the leadership of Penney Lewis, who did an amazing job of work over three years on their inquiry to collect this information. In the responses to the inquiry there were four mentions of suicide or contemplated suicide as a result of threats to share these sorts of images online without consent. Around one in seven young women and one in nine young men have experienced a threat to share an intimate or sexual image. One in four calls to the Revenge Porn Helpline relate to threats to share. The list of issues goes on. In 2020 almost 3,000 people, mostly men, received demands for money related to sexual images—“sextortion”, as it is called. This new clause would make it clear that such threats are criminal, the police need to take action and there will be proper protection for victims in law.

    New clauses 49 and 50 would go further. The Law Commission is clear that intimate image abuse is a type of sexual offending. Therefore, victims should have the same protection afforded to those of other sexual offences. That is backed up by the legal committee of the Council of His Majesty’s District Judges, which argues that it is appropriate to extend automatic lifetime anonymity protections to victims, just as they would be extended to victims of offences under the Modern Slavery Act 2015. Women’s Aid underlined that point, recognising that black and minoritised women are also at risk of being disowned, ostracised or even killed if they cannot remain anonymous. The special measures in these new clauses provide for victims in the same way as the Domestic Abuse Act 2021.

    I hope that my hon. Friend the Minister can confirm that the Government intend to introduce the Law Commission’s full recommendations into the Bill, and that those in scope will be included before the Bill reaches its next stage in the other place. I also hope that he will outline how those measures not in scope of the Bill—specifically on the taking and making of sexual images without consent, which formed part of the Law Commission’s recommendations—will be addressed in legislation swiftly. I will be happy to withdraw my new clauses if those undertakings are made today.

    Finally, new clause 23, which also stands in my name, is separate from the Law Commission’s recommendations. It would require a proportion of the fines secured by Ofcom to be used to fund victims’ services. I am sure that the Treasury thinks that it is an innovative way of handling things, although one could argue that it did something similar only a few days ago with regard to the pollution of waterways by water companies. I am sure that the Minister might want to refer to that.

    The Bill recognises that many thousands more offences are being committed than the criminal law currently captures. I hope that the Minister can outline how appropriate measures will be put in place to ensure support for victims, who will now, possibly for the first time, have some measures in place to assist them. I raised earlier the importance of keeping the Bill and its effectiveness under review. I hope that the House will think about how we do that materially, so that we do not end up going another five or 10 years without such a Bill and having to play catch-up in such a complex area.