Category: Technology

  • Gareth Snell – 2021 Comments on Digital Exclusion

    The comments made by Gareth Snell, the former Labour MP for Stoke-on-Trent Central, on 5 January 2021.

    I sincerely believe the Government simply doesn’t know the level of digital exclusion that exists in some parts of the U.K. Simply saying ‘switch to remote learning’ doesn’t address access to laptops, the internet or at-home ICT literacy to support that learning.

  • Gavin Newlands – 2020 Speech on Online Harms

    The speech made by Gavin Newlands, the SNP MP for Paisley and Renfrewshire North, in the House of Commons on 19 November 2020.

    We have had another excellent, if curtailed, debate today. I thank the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright), the hon. Member for Kingston upon Hull North (Dame Diana Johnson) and the hon. Member for Congleton (Fiona Bruce) for securing it and the Backbench Business Committee for facilitating it. I do not have time to discuss and praise the various speeches that we have had, but I particularly praise the right hon. and learned Member for Kenilworth and Southam, who opened the debate. I thought his speech was fantastic and immensely powerful; nobody could ignore what he said. Take note, Minister: if an SNP Member and a Tory Member can agree so wholeheartedly, actions surely must follow.

    We spend more and more of our time online, whether we are interacting with others or are passive consumers of content—the growth of Netflix is testament to the latter. As we spend more time online, the harms that were historically the preserve of the physical world are shifting to the online world. We have seen the growth in online financial scams and their increasing sophistication.

    I have a number of constituents, as I am sure do other hon. Members, who have been scammed out of tens of thousands of pounds and lost everything, in part because the scammers were able to manipulate Google keyword advertising to drive traffic to their site and begin the scamming process. The pandemic and lockdown have seen an increase in those scams, as the perpetrators know people are spending more time online than normal.

    Since the start of the pandemic, the level of disinformation around vaccination and healthcare has grown exponentially. Anti-vaxxers have already targeted the newly developed vaccines that we all hope will get us out of this situation. Such disinformation campaigns have always been dangerous, particularly for young people who are usually the main recipients of vaccines, but now present an even bigger danger to public health.

    These lies—that is what they are—are propagated via the platforms of social media companies, which should have a responsibility to tackle such anti-science, anti-reason and anti-fact campaigns quickly and directly. It is not good enough for Mark Zuckerberg and the like to parrot free speech as if it were a “get out of jail free” card. Free speech comes with responsibilities; it does not give people the right to place others at risk of illness and death.

    Just as children were most at risk from the anti-vaxxers until the pandemic hit, it is children who are most at risk from online harassment and abuse, in particular young women and girls. A recent report by Plan International on girls’ rights in the digital world makes extremely depressing reading. More than a fifth of girls have received abuse on a photo or status they have posted, and nearly a quarter have felt harassed by someone contacting them regularly on social media. The net result of the abuse, harassment and pressure is that nearly half of all girls are afraid to give their opinions on social media, for fear of the response, and 13% have stopped going on social media completely to avoid negative responses. Less than a week before the international day for the elimination of violence against women and girls, those figures are shocking.

    A toxic environment is stopping women and girls participating in the online world on the same basis as boys and men. It feeds into a dangerous and violent misogyny that is on the rise on social media, again largely unchecked by the big tech companies until it becomes a big PR issue. It is no surprise that so many executive positions in those companies are occupied by men and so few by women.

    For most households, online communication is now a fundamental part of daily life, whether it is streaming content or keeping in touch with family and friends on social media, but too often the regulation of online activities that cause harm seems to be stuck in the last century, when the internet was something we read about in newspapers or heard about on one of our four TV channels. The world has moved on dramatically in the past two decades, but the legislative framework has not. It is especially important that the victims of online harms, whether it be abuse, harassment or financial scams, feel able to report their experiences to the police or other relevant authorities. If big tech will not act, it falls to the Government to protect our citizens.

    I understand that the pressures on the Government at the moment are absolutely huge, but so are the risks for individuals and for society the longer these harms are allowed to proliferate. I urge the Government to heed the contributions of Members right across the House and bring forward concrete plans to introduce the Bill as soon as possible.

  • Stephen Timms – 2020 Speech on Online Harms

    The speech made by Stephen Timms, the Labour MP for East Ham, in the House of Commons on 19 November 2020.

    I want to raise just two points: first, the current epidemic of online frauds; and, secondly, the online sale of the illegal weapons used on our streets in gang violence.

    First, the Pension Scams Industry Group has told the current Work and Pensions Committee inquiry that 40,000 people have suffered the devastation of being scammed out of their pension in five years. Much of that is online. Mark Taber told us that this year he has reported 380 scam adverts on Google to the Financial Conduct Authority. It is a crime, but after weeks or months the FCA just issues a warning. The Transparency Task Force told us of

    “high-profile, known crooks…running rings around the regulators”,

    and:

    “Paid keyword search is a highly efficient means for pensions & savings scammers to target their victims.”

    Another witness told us that there is

    “a big increase in social media scams”.

    Which? said that

    “we need to look at what sort of responsibilities should be given to those online platforms to protect their users from scams.”

    A director at Aviva told us that it

    “had to take down 27 fake domains linked to our brand… It is very difficult and it takes a very long time to engage the web domain providers to get it down.”

    He called big technology companies “key enablers of fraud”, and he made a call

    “to extend the Online Harms Bill to include the advertising of fraudulent investments”.

    I think that should be done, and I want to ask the Minister if it will be in the legislation.

    Secondly, the Criminal Justice Act 1988 bans the sale and import of a list of weapons: disguised knives, butterfly knives, flick knives, gravity knives, stealth knives, zombie knives, sword sticks, push daggers, blowpipes, telescopic truncheons and batons. But all of them are available online for delivery in the post. That is how most weapons used on the streets in London are obtained. As we debated in the Offensive Weapons Bill Committee in 2018, companies should not sell in the UK products that it is illegal to purchase here.

    The Under-Secretary of State for the Home Department, the hon. Member for Louth and Horncastle (Victoria Atkins), said in Committee that the Home Office was working with the Department for Digital, Culture, Media and Sport on these online harms, and looking at

    “what more we can do to ensure…companies act responsibly and do not facilitate sales of ‘articles with a blade or point’ or ‘corrosive products’ in their platforms.”––[Official Report, Offensive Weapons Public Bill Committee, 11 September 2018; c. 280.]

    What I want to ask the Minister is: will that promise be fulfilled in the coming legislation?

  • Damian Hinds – 2020 Speech on Online Harms

    The speech made by Damian Hinds, the Conservative MP for East Hampshire, in the House of Commons on 19 November 2020.

    There are so many aspects to this, including misinformation on the pandemic, disinformation and foreign influence operations, harassment, engagement algorithms, the effect on our politics and public discourse, the growth in people gambling on their own, scammers and chancers, and at the very worst end, radicalisation and, as we have heard from many colleagues, sexual exploitation. I am grateful to the Backbench Business Committee for granting time for the debate, but this is not one subject for debate but about a dozen, and it needs a lot more time at these formative stages, which I hope the Government will provide. My brief comments will be specifically about children.

    When I was at the Department for Education, I heard repeatedly from teenagers who were worried about the effect on their peers’ mental health of the experience of these curated perfect lives, with the constant scoring of young people’s popularity and attractiveness and the bullying that no longer stops when a young person comes through their parents’ front door but stays with them overnight. I heard from teachers about the effect of technology on sleep and concentration and on taking too much time from other things that young people should be doing in their growing up. I take a lot of what will be in this legislation as read, so what I will say is not an exclusive list, but I have three big asks of what the legislation and secondary legislation should cover for children. By children, I mean anybody up to the age of 16 or 18. Let us not have any idea that there is a separate concept of a digital age of consent that is in some way different.

    First, the legislation will of course tackle the promotion of harms such as self-harm and eating disorders, but we need to go further and tackle the prevalence and normalisation of content related to those topics so that fewer young people come across it in the first place. Secondly, on compulsive design techniques such as autoplay, infinite scroll and streak rewards, I do not suggest that the Government should get in the business of designing applications, but there need to be natural breaks, just as there always were when children’s telly came to an end or the coins ran out at the amusement arcade, to go and do something else. Actually, we need to go further, with demetrification—an ugly word but an important concept—because children should not be worrying about their follower-to-following ratio or how many likes they get when they post a photograph. Bear in mind that Facebook managed to survive without likes up to 2009.

    Thirdly, we need to have a restoration of reality, discouraging and, at the very least, clearly marking doctored photos and disclosing influencers’ product placements and not allowing the marketing of selfie facial enhancements to young children. It is not only about digital literacy and resilience, though that plays a part. The new material in schools from this term is an important step, but it will need to be developed further.

    It has always been hard growing up, but it is a lot harder to do it live in the glare of social media. This generation will not get another chance at their youth. That is why, yes, it is important that we get it right, but it is also important that we get it done and we move forward now.

  • Stephen Doughty – 2020 Speech on Online Harms

    The speech made by Stephen Doughty, the Labour MP for Cardiff South and Penarth, in the House of Commons on 19 November 2020.

    Many of us took part in a debate on these issues in Westminster Hall recently. I do not want to repeat all the comments I made then, but I have seen the wide range of online harms in my constituency of Cardiff South and Penarth, and the online harms leading to real-world harms, violence and hatred on our streets.

    In that Westminster Hall debate, I spoke about the range of less well-known platforms that the Government must get to grips with—the likes of Telegram, Parler, BitChute and various other platforms that are used by extremist organisations. I pay tribute to the work that HOPE not Hate and other organisations are doing. I declare an interest as a parliamentary friend of HOPE not Hate and commend to the Minister and the Government its excellent report on online regulation that was released just this week.

    I wish to give one example of why it is so crucial that the Government act, and act now, and it relates to the behaviour of some of the well-known platforms. In the past couple of weeks, I have spoken to one of those platforms: YouTube—Google. It is not the first time that I have spoken to YouTube; I have previously raised concerns about its content on many occasions as a member of the Home Affairs Committee. It was ironic to be asked to take part in a programme to support local schools on internet safety and being safe online, when at the same time YouTube, despite my personally having reported instances of far-right extremism, gang violence and other issues that specifically affect my constituency, has refused to remove that content. YouTube has not removed it, despite my reporting it.

    I am talking about examples of gang videos involving convicted drug dealers in my constituency; videos of young people dripping in simulated blood after simulated stabbings; videos encouraging drug dealing and violence and involving young people as actors in a local park, just hundreds of metres from my own house—but they have not been removed, on grounds of legitimate artistic expression. There are examples of extremist right-wing organisations promoting hatred against Jews, black people and the lesbian, gay, bisexual and transgender community that I have repeatedly reported, but they were still on there at the start of this debate. The only conclusion I can draw is that these companies simply do not give a damn about what the public think, what parents think, what teachers think, what all sides of the House think, what Governments think or what the police think, because they are failing to act, having been repeatedly warned. That is why the Government must come in and regulate, and they must do it sooner rather than later.

    We need to see action taken on content relating to proscribed organisations—I cannot understand how that content is online when those organisations are proscribed by the Government—where there are clear examples of extremism, hate speech and criminality. I cannot understand why age verification is not used even as a minimum standard on some of these gang videos and violent videos, which perhaps could be justified in some parallel world, when age verification is used for other content. Some people talk about free speech. The reality is that these failures are leading to a decline in freedom online and in safety for our young people.

  • Chris Elmore – 2020 Speech on Online Harms

    The speech made by Chris Elmore, the Labour MP for Ogmore, in the House of Commons on 19 November 2020.

    I thank the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright) for securing the debate with the hon. Member for Congleton (Fiona Bruce). I pay particular tribute to him, because when he was Culture Secretary, he and Margot James, who is no longer in this place, spearheaded this legislation. They are a credit to the House for ensuring that this was a priority for the Government then. I know how important the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Boston and Skegness (Matt Warman), thinks this is, but some of us—me included—have been talking about this issue for more than three and a half years, and this Bill needs to come forward. The delays just are not acceptable, and too many people are at risk.

    I pay tribute to the hon. Member for Folkestone and Hythe (Damian Collins) for not only his speech but his chairmanship of the DCMS Committee, a role he performed without fear or favour. He took on the platforms, and they did not like it. All credit to him for standing up for what he believes in and trying to take on these giants.

    In the two minutes I have left, I want to talk about the inquiry of my all-party parliamentary group on social media in relation to child harm, which the right hon. and learned Member for Kenilworth and Southam touched on. The Internet Watch Foundation is a charity that works with the tech industry and is partly funded by it. It also works with law enforcement agencies and is funded by the Government and currently by the European Union. It removes self-generated images of child abuse. It removes URLs hosting images of children who have been coerced and groomed into taking images of themselves in a way that anyone in this House would find utterly disgusting and immoral. That is its sole, core purpose.

    The problem is extremely complex. The IWF has seen a 50% increase in public reports of suspected child abuse over the past year, but the take-down rate of URLs has dropped by 89%. I have pressed DCMS Ministers and Cabinet Office Ministers to ensure that IWF funding will continue, to address the fact that these URLs are not being taken down and to put more resources into purposefully tackling this abhorrent problem of self-generated harm, whether the children are groomed through platforms, live streaming or gaming.

    The platforms have not gone far enough. They are not acknowledging the problem in front of them. I honestly believe that if a future Bill provides the power for the platforms to decide what is appropriate and for Ofcom to make recommendations or fine them on that basis, it is a flawed system.

    It is self-regulation with a regulator—it does not make any sense. The platforms themselves say that it does not work.

    In closing, will the Minister please—please—get a grip on the issues that the IWF is raising, continue its funding, and do all that he can to protect children from the harm that many of them face in their bedrooms and homes across the UK?

  • Damian Collins – 2020 Speech on Online Harms

    The speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 19 November 2020.

    I congratulate my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) on his excellent speech introducing this debate. We need to be clear that the online harms White Paper response from the Government is urgently needed, as is the draft Bill. We have been discussing this for several years now. When I was Chair of the Digital, Culture, Media and Sport Committee, we published a report in the summer of 2018 asking for intervention on online harms and calling for a regulatory system based on a duty of care placed on the social media companies to act against harmful content.

    There are difficult decisions to be made in assessing what harmful content is and assessing what needs to be done, but I do not believe those decisions should be made solely by the chief executives of the social media companies. There should be a legal framework that they have to work within, just as people in so many other industries do. It is not enough to have an online harms regulatory system based just on the terms and conditions of the companies themselves, in which all Parliament and the regulator can do is observe whether those companies are administering their own policies.

    We must have a regulatory body that has an auditing function and can look at what is going on inside these companies and the decisions they make to try to remove and eliminate harmful hate speech, medical conspiracy theories and other more extreme forms of harmful or violent content. Companies such as Facebook say that they remove 95% of harmful content. How do we know? Because Facebook tells us. Has anyone checked? No. Can anyone check? No; we are not allowed to check. Those companies have constantly refused to allow independent academic bodies to go in and scrutinise what goes on within them. That is simply not good enough.

    We should be clear that we are not talking about regulating speech. We are talking about regulating a business model. It is a business model that prioritises the amplification of content that engages people, and it does not care whether or not that content is harmful. All it cares about is the engagement. So people who engage in medical conspiracy theories will see more medical conspiracy theories. A young person who engages with images of self-harm will see more images of self-harm. No one is stepping in to prevent that. How do we know that Facebook did all it could to stop the live broadcast of a terrorist attack in Christchurch, New Zealand? No one knows. We have only Facebook’s word for it, and the scale of that problem could have been a lot worse.

    The tools and systems of these companies are actively directing people to harmful content. People often talk about how easy it is to search for this material. Companies such as Facebook will say, “We downgrade this material on our site to make it hard to find,” but they direct people to it. People are not searching for it—it is being pushed at them. Some 70% of what people watch on YouTube is selected for them by YouTube, not searched for by them. An internal study done by Facebook in Germany in 2016, which the company suppressed and which was leaked to the media this year, showed that 60% of people who joined Facebook groups that shared extremist material did so at the recommendation of Facebook, because they had engaged with material like that before. That is what we are trying to regulate—a business model that is broken—and we desperately need to move on with online harms.

  • Diana Johnson – 2020 Speech on Online Harms

    The speech made by Diana Johnson, the Labour MP for Kingston upon Hull North, in the House of Commons on 19 November 2020.

    I pay tribute to the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright) and the hon. Member for Congleton (Fiona Bruce) for securing this debate. Today is World Children’s Day, when we are asked to imagine a better future for every child, and I will focus my remarks on an online harm that the Government could act on quickly to protect our children. Commercial pornography websites are profiteering from exposing children in the UK to hardcore violent pornography—pornography that it would be illegal to sell to children offline and that it would be illegal to sell even to adults, unless purchased in a licensed sex shop.

    Three years ago, Parliament passed legislation to close this disastrous regulation gap. Three years on, the Government have still not implemented it. Assurances that the regulation gap will be filled by the forthcoming online harms legislation do not stand up to objective scrutiny. This is a child protection disaster happening now, and the Government could and, I hope, will act now.

    Children are being exposed to online pornography on an alarming scale, and during the covid-19 pandemic, there is no doubt that the figures will have increased even more, with children more often having unsupervised online access. The issue is the widespread availability and severity of online pornography accessible at home. It is no longer about adult magazines on the top shelf in the newsagent. Contemporary pornography is also overwhelmingly violent and misogynistic, and it feeds and fuels the toxic attitudes that we see particularly towards women and girls.

    Back in 2017, Parliament passed part 3 of the Digital Economy Act. Enacted, it would prohibit commercial pornography websites from making their content available to anyone under the age of 18 and create a regulator and an enforcement mechanism. It was backed by the leading children’s charities, including the National Society for the Prevention of Cruelty to Children and Barnardo’s, as well as the majority of parents. However, in 2019, the Government announced that they would not be implementing part 3 of the 2017 Act. In their initial response to the online harms White Paper consultation in February, the Government said that any verification

    “will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions”.

    That is not good enough. Parliament has already spoken. We have said what we want to happen. I expect the Government to build on part 3 of the 2017 Act. It is set out and is ready to go. They should act on it now.

  • Jeremy Wright – 2020 Speech on Online Harms

    The speech made by Jeremy Wright, the Conservative MP for Kenilworth and Southam, in the House of Commons on 19 November 2020.

    I beg to move,

    That this House recognises the need to take urgent action to reduce and prevent online harms; and urges the Government to bring forward the Online Harms Bill as soon as possible.

    The motion stands in my name and those of the hon. Member for Kingston upon Hull North (Dame Diana Johnson) and my hon. Friend the Member for Congleton (Fiona Bruce). I begin by thanking the Backbench Business Committee for finding time for what I hope the House will agree is an important and urgent debate. I am conscious that a great number of colleagues wish to speak and that they have limited time in which to do so, so I will be as brief as I can. I know also that there are right hon. and hon. Members who wished to be here to support the motion but could not be. I mention, in particular, my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Digital, Culture, Media and Sport Committee, who is chairing the Committee as we speak.

    I hope that today’s debate will largely be about solutions, but perhaps we should begin with the scale of the problem. The term “online harms” covers many things, from child sexual exploitation to the promotion of suicide, hate speech and intimidation, disinformation perpetrated by individuals, groups and even nation states, and many other things. Those problems have increased with the growth of the internet, and they have grown even faster over recent months as the global pandemic has led to us all spending more time online.

    Let me offer just two examples. First, between January and April this year, as we were all starting to learn about the covid-19 virus, there were around 80 million interactions on Facebook with websites known to promulgate disinformation on that subject. By contrast, the websites of the World Health Organisation and the US Centers for Disease Control and Prevention each had around 6 million interactions. Secondly, during roughly the same period, online sex crimes recorded against children were running at more than 100 a day. The online platforms have taken some action to combat the harms I have mentioned, and I welcome that, but it is not enough, as the platforms themselves mostly recognise.

    Sir John Hayes (South Holland and The Deepings) (Con)

    You may have noticed, Mr Deputy Speaker, that I am ostentatiously wearing purple. I have been missioned to do so because it is World Pancreatic Cancer Day. We have been asked to emphasise it, because raising awareness of that disease is important.

    My right hon. and learned Friend is right to highlight the horror of degrading and corrupting pornography. Indeed, the Government have no excuse for not doing more, because the Digital Economy Act 2017 obliges them to do so. Why do we not have age verification, as was promised in that Act and in our manifesto? It is a straightforward measure that the Government could introduce to save lives in the way my right hon. and learned Friend describes.

    Jeremy Wright

    I agree with my right hon. Friend, but I will be careful, Mr Deputy Speaker, in what I say about age verification, because I am conscious that a judicial review case is in progress on that subject. However, I agree that that is something that we could and should do, and not necessarily in direct conjunction with an online harms Bill.

    Digital platforms should also recognise that a safer internet is, in the end, good for business. Their business model requires us to spend more and more time online, and we will do that only if we feel safe there. The platforms should recognise that Governments must act in that space, and that people of every country with internet access quite properly expect them to. We have operated for some time on the principle that what is unacceptable offline is unacceptable online. How can it be right that actions and behaviours that cause real harm and would be controlled and restricted in every other environment, whether broadcast media, print media or out on the street, are not restricted at all online?

    I accept that freedom of speech online is important, but I cannot accept that the online world is somehow sacred space where regulation has no place regardless of what goes on there. Given the centrality of social media to modern political debate, should we rely on the platforms alone to decide which comments are acceptable and which are unacceptable, especially during election campaigns? I think not, and for me the case for online regulation is clear. However, it must be the right kind of regulation—regulation that gives innovation and invention room to grow, that allows developing enterprises to offer us life-enhancing services and create good jobs, but that requires those enterprises to take proper responsibility for their products and services, and for the consequences of their use. I believe that that balance is to be found in the proposed duty of care for online platforms, as set out in the Government’s White Paper of April last year.

    I declare an interest as one of the Ministers who brought forward that White Paper at the time, and I pay tribute to all those in government and beyond, including the talented civil servants at the Department for Digital, Culture, Media and Sport, who worked so hard to complete it.

    This duty of care requires all online companies that deal with user-generated content to keep those who use their platforms as safe as they reasonably can.

    Jim Shannon (Strangford) (DUP)

    We have covered some important information. Does the right hon. and learned Gentleman agree that there needs to be a new social media regulator with the power to audit and inspect social media algorithms to ensure that they do not cause harm? Such a regulator would enable that to happen.

    Jeremy Wright

    I agree that we need a regulator and will come on to exactly that point. The hon. Gentleman is entirely right, for reasons that I will outline in just a moment.

    I recognise that what I am talking about is not the answer to every question in this area, but it would be a big step towards a safer online world if designed with sufficient ambition and implemented with sufficient determination. The duty of care should ask nothing unreasonable of the digital platforms. It would be unreasonable, for example, to suggest that every example of harmful content reaching a vulnerable user would automatically be a breach of the duty of care. Platforms should be obliged to put in place systems to protect their users that are as effective as they can be, not that achieve the impossible.

    However, meeting that duty of care must mean doing more than is being done now. It should mean proactively scanning the horizon for those emerging harms that the platforms are best placed to see and designing mitigation for them, not waiting for terrible cases and news headlines to prompt action retrospectively. The duty of care should mean changing algorithms that prioritise the harmful and the hateful because they keep our attention longer and cause us to see more adverts. When a search engine asked about suicide shows a how-to guide on taking one’s own life long before it shows the number for the Samaritans, that is a design choice. The duty of care needs to require a different design choice to be made. When it comes to factual inquiries, the duty of care should expect the prioritisation of authoritative sources over scurrilous ones.

    It is reasonable to expect these things of the online platforms. Doing what is reasonable to keep us safe must surely be the least we expect of those who create the world in which we now spend so much of our time. We should legislate to say so, and we should legislate to make sure that it happens. That means regulation, and as the hon. Gentleman suggests, it means a regulator—one that has the independence, the resources and the personnel to set and investigate our expectations of the online platforms. For the avoidance of doubt, our expectations should be higher than the platforms’ own terms and conditions. However, if the regulator we create is to be taken seriously by these huge multinational companies, it must also have the power to enforce our expectations. That means that it must have teeth and a range of sanctions, including individual director liability and site blocking in extreme cases.

    We need an enforceable duty of care for online platforms to begin making the internet a safer place. Here is the good news for the Minister, who I know understands this agenda well. So often, such debates are intended to persuade the Government to change direction, to follow a different policy path. I am not asking the Government to do that, but rather to continue following the policy path they are already on—I just want them to move faster along that path. I am not pretending that it is an easy path. There will be complex and difficult judgments to be made and significant controversy in what will be groundbreaking and challenging legislation, but we have shied away from this challenge for far too long.

    The reason for urgency is not only that, while we delay, lives continue to be ruined by online harms, sufficient though that is. It is also because we have a real opportunity and the obligation of global leadership here. The world has looked with interest at the prospectus we have set out on online harms regulation, and it now needs to see us follow through with action so that we can leverage our country’s well-deserved reputation for respecting innovation and the rule of law to set a global standard in a balanced and effective regulatory approach. We can only do that when the Government bring forward the online harms Bill for Parliament to consider and, yes, perhaps even to improve. We owe it to every preyed-upon child, every frightened parent and everyone abused, intimidated or deliberately misled online to act, and to act now.

  • Amanda Solloway – 2020 Comments on First Woman on the Moon

    The comments made by Amanda Solloway, the Science Minister, on 13 October 2020.

    The prospect of the first woman landing on the Moon in the coming years will be a source of inspiration for thousands of young people across the UK who may be considering a career in space or science.

    Today’s historic agreement, backed by £16 million of UK funding, underlines our commitment to strengthening the UK’s role in the global space sector, building on our existing strengths in satellites, robotics and communications to grow our economy and improve life on Earth.