
Damian Collins – 2022 Speech on the Online Safety Bill

The speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 5 December 2022.

As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

It is easy to consider the Bill on Report as it now stands, thinking about some areas where Members think it goes too far and other areas where Members think it does not quite go far enough, but let us not lose sight of the fact that we are establishing a world-leading regulatory system. It is not the first in the world, but it goes further than any other system in the world in the scope of offences. Companies will have to show, as a priority, that they are identifying and mitigating the harm of unlawful activity. A regulator will be empowered to understand what is going on inside the companies, to challenge them on the way that they enforce their codes and to hold them to account for that. We currently have the ability to do none of those things. Creating a regulator with that statutory power, and the power to fine and to demand evidence and information, is really important.

The case of Molly Russell has rightly been cited as so important many times in this debate. The hardship lay not just in the tragedy that the family had to endure, and in the cold, hard, terrible fact—presented by the coroner—that social media platforms had contributed to the death of their daughter, but in the years it took for the family and the coroner, going about his lawful duty, to get hold of the information that was required and to bring it to people’s attention. I have had conversations with social media companies about how they combat self-harm and suicide, including with TikTok about what it was doing to combat the “blackout challenge”, which has led to the death of children in this country and around the world. They reassure us that they have systems in place to deal with that and that they are doing all that they can, but we do not know the truth. We do not know what they can see, and we have no legal power to readily get our hands on that information and publish it. That will change.

This is a systems Bill—the hon. Member for Pontypridd (Alex Davies-Jones) and I have had that conversation over the Dispatch Boxes—because we are principally regulating the algorithms and artificial intelligence that drive the recommendation tools on platforms. The right hon. Member for Barking spoke about that, as have other Members. When we describe pieces of content, they are exemplars of the problem, but the biggest problem is the systems effect. If people posted individually and organically, and that sat on a Facebook page or a YouTube channel that hardly anyone saw, the amount of harm done would be very small. The fact is, however, that those companies have created systems to promote content to people by data-profiling them to keep them on their site longer and to get them coming back more frequently. That has been done for a business reason—to make money. Most of the platforms are basically advertising platforms making money out of other people’s content.

That point touches on every issue that Members have raised so far today. The Bill squarely makes the companies fully legally liable for their business activity, what they have designed to make money for themselves and the detriment that that can cause other people. That amplification of content, giving people more of what they think they want, is seen as a net positive, and people think that it therefore must always be positive, but it can be extremely damaging and negative.

That is why the new measures that the Government are introducing on combating self-harm and suicide are so important. Like other Members, I think that the proposal from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) is important, and I hope that the Government’s amendment will address the issue fully. We are talking not just about the existing, very high bar in the law on assisting suicide, which almost means being present and part of the act. The act of consistently, systematically promoting content that exacerbates depression, anxiety and suicidal feelings among anyone, but particularly young people, must be an offence in law and the companies must be held to account for that.

When Ian Russell spoke about his daughter’s experience, I thought it was particularly moving when he said that police officers were not allowed to view the content on their own. They worked in shifts for short periods of time, yet that content was pushed at a vulnerable girl by a social media platform algorithm when she was on her own, probably late at night, with no one else to see it and no one to protect her. That was done in a systematic way, consistently, over a lengthy period of time. People should be held to account for that. It is outrageous—it is disgusting—that that was allowed to happen. Preventing that is one of the changes that the Bill will help us to deliver.

Mr David Davis

I listened with interest to the comments of the right hon. Member for Barking (Dame Margaret Hodge) about who should be held responsible. I am trying to think through how that would work in practice. Frankly, the adjudication mechanism, under Ofcom or whoever it might be, would probably take a rather different view in the case of a company: bluntly, it would go for “on the balance of probabilities”, whereas with an individual it might go for “beyond reasonable doubt”. I am struggling —really struggling—with the question of which would work best. Does my hon. Friend have a view?

Damian Collins

My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

Dame Margaret Hodge rose—

Damian Collins

I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

Dame Margaret Hodge

I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

Damian Collins

I thank the right hon. Lady for that information.

Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be de-encrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

Ian Paisley (North Antrim) (DUP)

I thank the hon. Member for jointly sponsoring my private Member’s Bill, the Digital Devices (Access for Next of Kin) Bill. Does he agree that the best way to make progress is to ensure open access for the next of kin to devices that a deceased person leaves behind?

Damian Collins

The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

Like other Members who served on the Joint Committee—I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and of the hon. Member for Ochil and South Perthshire (John Nicolson), who raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and to have no obligation to identify and stop that action, but the action itself is wrong as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

Dean Russell

May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

Damian Collins

I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people-trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority illegal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority illegal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes; their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that content which could not be posted organically on a Facebook page can nevertheless run there as an advertisement if money is put behind it.

Priti Patel

My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.

Damian Collins rose—

Mrs Natalie Elphicke (Dover) (Con)

Will my hon. Friend give way?

Damian Collins

Of course.

Mrs Elphicke

I am grateful to my hon. Friend for raising this and for his support in this important area, which affects our constituencies so much. I will speak later to the details, which go beyond the advertising payment to the usage, showing and sharing of this content. As he has mentioned schedule 7, does he agree that there is—as I have set out in my amendment—a strong case for making sure that it covers all those illegal immigration and modern slavery offences, given the incredible harm that is being caused and that we see on a day-to-day basis?

Damian Collins

I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

On whether advertising should be included as well as other postings, the Online Safety Bill may not necessarily be the vehicle through which that needs to be incorporated at this time. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence; the offence could then be read across into the Online Safety Bill, and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences, but whether it is done that way, through the online advertising review or through new legislation, this is a loophole that needs to be closed.