David Davis – 2022 Speech on the Online Safety Bill

The speech made by David Davis, the Conservative MP for Haltemprice and Howden, in the House of Commons on 5 December 2022.

I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.

To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.

My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.

Opposition Members will not agree, but I am grateful that the Government decided to remove the "legal but harmful" clause. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but on the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not "legal but harmful" but, "Is it harmful enough to be made illegal?" Obviously, self-harm material is harmful enough to be made illegal.

The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.

The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.

One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.

Damian Collins

Rather than it being magic, does my right hon. Friend agree that a company could not ignore it if we demystified the process? By saying there is an existing technology that is available and proven to work, the company would have to explain why it is not using that technology or something better.

Mr Davis

I will come back to that in some detail.

The first time I used encryption, it was one-time pads and Morse, so it was a long time ago. The last time was much more recent. The issue here is that clause 104 creates pressure by requiring real-time decryption. The only way to do that is by having the content unencrypted on the server, having it weakly encrypted or creating a back door. I am talking not about metadata, which I will come back to in a second, but about content. In that context, if the content needs to be rapidly accessible, it is bound to lead to weakened encryption.

This is perhaps a debate for a specialist forum, but it is very dangerous in a whole series of areas. What do we use encryption for? We use it for banking, for legal and privileged conversations, and for conversations with our constituents and families. I could go on and on about the areas in which encryption matters.

Adam Afriyie (Windsor) (Con)

My right hon. Friend will be aware that the measure will encompass every single telephone conversation when it switches to IP. That is data, too.

Mr Davis

That is correct. The companies cannot easily focus the measure on malicious content alone, and that is the problem. With everything we do in dealing with enforcing the law, we have to balance the extent to which we make the job of the law enforcement agency possible—ideally, easy—against the rights we take away from innocent citizens. That is the key balance. Many bad things happen in households but we do not require people to live in houses with glass walls. That shows the intrinsic problem we have.

That imposition on privacy cannot sit comfortably with anybody who takes privacy rights seriously. As an aside, let me say to the House that the last thing we need, given that we want something to happen quickly, or at least effectively and soon, is to find ourselves in a Supreme Court case or a European Court case on privacy imposition. I do not think that is necessary. That is where I think the argument stands. If we end up in a case like that, it will not be about paedophiles or criminals; it will be about the weakening of the encryption of the data of an investigative journalist or a whistleblower. That is where it will come back to haunt us and we have to put that test on it. That is my main opening gambit.

I am conscious that everybody has spoken for quite a long time, so I am trying to make this short. However, the other thing I wish to say is that we have weapons, particularly in terms of metadata. If I recall correctly, Facebook takes down some 300,000 sites for paedophile content alone and millions for other reasons, so the use of metadata is very important. Europol carried out a survey of what was useful in terms of the data arising from the internet, social media and the like, and content was put at No. 7, after all sorts of other data. I will not labour the point, but I just worry about this. We need to get it right, and so far we have taken more of a blunderbuss approach than a rifle shot. We need to correct that, which is what my two amendments are about.

The other thing I briefly wish to talk about is new clause 16, which a number of people have mentioned in favourable terms. It will make it an offence to encourage or assist another person to self-harm—that includes suicide. I know that the Government have difficulties getting their proposed provisions right in how they interact with other legislation—the suicide legislation and so on. I will be pressing the new clause to a vote. I urge the Government to take this new clause and to amend the Bill again in the Lords if it is not quite perfect. I want to be sure that this provision goes into the legislation. It comes back to the philosophical distinction involving "legal but harmful", a decision put first in the hands of a Minister and then in the hands of an entirely Whip-chosen statutory instrument Committee, neither of which is a trustworthy vehicle for the protection of free speech. My approach will take it from there and put it in the hands of this Chamber and the other place. Our control, in as much as we control the internet, should be through primary legislation, with maximum scrutiny, exposure and democratic content. If we do it in that way, nobody can argue with us and we will be world leaders, because we are pretty much the only people who can do that.

As I say, we should come back to this area time and time again, because this Bill will not be the last shot at it. People have talked about the “grey area”. How do we assess a grey area? Do I trust Whitehall to do it? No, I do not; good Minister though we have, he will not always be there and another Minister will be in place. We may have the British equivalent of Trump one day, who knows, and we do not want to leave this provision in that context. We want this House, and the public scrutiny that this Chamber gets, to be in control of it.

Sir William Cash (Stone) (Con)

Many years ago, in the 1970s, I was much involved in the Protection of Children Bill, which was one of the first steps in condemning and making illegal explicit imagery of children and their involvement in the making of such films. We then had the broadcasting Acts and the video Acts, and I was very much involved at that time in saying that we ought to prohibit such things in videos and so on. I got an enormous amount of flak for that. We have now moved right the way forward, and it is tremendous to see not only the Government but the Opposition co-operating on this theme. I very much sympathise with not only what my right hon. Friend has just said—I am very inclined to support his new clause for that reason—but with what the right hon. Member for Barking (Dame Margaret Hodge) said. I was deeply impressed by the way in which she presented the argument about the personal liability of directors. We cannot distinguish between a company and the people who run it, and I am interested to hear what the Government have to say in reply to that.

Mr Davis

I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.

I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place the responsibility, and how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.

There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic and it is algorithmic with an explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators— we have to get this right.

Damian Collins

My right hon. Friend is right to raise this important point. The big area here is not only clickbait, but AI-generated recommendation tools, such as a news feed on Facebook or "next up" on YouTube. Mitigating the illegal content on the platforms is not just about content moderation and removal; it is about not promoting it.

Mr Davis

My hon. Friend is exactly right about that. I used the example of clickbait as shorthand. The simple truth is that "AI-generated" is also a misnomer, because these things are not normally AI; they are normally algorithms written specifically to recommend and to maximise returns and revenue. We are not surprised at that. Why should we be? After all, these are commercial companies we are talking about, and that is what they are going to do. Every commercial company in the world operates within a regulatory framework that prevents it from making profits out of antisocial behaviour.

Aaron Bell (Newcastle-under-Lyme) (Con)

On the AI point, let me say that the advances we have seen over the weekend are remarkable. I have just asked OpenAI.com to write a speech in favour of the Bill and it is not bad. That goes to show that the risks to people are not just going to come from algorithms; people are going to be increasingly scammed by AI. We need a Bill that can adapt with the times as we move forward.

Mr Davis

Perhaps we should run my speech against—[Laughter.] I am teasing. I am coming to the end of my comments, Madam Deputy Speaker. The simple truth is that these mechanisms—call them what you like—are controllable if we put our mind to it. It requires subtlety, testing the thing out in practice and enormous expert input, but we can get this right.