Tag: Damian Collins

  • Damian Collins – 2014 Parliamentary Question to the Department for Culture, Media and Sport

    The Parliamentary question below was asked by Damian Collins on 15 January 2015.

    To ask the Secretary of State for Culture, Media and Sport, what support Sport England has given to organisations in Folkestone and Hythe constituency since 2010.

    Mrs Helen Grant

    Since 2010 Sport England has invested £212,836 of National Lottery and Exchequer funding in 19 community sports projects in Folkestone and Hythe constituency.

  • Damian Collins – 2014 Parliamentary Question to the Ministry of Defence

    The Parliamentary question below was asked by Damian Collins on 15 January 2015.

    To ask the Secretary of State for Defence, when he expects his Department to respond to the recommendations of the 2014 report of the All-Party Parliamentary Group on Gurkha Welfare.

    Anna Soubry

    The Government is very grateful to the All-Party Parliamentary Group on Gurkha Welfare for their comprehensive report into historic Gurkha grievances. This report raised a number of issues, which the Government has been examining closely. Our response will be published shortly.

  • Damian Collins – 2014 Parliamentary Question to the Department of Energy and Climate Change

    The Parliamentary question below was asked by Damian Collins on 15 January 2015.

    To ask the Secretary of State for Energy and Climate Change, what assessment he has made of the future operating life of Dungeness B nuclear power station; and if he will make a statement.

    Matthew Hancock

    On 20 January 2015, EdF announced that the life of Dungeness B has been extended to 2028. This was a decision for EdF as the owner and operator to make in consultation with the regulator.

    There is no regulatory requirement for nuclear plant operators to gain permission from the Office for Nuclear Regulation (ONR) for a plant life extension. Plant operators must instead demonstrate that the plant will continue to run safely and in compliance with site licence conditions in the course of regular ONR assessments. However, in the case of Dungeness B, EdF and the ONR agreed it would be beneficial for ONR to review the life extension proposals in advance of the life extension announcement, which they have done.

    Separately, in order to ensure that the extension does not impact on the UK taxpayer, the Nuclear Decommissioning Authority (NDA) reviewed the impact of the decision on the UK’s Nuclear Liability Fund last year. The NDA approved the life extension to the plant. The assessment showed net savings rather than net costs.

  • Damian Collins – 2022 Comments on Penny Mordaunt Becoming Prime Minister

    The comments made by Damian Collins, the Conservative MP for Folkestone and Hythe, on Twitter on 20 October 2022.

    I was proud to support Penny Mordaunt’s campaign for the leadership of the Conservative Party this summer, and I hope she stands again now. She has the quality and experience to unite the party and rebuild trust in government.

  • Damian Collins – 2022 Statement Following G20 Digital Ministers’ Meeting

    The statement made by Damian Collins, the Minister for Tech and the Digital Economy, in Bali, Indonesia on 2 September 2022.

    The diverse membership and collective economic power of the G20 make it one of the most important international meetings where the challenges facing global digital economies are discussed. It is right that G20 Digital Ministers continue to work together to deliver solutions for the benefit of citizens around the world, based on democratic values and human rights.

    In my speech to the G20 digital ministers I condemned Russia’s unprovoked and brutal war in Ukraine, as well as their use of cyber attacks and aggressive state sponsored disinformation campaigns to cause further disruption around the world.

    I also thanked the Indonesian Presidency for ensuring G20 discussions advanced in some key areas. Progress was made on shared priorities including digital connectivity, skills and literacy, and data free flow with trust.

    It has also been a positive opportunity to develop the UK’s relationship with Indonesia. I am pleased that Minister Plate and I share the same enthusiasm for joint projects like the development of the Satria 2 satellites, which will improve connections for rural communities and help close Indonesia’s digital divide. Our discussions will be a firm foundation on which to build the UK-Indonesia relationship on digital and technology over the coming years.

    The UK will support further progress under future Presidencies, starting with India in 2023. The UK and India have a strong relationship and I was pleased to meet my counterpart, Minister Ashwini Vaishnaw, to discuss our mutual digital and tech interests. We agreed to launch the UK-India Strategic Tech Dialogue this year which will promote data, economic growth, and diversifying telecoms supply chains in our two countries.

  • Damian Collins – 2022 Statement on the Online Safety Bill

    The statement made by Damian Collins, the Parliamentary Under-Secretary of State at the Department for Digital, Culture, Media and Sport, in the House of Commons on 12 July 2022.

    Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

    Relative to the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

    We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

    Sir Jeremy Wright (Kenilworth and Southam) (Con)

    I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend the Member for Haltemprice and Howden (Mr Davis)? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.

    Damian Collins

    I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

    Mr Speaker

    And the other way, as well.

    Damian Collins

    Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

    We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.

    We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

    Dame Maria Miller (Basingstoke) (Con)

    My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend the Member for Croydon South (Chris Philp), who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.

    Damian Collins

    My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address.

    Joanna Cherry (Edinburgh South West) (SNP)

    Does the Minister agree that, in moderating content, category 1 service providers such as Twitter should be bound by the duties under our domestic law not to discriminate against anyone on the grounds of a protected characteristic? Will he take a look at the amendments I have brought forward today on that point, which I had the opportunity of discussing with his predecessor, who I think was sympathetic?

    Damian Collins

    The hon. and learned Lady makes a very important point. The legislation sets regulatory thresholds at the criminal law level based on existing offences in law. Many of the points she made are covered by existing public law offences, particularly with regard to discriminating against people based on their protected characteristics. As she well knows, the internet is a reserved matter, so the legal threshold is set at where UK law stands, but where law may differ in Scotland, the police authorities in Scotland can still take action against individuals in breach of the law.

    Joanna Cherry

    The difficulty is that Twitter claims it is not covered by the Equality Act 2010. I have seen legal correspondence to that effect. I am not talking about the criminal law here. I am talking about Twitter’s duty not to discriminate against women, for example, or those who hold gender critical beliefs in its moderation of content. That is the purpose of my amendment today—it would ensure that Twitter and other service providers providing a service in the United Kingdom abide by our domestic law. It is not really a reserved or devolved matter.

    Damian Collins

    The hon. and learned Lady is right. There are priority offences where the companies, regardless of their terms of service, have to meet their obligations. If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold them to account for that. They also have to be transparent in their terms of service as category 1 companies. If they have clear policies against discrimination, which they on the whole all do, they will have to set out what they would do, and the regulator can hold them to account to make sure they do what they say. The regulator cannot make them take down speech that is legal or below a criminal threshold, but they can hold them to account publicly for the decisions they make.

    One of the most important aspects of this Bill with regard to the category 1 companies is transparency. At the moment, the platforms make decisions about curating their content—who to take down, who to suppress, who to leave up—but those are their decisions. There is no external scrutiny of what they do or even whether they do what they say they will do. As a point of basic consumer protection law, if companies say in their terms of service that they will do something, they should be held to account for it. What is put on the label also needs to be in the tin and that is what the Bill will do for the internet.

    I now want to talk about journalism and the role of the news media in the online world, which is a very important part of this Bill. The Government are committed to defending the invaluable role of a free media. Online safety legislation must protect the vital role of the press in providing people with reliable and accurate sources of information. Companies must therefore put in place protections for journalistic content. User-to-user services will not have to apply their safety duties in part 3 of the Bill to news publishers’ content shared on their services. News publishers’ content on their own sites will also not be in scope of regulation.

    New clause 19 and associated amendments introduce a further requirement on category 1 services to notify a recognised news publisher and offer a right of appeal before removing or moderating its content or taking any action against its account. This new provision will reduce the risk of major online platforms taking over-zealous, arbitrary or accidental moderation decisions against news publisher content, which plays an invaluable role in UK democracy and society.

    We recognise that there are cases where platforms must be able to remove content without having to provide an appeal, and the new clause has been drafted to ensure that platforms will not be required to provide an appeal before removing content that would give rise to civil or criminal liability to the service itself, or where it amounts to a relevant offence as defined by the Bill. This means that platforms can take down without an appeal content that would count as illegal content under the Bill.

    Moreover, in response to some of the concerns raised, in particular by my right hon. and learned Friend the Member for Kenilworth and Southam as well as by other Members, about the danger of creating an inadvertent loophole for bad actors, we have committed to further tightening the definition of “recognised news provider” in the House of Lords to ensure that sanctioned entities, such as RT, cannot benefit from these protections.

    As the legislation comes into force, the Government are committed to ensuring that protections for journalism and news publisher content effectively safeguard users’ access to such content. We have therefore tabled amendments 167 and 168 to require category 1 companies to assess the impact of their safety duties on how news publisher and journalistic content are treated when hosted on the service. They must then demonstrate the steps they are taking to mitigate any impact.

    In addition, a series of amendments, including new clause 20, will require Ofcom to produce a report assessing the impact of the Online Safety Bill on the availability and treatment of news publisher content and journalistic content on category 1 services. This will include consideration of the impact of new clause 19, and Ofcom must do this within two years of the relevant provisions being commenced.

    The Bill already excludes comments sections on news publishers’ sites from the Bill’s safety duties. These comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability of the news media. We have tabled a series of amendments to strengthen these protections, reflecting the Government’s commitment to media freedom. The amendments will create a higher bar for removing the protections in place for comments sections on recognised news publishers’ sites by ensuring that these can only be brought into the scope of regulation via primary legislation.

    Government amendments 70 and 71 clarify the policy intention of the clause 13 adult safety duties to improve transparency about how providers treat harmful content, rather than incentivise its removal. The changes respond to concerns raised by stakeholders that the drafting did not make it sufficiently clear that providers could choose simply to allow any form of legal content, rather than promote, restrict or remove it, regardless of the harm to users.

    This is a really important point that has sometimes been missed in the discussion on the Bill. There are very clear duties relating to illegal harm that companies must proactively identify and mitigate. The transparency requirements for other harmful content are very clear that companies must set out what their policies are. Enforcement action can be taken by the regulator for breach of their policies, but the primary objective is that companies make clear what their policies are. It is not a requirement for companies to remove legal speech if their policies do not allow that.

    Dame Margaret Hodge (Barking) (Lab)

    I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

    Damian Collins

    The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

    Dame Margaret Hodge

    I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

    Damian Collins

    The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

    In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

    The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

    Chris Philp (Croydon South) (Con)

    I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?

    Damian Collins

    I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.

    Debbie Abrahams (Oldham East and Saddleworth) (Lab)

    I congratulate the Minister on his promotion and on his excellent chairmanship of the prelegislative scrutiny Committee, which I also served on. Is he satisfied with the Bill in relation to disinformation? It was concerning that there was only one clause on disinformation, and we know the impact—particularly the democratic impact—that that has on our society at large. Is he satisfied that the Bill will address that?

    Damian Collins

    It was a pleasure to serve alongside the hon. Lady on the Joint Committee. There are clear new offences relating to knowingly false information that will cause harm. As she will know, that was a Law Commission recommendation; it was not in the draft Bill but it is now in the Bill. The Government have also said that as a consequence of the new National Security Bill, which is going through Parliament, we will bring in a new priority offence relating to disinformation spread by hostile foreign states. As she knows, one of the most common areas for organised disinformation has been at state level. As a consequence of the new national security legislation, that will also be reflected in schedule 7 of this Bill, and that is a welcome change.

    The Bill requires all services to take robust action to tackle the spread of illegal content and activity. Providers must proactively reduce the risk on their services of illegal activity and the sharing of illegal content, and they must identify and remove illegal content once it appears on their services. That is a proactive responsibility. We have tabled several interrelated amendments to reinforce the principle that companies must take a safety-by-design approach to managing the risk of illegal content and activity on their services. These amendments require platforms to assess the risk of their services being used to commit, or to facilitate the commission of, a priority offence and then to design and operate their services to mitigate that risk. This will ensure that companies put in place preventive measures to mitigate a broad spectrum of factors that enable illegal activity, rather than focusing solely on the removal of illegal content once it appears.

    Henry Smith (Crawley) (Con)

    I congratulate my hon. Friend on his appointment to his position. On harmful content, there are all too many appalling examples of animal abuse on the internet. What are the Government’s thoughts on how we can mitigate such harmful content, which is facilitating wildlife crime? Might similar online protections be provided for animals to the ones that clause 53 sets out for children?

    Damian Collins

    My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any illegal activity that could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

    In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

    We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

    Kirsty Blackman (Aberdeen North) (SNP)

    I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

    Damian Collins

    As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look like and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

    Dame Caroline Dinenage (Gosport) (Con)

    I congratulate my hon. Friend on taking his new position; we rarely have a new Minister so capable of hitting the ground running. He makes a crucial point about clarity and transparency for both users and the social media providers and other platforms, because it is important that we make sure they are 100% clear about what is expected of them and the penalties for not fulfilling their commitments. Does he agree that opaqueness—a veil of secrecy—has been one of the obstacles, and that a whole raft of content has been taken down for the wrong reasons while other content has been left to proliferate because of the lack of clarity?

    Damian Collins

    That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.

  • Damian Collins – 2022 Speech on Channel 4 Privatisation

    The speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 14 June 2022.

    Before I start, I would like to do as the shadow Secretary of State did and declare my entry in the Register of Members’ Financial Interests. I, too, was a guest of Channel 4 at the BAFTA ceremony. I would also declare, as other Members from across the House have done, that I am a fan of “Derry Girls”, as, I am sure, as part of his cross-community work, is the hon. Member for North Antrim (Ian Paisley). This is a channel that makes great programmes that are part of our national psyche and it is an important part of our broadcasting landscape.

    However, I say to Opposition Members and some on our side that I have an honest disagreement with Channel 4 and with people who are opposing privatisation; the company, although well run, is running into such strong industry headwinds that this cannot be taken off the table and it has to be considered seriously. As Channel 4 said in its own “The Next Episode” response to the Government’s White Paper, all options have to be considered. That has to include the option of privatisation.

    The challenges to the sector are very real. A lot has been made of the fact that the last financial year was a successful one for Channel 4 and for the UK advertising industry. There was a major spike in advertising revenues. That is partly to do with a major surge in advertising spend coming out of the pandemic, which saw a big increase in revenues for all broadcasters. The pandemic also meant the delay to the European championships and the Olympics, and such major international tournaments traditionally have a considerable inflationary impact on the advertising market. So we have to look at this in a wider context: the increases in ad revenues seen in 2021 may not be repeated; and the diversion away from linear television advertising—traditional spot advertising—to digital media is a continuing trend. Channel 4 may be the leading UK broadcaster in that respect, but currently only 16% of its revenues come from digital advertising. Although it wants to move that target to 30% by 2025, that may still be a significant challenge.

    If there were a major challenge to the TV industry and the advertising industry, or if there were a recession—TV advertising is traditionally one of the earliest and worst-hit sectors—Channel 4 would be much more vulnerable to the economic shocks that would come, because it does not have other revenue sources. These trends are familiar across the PSBs, which have seen long-term declines in revenue where they are commercial, and in audience numbers, including at peak time. However, the BBC can make money from making programmes. ITV can make money from making programmes, for itself and for other people. Channel 4 does not have that option.

    Let us look at the period before the pandemic. In trying to observe a trend, that is probably the fairest thing to do, because we do not yet quite know what impact the pandemic has had, in terms of lockdown in 2020 and recovery in 2021. What does the picture look like? I think everyone here would agree that when Channel 4 was set up its purpose was to invest its money in UK original productions made by independent production companies. It was set up at a time when the BBC and the ITV companies largely made most of their stuff in house, so it was a necessary vehicle to get financial investment into the independent production sector. This was a sector where Sky, Amazon and Netflix did not exist, and it was far more reliant on that funding.

    If we look at what has happened to Channel 4, and this is true for other PSBs as well, we see that in 2006 it spent £516 million on first-run original content. In 2019, the year before the pandemic, the figure was £436 million, so we have seen a 15% decline. That declining spend also bought a lot less, because inflation in the TV production market is making it more and more expensive to make programmes. So in 2006 Channel 4 broadcast 3,388 hours of first-run original content, whereas in 2019 it broadcast 2,473 hours, which represents a decline of 27%. This shift away from traditional broadcasters towards digital markets, with the pressure that puts on their budgets and the declining amount of money they can afford to spend on new programming, has been under way for a number of years now. The concern we must have is that if there were a shock in the digital ad market and Channel 4 could not hit its targets of growing digital revenues as broadcast revenues decline, it would be much more vulnerable. It does not have the reserves and it does not have the ability to make money elsewhere. That is why even Channel 4 is proposing significant changes to its remit.

    Kevin Brennan

    The hon. Gentleman says that Channel 4 is proposing this, but that proposal was a direct response to a request from the Secretary of State to propose alternative sources of revenue. It was not initiated by Channel 4 because of its concerns about its finances.

    Damian Collins

    As I pointed out earlier in the debate, in that document Channel 4 itself says that it requires a radical reset of its role. If it is to take the opportunity of the changing digital landscape in the future, it needs to be in a position to invest more money. That extra investment will not come from advertising revenues. Channel 4 has been the most successful traditional UK broadcaster in switching to digital, but even there the best one can say about the last few years is that the increase in digital revenues has just about kept pace with the decline in broadcasting revenues. Digital is not raising more money incrementally for Channel 4 to invest in programming at a time when new entrants to the market are increasing their spend significantly—by hundreds of millions of pounds. The danger is that Channel 4, with its unique voice, will be less able to compete, less able to commission, and will run less new programming than it could in the past and that other broadcasters will do. That has to be addressed.

    Channel 4 has said that its role needs to be radically reset. It is calling for its digital streaming service, All 4, to go global—to reach a global audience—to increase ad revenues. That is a sensible idea, but the independent production companies that make programmes for Channel 4 would have to consent to giving up the ability to sell that programming internationally themselves, as they can in other territories. It calls for the creation of a joint venture in which Channel 4 holds a minority stake that would raise £1 billion to invest in new programming over the next five years. That would be a sensible measure to bring in a significant extra boost in revenue, although it would only bring Channel 4 back to where it was in 2006. As part of that joint venture, Channel 4 would hold the intellectual property rights for programming and make money from selling those programmes. Channel 4 believes that may be within its current remit, although it would significantly change the spirit of the remit. The independent production companies might have concerns about that extension, but it is probably necessary.

    The idea that the status quo can continue is wrong. It would be wrong of us to assume that it can continue and to say that we will deal with this problem, if it comes, in the future, and in the meantime see Channel 4 gradually wither on the vine, with declining revenues, declining investment in programming, unable to compete, until the point where it cannot go on and requires a bail-out from the Government or the other PSBs. That is the risk we are taking.

    The Government’s “Up Next” White Paper is not an ideological tract; it is a sensible and serious look at real issues in the TV sector. We may have different views on what the right format would be; Channel 4 has put forward its ideas and other bidders will do the same. I think there will be more bidders than just the traditional players, and we should look at those options, but they will all be options for change, suggesting a way that Channel 4 can raise more money to invest in what we want it to do—making great programmes.

  • Damian Collins – 2021 Speech on the Department for Digital, Culture, Media and Sport

    The speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 10 March 2021.

    The Government have provided substantial support for the cultural, sporting and creative sectors since the start of the covid pandemic. This has been welcome but also essential, as many organisations within these sectors rely on revenue from tickets and events to survive. Through no fault of their own, they have been required to close, and the cultural recovery fund, in addition to the funding to support sports and TV and film production, has helped many important bodies to keep going that otherwise might have closed for good.

    However, we now need to focus on the road ahead, through to the lifting of the covid social contact restrictions on 21 June and beyond. The coronavirus has challenged the whole of our society, but it has also exposed further weaknesses in sectors that in some cases we already knew about. The point has been well made about the need for pandemic insurance for the events industry. Events and live performances have become increasingly important to the music sector, because the remuneration that artists get from on-demand streaming services is relatively low, but these events will not take place unless an insurance scheme can be put in place.

    This is not just about events that could be held this summer; it needs to be done on an ongoing basis. It could be some time before the industry has any certainty, because new variants of covid might require further restrictions on audience capacity and therefore threaten the viability of the events themselves. Just as, several years ago, the Government partnered with the insurance industry to create Flood Re to pool flood risk and reduce the cost of flood insurance, we need a similar scheme to make insuring live events viable and reduce the cost to people putting on those events.

    In football, the lack of a strong national governing body for the sport that is able to ensure fair dealing in financial matters has been badly exposed. Many football clubs were in great distress before the pandemic struck. Clubs in the championship division of the English football league were routinely spending more than they earned each year on players’ salaries alone, and were running a financially unsustainable model. There has been no real recognition of the impact of the covid restrictions on professional football. The money within the game has not been enough to solve all the problems, and the support that has been given is minimal. Many clubs continue to rack up large debts. At the moment, a lot of the football league is being run on unpaid taxes. It is believed that the amount of unpaid taxes owed to HMRC by football clubs could be in the hundreds of millions of pounds. We need a proper financial regulator for football to ensure that clubs are run on a sustainable basis for the long term, but in the short term we may need to look at how some sort of financial assistance can be given to those most in distress. Clubs outside the premier league are largely community assets, and they need to be run in a sustainable way.

    I want to make two other points briefly. The last 12 months have exposed just how influential disinformation and hate speech on social media can be, particularly in relation to anti-vaccine campaigns that seek to undermine confidence in the vaccine and spread conspiracy theories about the pandemic. That makes the bringing forward of the online harms Bill this year so important for the Department, and we must also ensure that Ofcom, as the regulator, has proper resources, so that there can be proper auditing and inspection of the way social media companies respond to campaigns of disinformation and hate speech, and other speech that can cause harm through social media networks. We have been talking about this for many years and I am glad that the Bill is coming, but it is now imperative.

    Finally, the pandemic has also had a big impact on the advertising industry and broadcasting revenues from advertising, just as other media have struggled with revenue from advertising. There is no guarantee that this money will bounce back, particularly as audiences are increasingly diverting their attention to online services—social media to receive news and on-demand platforms to view content. Increasingly, many people spend time not watching broadcast material at all, but playing games and doing other things online. This potentially undermines the public service broadcasting model in this country. I welcome the fact that we have the PSB review, but we need to understand that the long-term impacts of rising production costs for television due to the impact of Netflix and Amazon Prime and of declining advertising revenues because of switching audience attention are fundamentally changing the market, and if we have media that—

    Madam Deputy Speaker (Dame Rosie Winterton)

    Order. I am afraid we do have to move on.

  • Damian Collins – 2020 Speech on Online Harms

    The speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 19 November 2020.

    I congratulate my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) on his excellent speech introducing this debate. We need to be clear that the online harms White Paper response from the Government is urgently needed, as is the draft Bill. We have been discussing this for several years now. When I was Chair of the Digital, Culture, Media and Sport Committee, we published a report in the summer of 2018 asking for intervention on online harms and calling for a regulatory system based on a duty of care placed on the social media companies to act against harmful content.

    There are difficult decisions to be made in assessing what harmful content is and assessing what needs to be done, but I do not believe those decisions should be made solely by the chief executives of the social media companies. There should be a legal framework that they have to work within, just as people in so many other industries do. It is not enough to have an online harms regulatory system based just on the terms and conditions of the companies themselves, in which all Parliament and the regulator can do is observe whether those companies are administering their own policies.

    We must have a regulatory body that has an auditing function and can look at what is going on inside these companies and the decisions they make to try to remove and eliminate harmful hate speech, medical conspiracy theories and other more extreme forms of harmful or violent content. Companies such as Facebook say that they remove 95% of harmful content. How do we know? Because Facebook tells us. Has anyone checked? No. Can anyone check? No; we are not allowed to check. Those companies have constantly refused to allow independent academic bodies to go in and scrutinise what goes on within them. That is simply not good enough.

    We should be clear that we are not talking about regulating speech. We are talking about regulating a business model. It is a business model that prioritises the amplification of content that engages people, and it does not care whether or not that content is harmful. All it cares about is the engagement. So people who engage in medical conspiracy theories will see more medical conspiracy theories. A young person who engages with images of self-harm will see more images of self-harm. No one is stepping in to prevent that. How do we know that Facebook did all it could to stop the live broadcast of a terrorist attack in Christchurch, New Zealand? No one knows. We have only Facebook’s word for it, and the scale of that problem could have been a lot worse.

    The tools and systems of these companies are actively directing people to harmful content. People often talk about how easy it is to search for this material. Companies such as Facebook will say, “We downgrade this material on our site to make it hard to find,” but they direct people to it. People are not searching for it—it is being pushed at them. Some 70% of what people watch on YouTube is selected for them by YouTube, not searched for by them. An internal study done by Facebook in Germany in 2016, which the company suppressed and which was leaked to the media this year, showed that 60% of people who joined Facebook groups that shared extremist material did so at the recommendation of Facebook, because they had engaged with material like that before. That is what we are trying to regulate—a business model that is broken—and we desperately need to move on with online harms.

  • Damian Collins – 2020 Speech on the Trade Bill

    Below is the text of the speech made by Damian Collins, the Conservative MP for Folkestone and Hythe, in the House of Commons on 20 May 2020.

    I wish to speak in support of the Bill, but also to address the importance of scrutiny by Parliament of digital trade provisions in proposed future UK trading agreements. This is a vital and fast-moving sector that is very important to the British economy. Technology touches almost all aspects of our national life, as indeed these proceedings themselves make clear.

    One of the most important new trade agreements being negotiated right now is the one with the United States, but we need to make sure that the digital trade provisions of a deal do not impact on other areas of domestic law, in particular our ability to legislate to create new responsibilities for large social media companies to act against harmful content online. The example of the recently negotiated trade deal between the USA, Canada and Mexico, which I understand is the basis for the start of the American approach to negotiations with the UK, shows how the danger can lie in the detail of these agreements.

    The agreement states that the signatories shall not

    “adopt or maintain measures that treat a supplier or user of an interactive computer service as an information content provider in determining liability for harms related to information stored, processed, transmitted, distributed, or made available by the service, except to the extent the supplier or user has, in whole or in part, created, or developed the information.”

    What that means, in short, is that while a social media platform can be used to disseminate harmful content, and indeed the algorithms of that platform could be used to promote it, the liability lies solely with the person who created that content, and it could be impossible to identify that person, except perhaps through data held by the social media platform they have used. In this context, the harmful content being shared on social media could include a wide range of dangerous material, from content that promotes fraud, violent conduct, self-harm or cyber-bullying to unlawful interference in elections. This provision was included in the US-Canada-Mexico trade agreement, despite opposition from prominent members of the United States Congress, including the Speaker, Nancy Pelosi, and Senators Mark Warner and Ted Cruz.

    The provision is based on the provisions in US law known as section 230 of the US Communications Decency Act. Section 230 provides broad unconditional immunity to internet platforms from civil liability for unlawful third-party content they distribute. This sweeping immunity gives internet-based entities an unnecessary and unfair commercial advantage over various law-abiding bricks-and-mortar businesses and content creators. Section 230 immunity is unconditional. The platform can even be designed to attract illegal or harmful content, to know about that illegal or harmful content, have a role in generating and editing it, actively increase its reach and refuse to do anything about it, profit from it and help hide the identity of third-party lawbreakers, and still not be civilly liable.

    The grant of immunity for online services under section 230 was supposed to be in exchange for the act of voluntary filtering in a proactive and effective way, yet we all know that there are constant complaints about the failure of major tech companies to act as swiftly as we would like against content that could cause harm to others. If such a provision were required in the UK-US trade agreement, it would severely limit our ability to tackle online harms, as we would be prevented from creating legal liabilities or from tackling companies that fail in their duty of care to act against harmful content.

    This prompts the question whether international trade agreements should be used to fix such important matters of domestic policy. There is growing cross-party consensus on that point in the US Congress as well. In the UK, these should always be matters on which Parliament has the last word. Indeed, in America, those who have advocated the inclusion of section 230 provisions in trade agreements do so knowing that it will make those provisions harder to remove from US law itself. The Secretary of State for International Trade has assured me that the Government will not accept trade agreements that would limit the scope of Parliament to legislate to create responsibilities to act against harmful content online. I agree with her that that should be our priority, but we need to understand that it will require a different approach to the negotiations on digital trade from that which Canada followed with America. We should not include provisions based on section 230 in a UK-US trade agreement.

    Having trade agreements for digital services, data and technology with other major markets around the world is greatly in our national interest, but we need to make sure that they give us the freedom to act against known harms and the freedom to enforce standards designed to protect the public interest, just as we would seek to do in any other industry.