
Public Bill Committees

Debated on Thursday 16 June 2022

Online Safety Bill (Eleventh sitting)

The Committee consisted of the following Members:

Chairs: † Sir Roger Gale, Christina Rees

Ansell, Caroline (Eastbourne) (Con)

† Bailey, Shaun (West Bromwich West) (Con)

† Blackman, Kirsty (Aberdeen North) (SNP)

Carden, Dan (Liverpool, Walton) (Lab)

† Davies-Jones, Alex (Pontypridd) (Lab)

† Double, Steve (St Austell and Newquay) (Con)

† Fletcher, Nick (Don Valley) (Con)

Holden, Mr Richard (North West Durham) (Con)

† Keeley, Barbara (Worsley and Eccles South) (Lab)

Leadbeater, Kim (Batley and Spen) (Lab)

† Miller, Dame Maria (Basingstoke) (Con)

Mishra, Navendu (Stockport) (Lab)

† Moore, Damien (Southport) (Con)

Nicolson, John (Ochil and South Perthshire) (SNP)

† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

† Russell, Dean (Watford) (Con)

† Stevenson, Jane (Wolverhampton North East) (Con)

Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks

† attended the Committee

Public Bill Committee

Thursday 16 June 2022


[Sir Roger Gale in the Chair]

Online Safety Bill

Good morning, ladies and gentlemen. I am entirely properly reminded that today is the anniversary of Jo Cox’s death, which is why Kim Leadbeater, the hon. Member for Batley and Spen, is not with us this morning. I am sure that our thoughts and good wishes are with her and the family.

Hear, hear.

Clause 69

OFCOM’s guidance about duties set out in section 68

We start with amendment 127 to clause 69. It is up to the Committee, but I am minded to allow this debate to go slightly broader and take the stand part debate with it.

I beg to move amendment 127, in clause 69, page 60, line 26, after “must” insert—

“within six months of this Act being passed”.

As ever, it is a pleasure to serve under your chairship, Sir Roger. The thoughts and prayers of us all are with my hon. Friend the Member for Batley and Spen and all her friends and family.

Labour welcomes the clause, which sets out Ofcom’s duties to provide guidance to providers of internet services. It is apparent, however, that we cannot afford to kick the can down the road and delay implementation of the Bill any further than necessary. With that in mind, I urge the Minister to support the amendment, which would give Ofcom an appropriate amount of time to produce this important guidance.

It is a pleasure, once again, to serve under your august chairmanship, Sir Roger. I associate the Government with the remarks that you and the shadow Minister made, marking the anniversary of Jo Cox’s appalling murder, which shook the entire House when it happened. She will never be forgotten.

The Government are sympathetic to the intent of the amendment, which seeks to ensure that guidance for providers on protecting children from online pornography is put in place as quickly as possible. We of course sympathise with that objective, but we feel that the Secretary of State must retain the power to determine when to bring in the provisions of part 5, including the requirement under the clause for Ofcom to produce guidance, to ensure that implementation of the framework comprehensively and effectively regulates all forms of pornography online. That is the intention of the whole House and of this Committee.

Ofcom needs appropriate time and flexibility to get the guidance exactly right. We do not want to rush it and consequently see loopholes, which pornography providers or others might seek to exploit. As discussed, we will be taking a phased approach to bringing duties under the Bill into effect. We expect prioritisation for the most serious harms as quickly as possible, and we expect the duties on illegal content to be focused on most urgently. We have already accelerated the timescales for the most serious harms by putting priority illegal content in the various schedules to the Bill.

Ofcom is working hard to prepare for implementation. We are all looking forward to the implementation road map, which it has committed to produce before the summer. For those reasons, I respectfully resist the amendment.

Question put, That the amendment be made.

Clause 69 ordered to stand part of the Bill.

Clause 70

Duty to notify OFCOM

Question proposed, That the clause stand part of the Bill.

It is a pleasure to serve with you in the Chair again, Sir Roger. I add my tribute to our former colleague, Jo Cox, on this sad anniversary. Our thoughts are with her family today, including our colleague and my hon. Friend, the Member for Batley and Spen.

We welcome the “polluter pays” principle on which this and the following clauses are founded. Clause 70 establishes a duty for providers to notify Ofcom if their revenue is at or above the specified threshold designated by Ofcom and approved by the Secretary of State. It also creates duties on providers to provide timely notice and evidence of meeting the threshold. The Opposition do not oppose those duties. However, I would be grateful if the Minister could clarify what might lead to a provider or groups of providers being exempt from paying the fee. Subsection (6) establishes that

“OFCOM may provide that particular descriptions of providers of regulated services are exempt”,

subject to the Secretary of State’s approval. Our question is what kinds of services the Minister has in mind for that exemption.

Turning to clauses 71 to 76, as I mentioned, it is appropriate that the cost to Ofcom of exercising its online safety functions is paid through an annual industry fee, charged to the biggest companies with the highest revenues, and that smaller companies are exempt but still regulated. It is also welcome that under clause 71, Ofcom can make reference to factors beyond the provider’s qualifying worldwide revenue when determining the fee that a company must pay. Acknowledging the importance of other factors when computing that fee can allow for a greater burden of the fees to fall on companies whose activities may disproportionately increase Ofcom’s work on improving safety.

My hon. Friend the Member for Pontypridd has already raised our concerns about the level of funding needed for Ofcom to carry out its duties under the Bill. She asked about the creation of a new role: that of an adviser on funding for the online safety regulator. The impact assessment states that the industry fee will need to average around £35 million a year for the next 10 years to pay for operating expenditure. Last week, the Minister referred to a figure of around £88 million that has been announced to cover the first two years of the regime while the industry levy is implemented, and the same figure was used on Second Reading by the Secretary of State. Last October’s autumn Budget and spending review refers on page 115 to

“over £110 million over the SR21 period for the government’s new online safety regime through the passage and implementation of the Online Safety Bill, delivering on the government’s commitment to make the UK the safest place to be online.”

There is no reference to the £88 million figure or to Ofcom in the spending review document. Could the Minister tell us a bit more about that £88 million and the rest of the £110 million announced in the spending review, as it is relevant to how Ofcom is going to be resourced and the industry levy that is introduced by these clauses?

The Opposition feel it is critical that when the Bill comes into force, there is no gap in funding that would prevent Ofcom from carrying out its duties. The most obvious problem is that the level of funding set out in the spending review was determined when the Bill was in draft form, before more harms were brought into scope. The Department for Digital, Culture, Media and Sport has also confirmed that the figure of £34.9 million a year that is needed for Ofcom to carry out its online safety duties was based on the draft Bill.

We welcome many of the additional duties included in the Bill since its drafting, such as on fraudulent advertising, but does the Minister think the same level of funding will be adequate as when the calculation was made, when the Bill was in draft form? Will he reconsider the calculations his Department has made of the level of funding that Ofcom will need for this regime to be effective in the light of the increased workload that this latest version of the Bill introduces?

In March 2021, Ofcom put out a press release stating that 150 people would be employed in the new digital and technology hub in Manchester, but that that number would be reached in 2025. Therefore, as well as the level of resource being based on an old version of the Bill, the timeframe reveals a gap of three years until all the staff are in place. Does the Minister believe that Ofcom will have everything that is needed from the start, and in subsequent years as the levy gets up and going, in order to carry out its duties?

Of course, this will depend on how long the levy might need to be in place. My understanding of the timeframe is that first, the Secretary of State must issue guidance to Ofcom about the principles to be included in the statement of principles that Ofcom will use to determine the fees payable under clause 71. Ofcom must consult with those affected by the threshold amount to inform the final figure it recommends to the Secretary of State, and must produce a statement about what amounts comprise the provider’s qualifying worldwide revenue and the qualifying period. That figure and Ofcom’s guidance must be agreed by the Secretary of State and laid before Parliament. Based on those checks and processes, how quickly does the Minister envisage the levy coming into force?

The Minister said last week that Ofcom is resourced for this work until 2023-24. Will the levy be in place by then to fund Ofcom’s safety work into 2024-25? If not, can the Minister confirm that the Government will cover any gaps in funding? I am sure he will agree, as we all do, that the duties in the Bill must be implemented as quickly as possible, but the necessary funding must also be in place so that Ofcom as a regulator can enforce the safety duty.

I have just a short comment on these clauses. I very much applaud the Government’s approach to the funding of Ofcom through this mechanism. Clause 75 sets out clearly that the fees payable to Ofcom under section 71 should only be

“sufficient to meet, but…not exceed the annual cost to OFCOM”.

That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.

It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.

Let me start by associating myself with the remarks by the hon. Member for Worsley and Eccles South. We are in complete concurrence with the concept that the polluter should pay. Where there are regulatory costs caused by the behaviour of the social media firms that necessitates the Bill, it is absolutely right that those costs should fall on them and not on the general taxpayer. I absolutely agree with the principles that she outlined.

The hon. Lady raised a question about clause 70(6) and the potential exemption from the obligation to pay fees. That is a broadly drawn power, and the phrasing used is where

“OFCOM consider that an exemption…is appropriate”

and where the Secretary of State agrees. The Bill is not being prescriptive; it is intentionally providing flexibility in case there are circumstances where levying the fees might be inappropriate or, indeed, unjust. It is possible to conceive of an organisation that somehow exceeds the size threshold, but so manifestly does not need regulation that it would be unfair or unjust to levy the fees. For example, if a charity were, by some accident of chance, to fall into scope, it might qualify. But we expect social media firms to pay these bills, and I would not by any means expect the exemption to be applied routinely or regularly.

On the £88 million and the £110 million that have been referenced, the latter amount is to cover the three-year spending review period, which is the current financial year—2022-23—2023-24 and 2024-25. Of that £110 million, £88 million is allocated to Ofcom in the first two financial years; the remainder is allocated to DCMS for its work over the three-year period of the spending review. The £88 million for Ofcom runs out at the end of 2023-24.

The hon. Lady then asked whether the statutory fees in these clauses will kick in when the £88 million runs out—whether they will be available in time. The answer is yes. We expect and intend that the fees we are debating will become effective in 2024-25, so they will pick up where the £88 million finishes.

Ofcom will set the fees at a level that recoups its costs, so if the Bill becomes larger in scope, for example through amendments in the Commons or the Lords—not that I wish to encourage amendments—and the duties on Ofcom expand, we would expect the fees to be increased commensurately to cover any increased cost that our legislation imposes.

Before the Minister gets past this point—I think he has reached the point of my question—the fees do not kick in for two years. The figure is £88 million, but the point I was making is that the scope of the Bill has already increased. I asked about this during the evidence session with Ofcom. Fraudulent advertising was not included before, so there are already additional powers for Ofcom that need to be funded. I was questioning whether the original estimate will be enough for those two years.

That covers the preparatory work rather than the actual enforcement work that will follow. For the time being, we believe that it is enough, but of course we always maintain an active dialogue with Ofcom.

Finally, there was a question from my right hon. Friend the Member for Basingstoke, who asked how victims will be supported and compensated. As she said, Ofcom will always pay attention to victims in its work, but we should make it clear that the fees we are debating in these clauses are designed to cover only Ofcom’s costs and not those of third parties. I think the costs of victim support and measures to support victims are funded separately via the Ministry of Justice, which leads in this area. I believe that a victims Bill is being prepared that will significantly enhance the protections and rights that victims have—something that I am sure all of us will support.

Question put and agreed to.

Clause 70 accordingly ordered to stand part of the Bill.

Clauses 71 to 76 ordered to stand part of the Bill.

Clause 77

General duties of OFCOM under section 3 of the Communications Act

Question proposed, That the clause stand part of the Bill.

We welcome clause 77, which is an important clause that seeks to amend Ofcom’s existing general duties in the Communications Act 2003. Given the prevalence of illegal harms online, as we discussed earlier in proceedings, it is essential that the Communications Act is amended to reflect the important role that Ofcom will have as a new regulator.

As the Minister knows, and as we will discuss shortly when we reach amendments to clause 80, we have significant concerns about the Government’s approach to size versus harm when categorising service providers. Clause 77(4) amends section 3 of the Communications Act by inserting new subsection (4A). New paragraph (4A)(d) outlines measures that are proportionate to

“the size or capacity of the provider”,

and to

“the level of risk of harm presented by the service in question, and the severity of the potential harm”.

We know that harm, and the potential to access harmful content, is what is most important in the Bill—it says so in the name—so I am keen for my thoughts on the entire categorisation process to be known early on, although I will continue to press this issue with the Minister when we debate the appropriate clause.

Labour also supports clause 78. It is vital that Ofcom will have a duty to publish its proposals on strategic priorities within a set time period, and ensuring that that statement is published is a positive step towards transparency, which has been so crucially missing for far too long.

Similarly, Labour supports clause 79, which contains a duty to carry out impact assessments. That is vital, and it must be conveyed in the all-important Communications Act.

As the shadow Minister has set out, these clauses ensure that Ofcom’s duties under the Communications Act 2003 are updated to reflect the new duties that we are asking it to undertake—I think that is fairly clear from the clauses. On the shadow Minister’s comment about size and risk, I note her views and look forward to debating that more fully in a moment.

Question put and agreed to.

Clause 77 accordingly ordered to stand part of the Bill.

Clauses 78 and 79 ordered to stand part of the Bill.

Clause 80

Meaning of threshold conditions etc

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to discuss the following:

Amendment 80, in schedule 10, page 192, line, at end insert—

“(c) the assessed risk of harm arising from that part of the service.”

This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

Amendment 81, in schedule 10, page 192, line 39, after “functionality” insert—

“and at least one specified condition about the assessed risk of harm”

This amendment is linked to Amendment 80.

Amendment 82, in schedule 10, page 192, line 41, at end insert—

‘(4A) At least one specified condition about the assessed risk of harm must provide for a service assessed as posing a very high risk of harm to its users to meet the Category 1 threshold.”

This amendment is linked to Amendment 80; it widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

That schedule 10 be the Tenth schedule to the Bill.

Clause 81 stand part.

Clause 82 stand part.

Thank you for your efforts in chairing our meeting today, Sir Roger. My thoughts are with the hon. Member for Batley and Spen and her entire family on the anniversary of Jo Cox’s murder; the SNP would like to echo that sentiment.

I want to talk about my amendment, and I start with a quote from the Minister on Second Reading:

“A number of Members…have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered.”—[Official Report, 19 April 2022; Vol. 712, c. 133.]

I appreciate that the Minister may still be thinking about that. He might accept all of our amendments; that is entirely possible, although I am not sure there is any precedent. The possibility is there that that might happen.

Given how strong I felt that the Minister was on the issue on Second Reading, I am deeply disappointed that there are no Government amendments to this section of the Bill. I am disappointed because of the massive risk of harm caused by some very small platforms—it is not a massive number—where extreme behaviour and radicalisation is allowed to thrive. It is not just about the harm to those individuals who spend time on those platforms and who are radicalised, presented with misinformation and encouraged to go down rabbit holes and become more and more extreme in their views. It is also about the risk of harm to other people as a result of the behaviour inspired in those individuals. We are talking about Jo Cox today; she is in our memories and thoughts. Those small platforms are the ones that are most likely to encourage individuals towards extremely violent acts.

If the Bill is to fulfil its stated aims and take the action we all want to see to prevent the creation of those most heinous, awful crimes, it needs to be much stronger on small, very high-risk platforms. I will make no apologies for that. I do not care if those platforms have small amounts of profits. They are encouraging and allowing the worst behaviours to thrive on their platforms. They should be held to a higher level of accountability. It is not too much to ask to class them as category 1 platforms. It is not too much to ask them to comply with a higher level of risk assessment requirements and a higher level of oversight from Ofcom. It is not too much to ask because of the massive risk of harm they pose and the massive actual harm that they create.

Those platforms should be punished for that. It is one thing to punish and criminalise the behaviour of users on those platforms—individual users who create and propagate illegal content or radicalise other users—but the Bill does not go far enough in holding those platforms to account for allowing that to take place. They know that it is happening. Those platforms are set up as an alternative place—a place where people are allowed to be far more radical than they are on Twitter, YouTube, Twitch or Discord. All of those larger platforms have some moderation, but the smaller platforms encourage such behaviour. Links are put on other sites pointing to those platforms. For example, when people read vaccine misinformation, there are links posted to more radical, smaller platforms. I exclude Discord because, given its number of users, I think it would be included in one of the larger-platform categories anyway. It is not that there is not radical behaviour on Discord—there is—but I think the size of its membership excludes it, in my head certainly, from the category of the very smallest platforms that pose the highest risk.

We all know from our inboxes the number of people who contact us saying that 5G is the Government trying to take over their brains, or that the entire world is run by Jewish lizard people. We get those emails on a regular basis and those theories are propagated on the smallest platforms. Fair enough—some people may not take any action as a result of the radicalisation they have experienced and their very extreme views. But some people will take action, and that action may be enough to harm their friends or family; it may be enough to exclude them and drag them away from the society or community of which they were previously members; or it might, in really tragic cases, be far more extreme. It might lead people to cause physical or mental harm to others intentionally as a result of the beliefs that have been created and fostered in them on those platforms.

That is why we have tabled the amendments. This is the one area in which the Government have most significantly failed in writing this Bill: they have not ensured that the small, very high-risk platforms are held to the highest level of accountability and are punished for allowing these behaviours to thrive on their platforms. I give the Minister fair warning that unless he chooses to accept the amendments, I intend to push them to a vote. I would appreciate it if he gave assurances, but I do not believe that any reassurance that he could give would compare to having such a measure in the Bill. As I say, for me the lack of this provision is the biggest failing of the entire Bill.

I echo the comments of the hon. Member for Aberdeen North. I completely agree with everything she has just said and I support the amendments that she has tabled.

The Minister knows my feelings on the Government’s approach to the categorisation of services; he has heard my concerns time and time again. However, it is not just me who believes that the Government have got their approach really wrong; stakeholders far and wide believe it too. In our evidence sessions, we heard from HOPE not hate and the Antisemitism Policy Trust specifically on this issue. In its current form, the categorisation process is based on size versus harm, which is a fundamentally flawed approach.

The Government’s response to the Joint Committee that scrutinised the draft Bill makes it clear that they consider that reach is a key and proportional consideration when assigning categories and that they believe that the Secretary of State’s powers to amend those categories are sufficient to protect people. Unfortunately, that leaves many alternative platforms out of category 1, even if they host large volumes of harmful material.

The duty of care approach that essentially governs the Bill is predicated on risk assessment. If size allows platforms to dodge the entry criteria for managing high risk, there is a massive hole in the regime. Some platforms have already been mentioned, including BitChute, Gab and 4chan, which host extreme racist, misogynistic, homophobic and other extreme content that radicalises people and incites harm. And the Minister knows that.

I take this opportunity to pay tribute to my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), who has campaigned heavily on the issue since the horrendous and tragic shooting in Keyham in his constituency. One of my big concerns about the lack of focus on violence against women and girls in the Bill, which we have mentioned time and time again, is the potential for the rise of incel culture online, which is very heavily reported on these alternative platforms—these high-harm, high-risk platforms.

I will just give one example. A teacher contacted me about the Bill. She talked about the rise of misogyny and trying to educate her class on what was happening. At the end of the class, a 15-year-old boy—I appreciate that he is under 18 and is a child, so would come under a different category within the Bill, but I will still give the example—came up to her and said: “Miss, I need to chat to you. This is something I’m really concerned about. All I did was google, ‘Why can’t I get a girlfriend?’” He had been led down a rabbit hole into a warren of alternative platforms that tried to radicalise him with the most extreme content of incel culture: women are evil; women are the ones who are wrong; it is women he should hate; it is his birthright to have a girlfriend, and he should have one; and he should hate women. That is the type of content that is on those platforms that young, impressionable minds are being pointed towards. They are being radicalised and it is sadly leading to incredibly tragic circumstances, so I really want to push the Minister on the subject.

We share the overarching view of many others that this crucial risk needs to be factored into the classification process that determines which companies are placed in category 1. Otherwise, the Bill risks failing to protect adults from substantial amounts of material that causes physical and psychological harm. Schedule 10 needs to be amended to reflect that.

Amendment 80 is fundamental to the creation of a safer internet and to saving lives. Organisations such as the Mental Health Foundation have repeatedly warned us of the dangers of smaller providers that host exceptionally dangerous suicide-related content, which would clearly not be considered category 1 under the Bill as it stands. Suicide forums frequently glamorise suicide and are commonly used to share information about novel methods of suicide. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives.

There are therefore important public health reasons to minimise the discussion of dangerous and effective suicide methods and avoid discussion of them in the public domain. Addressing the most dangerous suicide-related content is an area where the Bill could really save lives. It is therefore inexplicable that a Bill intended to increase online safety does not seek to do that.

I appreciate the shadow Minister’s bringing that issue up. Would she agree that, given we have constraints on broadcast and newspaper reporting on suicide for these very reasons, there can be no argument against including such a measure in the Bill?

I completely agree. Those safeguards are in place for that very reason. It seems a major omission that they are not also included in the Online Safety Bill if we are truly to save lives.

The Bill’s own pre-legislative scrutiny Committee recommended that the legislation should

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model.”

The Government replied that they

“want the Bill to be targeted and proportionate for businesses and Ofcom and do not wish to impose disproportionate burdens on small companies.”

It is, though, entirely appropriate to place a major regulatory burden on small companies that facilitate the glorification of suicide and the sharing of dangerous methods through their forums. It is behaviour that is extraordinarily damaging to public health and makes no meaningful economic or social contribution.

Amendment 82 is vital to our overarching aim of having an assessed risk of harm at the heart of the Bill. The categorisation system is not fit for purpose and will fail to capture so many of the extremely harmful services that many of us have already spoken about.

I want to remind Committee members of what my hon. Friend is talking about. I refer to the oral evidence we heard from Danny Stone, from the Antisemitism Policy Trust, on these small, high-harm platforms. He laid out examples drawn from the work of the Community Security Trust, which released a report called “Hate Fuel”. The report looked at

“various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads…with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement.”

A week or so before the evidence sitting,

“he targeted and killed 10 people in Buffalo. One of the things that he posted was:

‘Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/’—

which is a thread on the small 4chan platform—

‘then my motivation returns’.”

Danny Stone told us that the kind of material we are seeing, which is legal but harmful, is inspiring people to go out and create real-world harm. When my hon. Friend the Member for Pontypridd asked him how to amend this approach, he said:

“You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 128, Q203-204.]

I do hope that, as my hon. Friend urges, the Minister will look at all these options, because this is a very serious matter.

I completely agree with my hon. Friend. The evidence we heard from Danny Stone from the Antisemitism Policy Trust clearly outlined the real-world harm that legal but harmful content causes. Such content may be legal, but it causes mass casualties and harm in the real world.

There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.

Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.

We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?

We all know how fast-moving the online space is, with new technologies and ways of functioning popping up all the time. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone-deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into so that harmful and dangerous content does not slip through the net.

Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established

“as soon as reasonably practicable”,

could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?

Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the same concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can and to press Ofcom to do all it can to make these vital changes.

As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of their user size, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.

It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.

It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.

I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.

I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.

We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.

I thank the Minister for his comments. I still think that such platforms are too dangerous not to be subject to more stringent legislation than similar-sized platforms. For the Chair’s information, I would like to press amendment 80 to a vote. If it falls, I will move straight to pressing amendment 82 to a vote, missing out amendment 81. Does that make sense, Chair, and is it possible?

No, I am afraid it is not. We will deal with the amendments in order.

Question put and agreed to.

Clause 80 accordingly ordered to stand part of the Bill.

Schedule 10

Categories of regulated user-to-user services and regulated search services: regulations

Now we come to those amendments, which have not yet been moved. The problem is that amendment 82 is linked to amendment 80. I think I am right in saying that if amendment 80 falls, amendment 82 will fall. Does the hon. Lady want to move just amendment 80?

Thank you for your advice, Chair. I will move amendment 80. Should it be accepted, I would be keen to move the other two.

Amendment proposed: 80, in schedule 10, page 192, line 19, at end insert—

“(c) the assessed risk of harm arising from that part of the service.”—(Kirsty Blackman.)

This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

As I indicated, that means that amendments 81 and 82 now fall. Just for the hon. Lady’s information, ordinarily, where an amendment has been moved in Committee, it would not be selected to be moved on the Floor of the House on Report. However, the Minister has indicated that he is minded to look at this again. If, of course, the Government choose to move an amendment on Report, that then would be put to the House.

On a point of order, Sir Roger. My understanding was that it was previously the case that amendments could not be moved again on Report, but that modern practice in the past few years in the House has been that amendments that have been pushed to a vote in Committee are then allowed to be resubmitted on Report, whether or not the Minister has indicated that this is the case.

The hon. Lady is correct. I am advised that, actually, the ruling has changed, so it can be. We will see—well, I won’t, but the hon. Lady will see what the Minister does on Report.

Schedule 10 agreed to.  

Clauses 81 and 82 ordered to stand part of the Bill.  

Clause 83

OFCOM’s register of risks, and risk profiles, of Part 3 services

I beg to move amendment 34, in clause 83, page 72, line 12, at end insert—

“(d) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.

Labour welcomes clause 83, which places a duty on Ofcom to carry out risk assessments to identify and assess a range of potential risks of harm presented by part 3 services. However, we are concerned about subsection (9), which says:

“OFCOM must from time to time review and revise the risk assessments and risk profiles so as to keep them up to date”.

That seems a fairly woolly concept even for the Minister to try to defend, so I would be grateful if he clarified exactly what demands will be placed on Ofcom to review those risk assessments and risk profiles. He will know that those are absolutely central to the Bill, so some clarification is required here. Despite that, Labour agrees that it will be a significant advantage for Ofcom to oversee the risk of harm presented by the regulated services.

However, harm should not be limited to those in the UK. Amendment 34 would therefore require Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content. I have already spoken on this issue, in the debate on amendment 25 to clause 8, so I will keep my comments brief. As the Minister knows, online harms are global in nature, and amendment 34 seeks to ensure that the risk of harm presented by regulated services is not just limited to those in the UK. As we have mentioned previously, research shows us that there is some very damaging, often sexually violent, content being streamed abroad. Labour fears that the current provisions in the legislation will not be far-reaching enough to capture the true extent of the risk of harm that people may face online.

Labour supports the intentions of clause 84, which outlines that Ofcom must produce guidance to assist providers in complying with their duties to carry out illegal content risk assessments

“as soon as reasonably practicable”.

Of course, the Minister will not be surprised that Labour has slight reservations about the timing around those important duties, so I would appreciate an update from the Minister on the conversations he has had with Ofcom about the practicalities of its duties.

I did not indicate at the start of the debate that I would take the clause stand part and clause 84 stand part together, but I am perfectly relaxed about it and very happy to do so, as the hon. Lady has spoken to them. If any other colleague wishes to speak to them, that is fine by me.

Perhaps I might start with amendment 34, which the shadow Minister just spoke to. We agree that it is very important to consider the risks posed to victims who are outside of the territory of the United Kingdom. However, for the reasons I will elaborate on, we believe that the Bill as drafted achieves that objective already.

First, just to remind the Committee, the Bill already requires companies to put in place proportionate systems and processes to prevent UK users from encountering illegal content. Critically, that includes where a UK user creates illegal content via an in-scope platform, but where the victim is overseas. Let me go further and remind the Committee that clause 9 requires platforms to prevent UK users from encountering illegal content no matter where that content is produced or published. The word “encounter” is very broadly defined in clause 189 as meaning

“read, view, hear or otherwise experience content”.

As such, it will cover a user’s contact with any content that they themselves generate or upload to a service.

Critically, there is another clause, which we have discussed previously, that is very important in the context of overseas victims, which the shadow Minister quite rightly raises. The Committee will recall that subsection (9) of clause 52, which is the important clause that defines illegal content, makes it clear that that content does not have to be generated, uploaded or accessed in the UK, or indeed to have anything to do with the UK, in order to count as illegal content towards which the company has duties, including risk assessment duties. Even if the illegal act—for example, sexually abusing a child—happens in some other country, not the UK, it still counts as illegal content under the definitions in the Bill because of clause 52(9). It is very important that those duties will apply to that circumstance. To be completely clear, if an offender in the UK uses an in-scope platform to produce content where the victim is overseas, or to share abuse produced overseas with other UK users, the platform must tackle that, both through its risk assessment duties and its other duties.

As such, the entirely proper intent behind amendment 34 is already covered by the Bill as drafted. The shadow Minister, the hon. Member for Pontypridd, has already referred to the underlying purpose of clauses 83 and 84. As we discussed before, the risk assessments are central to the duties in the Bill. It is essential that Ofcom has a proper picture of the risks that will inform its various regulatory activities, which is why these clauses are so important. Clause 84 requires Ofcom to produce guidance to services to make sure they are carrying out those risk assessments properly, because it is no good having a token risk assessment or one that does not properly deal with the risks. The guidance published under clause 84 will ensure that happens. As such, I will respectfully resist amendment 34, on the grounds that its contents are already covered by the Bill.

I am grateful for the Minister’s clarification. Given his assurances that its contents are already covered by the Bill, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 83 ordered to stand part of the Bill.

Clause 84 ordered to stand part of the Bill.

Clause 85

Power to require information

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to discuss the following:

Clauses 86 to 91 stand part.

Schedule 11 stand part.

Labour supports clause 85, which gives Ofcom the power to require the provision of any information it requires in order to discharge its online safety functions. We strongly believe that, in the interests of transparency, Ofcom as the regulator must have sufficient power to require a service provider to share its risk assessment in order to understand how that service provider is identifying risks. As the Minister knows, we feel that that transparency should go further, and that the risk assessments should be made public. However, we have already had that argument during a previous debate, so I will not repeat those arguments—on this occasion, at least.

Labour also supports clause 86, and we particularly welcome the clarification that Ofcom may require the provision of information in any form. If we are to truly give Ofcom the power to regulate and, where necessary, investigate service providers, we must ensure that it has sufficient legislative tools to rely on.

The Bill gives some strong powers to Ofcom. We support the requirement in clause 87 to name a senior manager, but again, we feel those provisions should go further. Both users and Ofcom must have access to the full range of tools they need to hold the tech giants to account. As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator, and even then, those measures might not come in until two years after the Bill is in place. Surely the top bosses at social media companies should be held criminally liable for systemic and repeated failures to ensure online safety as soon as the Bill comes into force, so can the Minister explain the reasons for the delay?

The Minister will be happy to hear that Labour supports clause 88. It is important to have an outline on the face of the Bill of the circumstances in which Ofcom can require a report from a skilled person. It is also important that Ofcom has the power to appoint, or give notice to a provider requiring them to appoint, a skilled person, as Labour fears that without those provisions in subsections (3) and (4), the ambiguity around defining a so-called skilled person could be detrimental. We therefore support the clause, and have not sought to amend it at this stage.

Again, Labour supports all the intentions of clause 89 in the interests of online safety more widely. Of course, Ofcom must have the power to force a company to co-operate with an investigation.

Again, we support the need for clause 90, which gives Ofcom the power to require an individual to attend an interview. That is particularly important in the instances outlined in subsection (1), whereby Ofcom is carrying out an investigation into the failure or possible failure of a provider of a regulated service to comply with a relevant requirement. Labour has repeatedly called for such personal responsibility, so we are pleased that the Government are ensuring that the Bill includes sufficient powers for Ofcom to allow proper scrutiny.

Labour supports clause 91 and schedule 11, which outlines in detail Ofcom’s powers of entry, inspection and audit. I did not think we would support this much, but clearly we do. We want to work with the Government to get this right, and we see ensuring Ofcom has those important authorisation powers as central to it establishing itself as a viable regulator of the online space, both now and for generations to come. We will support and have not sought to amend the clauses or schedule 11 for the reasons set out.

I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.

I am delighted by the strong support that these clauses have received from across the aisle. I hope that proves to be a habit-forming development.

On the shadow Minister’s point about publishing the risk assessments, to repeat the point I made a few days ago, under clause 64, which we have already debated, Ofcom has the power—indeed, the obligation—to compel publication of transparency reports that will make sure that the relevant information sees the light of day. I accept that publication is important, but we believe that objective is achieved via the transparency measures in clause 64.

On the point about senior management liability, which again we debated near the beginning of the Bill, we believe—I think we all agree—that this is particularly important for information disclosure. We had the example, as I mentioned at the time, of one of the very large companies refusing to disclose information to the Competition and Markets Authority in relation to a competition matter and simply paying a £50 million fine rather than complying with the duties. That is why criminal liability is so important here in relation to information disclosure.

To reassure the shadow Minister, on the point about when that kicks in, it was in the old version of the Bill, but potentially did not commence for two years. In this new version, updated following our extensive and very responsive listening exercise—I am going to get that in every time—the commencement of this particular liability is automatic and takes place very shortly after Royal Assent. The delay and review have been removed, for the reason the hon. Lady mentioned, so I am pleased to confirm that to the Committee.

The shadow Minister described many of the provisions. Clause 85 gives Ofcom powers to require information, clause 86 gives the power to issue notices and clause 87 the important power to require an entity to name that relevant senior manager, so they cannot wriggle out of their duty by not providing the name. Clause 88 gives the power to require companies to undergo a report from a so-called skilled person. Clause 89 requires full co-operation with Ofcom when it opens an investigation, where co-operation has been sadly lacking in many cases to date. Clause 90 requires people to attend an interview, and the introduction to schedule 11 allows Ofcom to enter premises to inspect or audit the provider. These are very powerful clauses and will mean that social media companies can no longer hide in the shadows from the scrutiny they so richly deserve.

Question put and agreed to.

Clause 85 accordingly ordered to stand part of the Bill.

Clauses 86 to 91 ordered to stand part of the Bill.

Schedule 11

OFCOM’s powers of entry, inspection and audit

Amendment made: 4, in schedule 11, page 202, line 17, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”.—(Chris Philp.)

Schedule 11, as amended, agreed to.

Clause 92

Offences in connection with information notices

Question proposed, That the clause stand part of the Bill.

The Minister will be pleased to hear that we, again, support these clauses. We absolutely support the Bill’s aims to ensure that information offences and penalties are strong enough to dissuade non-compliance. However, as we have said repeatedly, we feel that the current provisions are lacking.

As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator. I am grateful that the Minister has confirmed that the measures will come into force with immediate effect following Royal Assent, rather than waiting two years. That is welcome news. The Government should require that top bosses at social media companies be criminally liable for systemic and repeated failures on online safety, and I am grateful for the Minister’s confirmation on that point.

As these harms are allowed to persist, tech companies cannot continue to get away without penalty. Will the Minister confirm why the Bill does not include further penalties, in the form of criminal offences, should a case of systemic and repeated failures arise? Labour has concerns that, without stronger powers, Ofcom may not feel compelled or equipped to sanction those companies that are treading the fine line of doing just enough to satisfy the requirements outlined in the Bill as it stands.

Labour also welcomes clause 93, which sets out the criminal offences that can be committed by named senior managers in relation to their entity’s information obligations. It establishes that senior managers who are named in a response to an information notice can be held criminally liable for failing to prevent the relevant service provider from committing an information offence. Senior managers can only be prosecuted under the clause where the regulated provider has already been found liable for failing to comply with Ofcom’s information request. As I have already stated, we feel that this power needs to go further if we are truly to tackle online harm. For far too long, those at the very top have known about the harm that exists on their platforms, but they have failed to take action.

Labour supports clause 94 and we have not sought to amend at this stage. It is vital that provisions are laid in the Bill, such as those in subsection (3), which specify actions that a person may take to commit an offence of this nature. We all want to see the Bill keep people safe online, and at the heart of doing so is demanding a more transparent approach from those in Silicon Valley. My hon. Friend the Member for Worsley and Eccles South made an excellent case for the importance of transparency earlier in the debate but, as the Minister knows, and as I have said time and again, the offences must go further than just applying to simple failures to provide information. We must consider a systemic approach to harm more widely, and that goes far beyond simple information offences.

There is no need to repeat myself. Labour supports the need for clause 95 as it stands and we support clause 96, which is in line with penalties for other information offences that already exist.

I am delighted to discover that agreement with the Government’s clauses continues to provoke a tsunami of unanimity across the Committee. I sense a gathering momentum behind these clauses.

As the shadow Minister mentioned, the criminal offences here are limited to information provision and disclosure. We have debated the point before. The Government’s feeling is that going beyond the information provision into other duties for criminal liability would potentially go a little far and have a chilling effect on the companies concerned.

Also, the fines that can be levied—10% of global revenue—run into billions of pounds, and there are the denial of service provisions, where a company can essentially be disconnected from the internet in extreme cases; these do provide more than adequate enforcement powers for the other duties in the Bill. The information duties are so fundamental—that is why personal criminal liability is needed. Without the information, we cannot really make any further assessment of whether the duties are being met.

The shadow Minister has set out what the other clauses do: clause 92 creates offences; clause 93 introduces senior managers’ liability; clause 94 sets out the offences that can be committed in relation to audit notices issued by Ofcom; clause 95 creates offences for intentionally obstructing or delaying a person exercising Ofcom’s power; and clause 96 sets out the penalties for the information offences set out in the Bill, which of course include a term of imprisonment of up to two years. Those are significant criminal offences, which I hope will make sure that executives working for social media firms properly discharge those important duties.

Question put and agreed to.

Clause 92 accordingly ordered to stand part of the Bill.

Clauses 93 to 95 ordered to stand part of the Bill.

Clause 96

Penalties for information offences

Amendment made: 2, in clause 96, page 83, line 15, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”.—(Chris Philp.)

Clause 96, as amended, ordered to stand part of the Bill.

Clause 97

Co-operation and disclosure of information: overseas regulators

Question proposed, That the clause stand part of the Bill.

Again, Labour supports the intentions of clause 97—the collegiality continues. We know that the Bill’s aims are to protect people across the UK, but we know that online harms often originate elsewhere. That is why it is vital that Ofcom has powers to co-operate with an overseas regulator, as outlined in subsection (1).

However, we do have concerns about subsection (2), which states:

“The power conferred by subsection (1) applies only in relation to an overseas regulator for the time being specified in regulations made by the Secretary of State.”

Can the Minister confirm exactly how that will work in practice? He knows that Labour Members have tabled important amendments to clause 123. Amendments 50 and 51, which we will consider later, aim to ensure that Ofcom has the power to co-operate and take action through the courts where necessary. The same issue applies here: Ofcom must be compelled and have the tools available at its disposal to work internationally where required.

Labour supports clause 98, which amends section 393 of the Communications Act 2003 to include new provisions. That is obviously a vital step, and we particularly welcome subsection (2), which outlines that, subject to the specific exceptions in section 393 of the 2003 Act, Ofcom cannot disclose information with respect to a business that it has obtained by exercising its powers under this Bill without the consent of the business in question. This is once again an important step in encouraging transparency across the board.

We support clause 99, which places a duty on Ofcom to consult the relevant intelligence service before Ofcom discloses or publishes any information that it has received from that intelligence service. For reasons of national security, it is vital that the relevant intelligence service is included in Ofcom’s reasoning and approach to the Bill more widely.

We broadly support the intentions of clause 100. It is vital that Ofcom is encouraged to provide information to the Secretary of State of the day, but I would be grateful if the Minister could confirm exactly how the power will function in reality. Provision of information to assist in the formulation of policy is, as we know, a very broad power under the Communications Act. We want to make sure the powers are not abused—I know that is a concern shared on his own Back Benches—so I would be grateful for the Minister’s honest assessment of the situation.

We welcome clause 101, which amends section 26 of the Communications Act and provides for publication of information and advice for various persons, such as consumers. Labour supports the clause as it stands. We also welcome clause 102, which, importantly, sets out the circumstances in which a statement given to Ofcom can be used in evidence against that person. Again, this is an important clause in ensuring that Ofcom has the powers it needs to truly act as a world-leading regulator, which we all want it to be. Labour supports it and has chosen not to table any amendments.

I am delighted that support for the Government’s position on the clauses continues and that cross-party unanimity is taking an ever stronger hold. I am sure the Whips Office will find that particularly reassuring.

The shadow Minister asked a question about clause 100. Clause 100 amends section 24B of the Communications Act 2003, which allows Ofcom to provide information to the Secretary of State to assist with the formulation of policy. She asked me to clarify what that means, which I am happy to do. In most circumstances, Ofcom will be required to obtain the consent of providers in order to share information relating to their business. This clause sets out two exceptions to that principle. If the information required by the Secretary of State was obtained by Ofcom to determine the proposed fees threshold, or in response to potential threats to national security or to the health or safety of the public, the consent of the business is not required. In those instances, it would obviously not be appropriate to require the provider’s consent.

It is important that users of regulated services are kept informed of developments around online safety and the operation of the regulatory framework.

This specifically relates to the Secretary of State, but would the Minister expect both Ofcom and his Department to be working with the Scottish Government and the Northern Ireland Executive? I am not necessarily talking about sharing all the information, but where there are concerns that it is very important for those jurisdictions to be aware of, will he try to ensure that he has a productive relationship with both devolved Administrations?

I thank the hon. Member for her question. Where the matter being raised or disclosed touches on matters of devolved competence—devolved authority—then yes, I would expect that consultation to take place. Matters concerning the health and safety of the public are entirely devolved, I think, so I can confirm that in those circumstances it would be appropriate for the Secretary of State to share information with devolved Administration colleagues.

The shadow Minister has eloquently, as always, touched on the purpose of the various other clauses in this group. I do not wish to try the patience of the Committee, particularly as lunchtime approaches, by repeating what she has ably said already, so I will rest here and simply urge that these clauses stand part of the Bill.

Question put and agreed to.

Clause 97 accordingly ordered to stand part of the Bill.

Clauses 98 to 102 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Adjourned till this day at Two o’clock.

Online Safety Bill (Twelfth sitting)

The Committee consisted of the following Members:

Chairs: † Sir Roger Gale, Christina Rees

Ansell, Caroline (Eastbourne) (Con)

† Bailey, Shaun (West Bromwich West) (Con)

† Blackman, Kirsty (Aberdeen North) (SNP)

Carden, Dan (Liverpool, Walton) (Lab)

† Davies-Jones, Alex (Pontypridd) (Lab)

Double, Steve (St Austell and Newquay) (Con)

† Fletcher, Nick (Don Valley) (Con)

Holden, Mr Richard (North West Durham) (Con)

† Keeley, Barbara (Worsley and Eccles South) (Lab)

Leadbeater, Kim (Batley and Spen) (Lab)

† Miller, Dame Maria (Basingstoke) (Con)

Mishra, Navendu (Stockport) (Lab)

Moore, Damien (Southport) (Con)

Nicolson, John (Ochil and South Perthshire) (SNP)

† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

† Russell, Dean (Watford) (Con)

† Stevenson, Jane (Wolverhampton North East) (Con)

Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks

† attended the Committee

Public Bill Committee

Thursday 16 June 2022


[Sir Roger Gale in the Chair]

Online Safety Bill

Clause 103

Notices to deal with terrorism content or CSEA content (or both)

There are amendments to clause 103 that are not owned by any member of the Committee. Nobody has indicated that they wish to take them up, and therefore they fall.

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to consider clauses 105 and 106 stand part.

Under this chapter, Ofcom will have the power to direct companies to use accredited technology to identify child sexual exploitation and abuse content, whether communicated publicly or privately by means of a service, and to remove that content quickly. Colleagues will be aware that the Internet Watch Foundation is one group that assists companies in doing that by providing them with “hashes” of previously identified child sexual abuse material in order to prevent the upload of such material to their platforms. That helps stop the images of victims being recirculated again and again. Tech companies can then notify law enforcement of the details of who has uploaded the content, and an investigation can be conducted and offenders sharing the content held to account.

Those technologies are extremely accurate and, thanks to the quality of our datasets, ensure that companies are detecting only imagery that is illegal. There are a number of types of technology that Ofcom could consider accrediting, including image hashing. A hash is a unique string of letters and numbers that can be applied to an image and matched every time a user attempts to upload a known illegal image to a platform.

PhotoDNA is another type, created in 2009 in a collaboration between Microsoft and Professor Hany Farid of the University of California, Berkeley. PhotoDNA is a vital tool in the detection of CSEA online. It enables law enforcement, charities, non-governmental organisations and the internet industry to find copies of an image even when it has been digitally altered. It is one of the most important technical developments in online child protection. It is extremely accurate, with a failure rate of one in 50 billion to 100 billion. That gives companies a high degree of certainty that what they are removing is illegal, and a firm basis for law enforcement to pursue offenders.

Lastly, there is webpage blocking. Most of the imagery that the Internet Watch Foundation removes from the internet is hosted outside the UK. While it is waiting for removal, the IWF can disable public access to an image or webpage by adding it to its webpage blocking list. That can be utilised by search providers to de-index known webpages containing CSAM. I therefore ask the Minister, as we continue to explore this chapter, to confirm exactly how such technologies can be utilised once the Bill receives Royal Assent.

Labour welcomes clause 105, which confirms, in subsection (2), that where a service provider is already using technology on a voluntary basis but it is ineffective, Ofcom can still intervene and require a service provider to use a more effective technology, or the same technology in a more effective way. It is vital that Ofcom is given the power and opportunity to intervene in the strongest possible sense to ensure that safety online is kept at the forefront.

However, we do require some clarification, particularly on subsections (9) and (10), which explain that Ofcom will only be able to require the use of tools that meet the minimum standards for accuracy for detecting terrorism and/or CSEA content, as set out by the Secretary of State. Although minimum standards are of course a good thing, can the Minister clarify the exact role that the Secretary of State will have in imposing these minimum standards? How will this work in practice?

Once again, Labour does not oppose clause 106 and we have not sought to amend it at this stage. It is vital that Ofcom has the power to revoke a notice under clause 103(1) if there are reasonable grounds to believe that the provider is not complying with it. Only with these powers can we be assured that service providers will be implored to take their responsibilities and statutory duties, as outlined in the Bill, seriously.

I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.

The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.

Queries have been raised on this matter by external organisations—I am particularly thinking about the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning and ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into action.

During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.

The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.

If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.

I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?

I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.

Let me start by addressing some questions raised by hon. Members, beginning with the last point made by the hon. Member for Aberdeen North. She sought reconfirmation that the Bill will keep up with future developments in accredited technology that are not currently contemplated. The answer to her question can be found in clause 105(9), in which the definition of accredited technology is clearly set out, as technology that is

“accredited (by OFCOM or another person appointed by OFCOM) as meeting minimum standards of accuracy”.

That is not a one-off determination; it is a determination, or an accreditation, that can happen from time to time, periodically or at any point in the future. As and when new technologies emerge that meet the minimum standards of accuracy, they can be accredited, and the power in clause 103 can be used to compel platforms to use those technologies. I hope that provides the reassurance that the hon. Member was quite rightly asking for.

The shadow Minister, the hon. Member for Pontypridd, asked a related question about the process for publishing those minimum standards. The process is set out in clause 105(10), which says that Ofcom will give advice to the Secretary of State on the appropriate minimum standards, and the minimum standards will then be

“approved…by the Secretary of State, following advice from OFCOM.”

We are currently working with Ofcom to finalise the process for setting those standards, which of course will need to take a wide range of factors into account.

Let me turn to the substantive clauses. Clause 103 is extremely important, because as we heard in the evidence sessions and as Members of the Committee have said, scanning messages using technology such as hash matching, to which the shadow Minister referred, is an extremely powerful way of detecting CSEA content and providing information for law enforcement agencies to arrest suspected paedophiles. I think it was in the European Union that Meta—particularly Facebook and Facebook Messenger—stopped this kind of scanning for a short period of time due to misplaced concerns about privacy laws, and the number of referrals of CSEA images and the number of potential paedophiles who were referred to law enforcement dropped dramatically.

A point that the hon. Member for Aberdeen North and I have discussed previously is that it would be completely unacceptable if a situation arose whereby these messages—I am thinking particularly about Facebook Messenger—did not get scanned for CSEA content in a way that they do get scanned today. When it comes to preventing child sexual exploitation and abuse, in my view there is no scope for compromise or ambiguity. That scanning is happening at the moment; it is protecting children on a very large scale and detecting paedophiles on quite a large scale. In my view, under no circumstances should that scanning be allowed to stop. That is the motivation behind clause 103, which provides Ofcom with the power to make directions to require the use of accredited technology.

As the hon. Member for Aberdeen North signalled in her remarks, given the importance of this issue the Government are of course open to thinking about ways in which the Bill can be strengthened if necessary, because we do not want to leave any loopholes. I urge any social media firms watching our proceedings never to take any steps that degrade or reduce the ability to scan for CSEA content. I thank the hon. Member for sending through the note from the NSPCC, which I have received and will look at internally.

The proactive scanning that we have talked about is critical. To give one or two examples, this is not just about CSEA, but terrorism as well. Every terrorist attack in 2017 had an online element, and many counter-terrorism prosecutions have involved online activity, because terrorists and their supporters continue to use a wide range of online platforms to further their aims. Similarly, in the context of child sexual abuse material, the Internet Watch Foundation confirmed that in 2020 it detected 153,383 reports of webpages containing child sexual abuse imagery, or UK-hosted non-photographic child sexual abuse imagery. The importance of the scanning technology is clear, as is the importance of ensuring the clause is as strong as possible.

As the shadow Minister has said, clause 105 provides supporting provisions for clause 103, setting out—for example—the particulars of what must appear in the notice, and clause 106 sets out the process for reviewing a notice to deal with terrorism or CSEA content. I hope I have addressed hon. Members’ questions, and I commend this important clause to the Committee.

Question put and agreed to.

Clause 103 accordingly ordered to stand part of the Bill.

Clause 104

Matters relevant to a decision to give a notice under section 103(1)

I beg to move amendment 35, in clause 104, page 88, line 39, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

With this it will be convenient to discuss the following:

Amendment 36, in clause 104, page 88, line 43, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 37, in clause 104, page 89, line 13, at end insert—

“(k) risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.

Amendment 39, in clause 116, page 98, line 37, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 40, in clause 116, page 98, line 39, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 38, in clause 116, page 99, line 12, at end insert—

“(j) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom to consider risks to adults and children through the production, publication and dissemination of illegal content before imposing a proactive technology requirement.

Government amendment 6.

Clause stand part.

We welcome clause 104, but have tabled some important amendments that the Minister should closely consider. More broadly, the move away from requiring child sexual exploitation and abuse content to be prevalent and persistent before enforcement action can be taken is a positive one. It is welcome that Ofcom will have the opportunity to consider a range of factors.

Despite this, Labour—alongside the International Justice Mission—is still concerned about the inclusion of prevalence as a factor, owing to the difficulty in detecting newly produced CSEA content, especially livestreamed abuse. Amendments 35, 36, 39 and 40 seek to address that gap. Broadly, the amendments aim to capture the concern about the Bill’s current approach, which we feel limits its focus to the risk of harm faced by individuals in the UK. Rather, as we have discussed previously, the Bill should recognise the harm that UK nationals cause to people around the world, including children in the Philippines. The amendments specifically require Ofcom to consider the presence of relevant content, rather than its prevalence.

Amendment 37 would require Ofcom’s risk assessments to consider risks to adults and children through the production, publication and dissemination of illegal content—an issue that Labour has repeatedly raised. I believe we last mentioned it when we spoke to amendments to clause 8, so I will do my best to not repeat myself. That being said, we firmly believe it is important that video content, including livestreaming, is captured by the Bill. I remain unconvinced that the Bill as it stands goes far enough, so I urge the Minister to closely consider and support these amendments. The arguments that we and so many stakeholders have already made still stand.

I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.

I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing the duty on companies to remove every single piece of illegal content that has ever appeared online, because that is requesting the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in that way that makes them realistically achievable.

As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.

We mutually understand the intention behind these amendments, but we think the significant powers to compel companies to adopt certain technology contained in clause 103 should be engaged only where there is a reasonable level of risk. For example, if a single piece of content were present on a platform, it may not be reasonable or proportionate to force the company to adopt certain new technologies that it does not currently use. The use of “prevalence” ensures that the powers are used where necessary.

It is clear—there is no debate—that in the circumstances where scanning technology is currently used, which includes on Facebook Messenger, there is enormous prevalence of material. To elaborate on a point I made in a previous discussion, anything that stops that detection happening would be unacceptable and, in the Government’s view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning for child sexual exploitation images.

However, we think adopting the amendment and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or where there was no significant risk, because a single example would be enough to trigger the amendment, as drafted. Although I understand the spirit of the amendment, it moves away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.

Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding to use section 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider

“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.

Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.

Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already proposes to require Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is clearly already in the Bill.

I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.

I welcome the Minister’s comments, but if we truly want the Bill to be world-leading, as the Government and the Minister insist it will be, and if it is truly to keep children safe, surely one image of child sexual exploitation and abuse on a platform is one too many. We do not need to consider prevalence over presence. I do not buy that argument. I believe we need to do all we can to make this Bill as strong as possible. I believe the amendments would do that.

Question put, That the amendment be made.

Amendment proposed: 37, in clause 104, page 89, line 13, at end insert—

“(k) risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”—(Alex Davies-Jones.)

This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.

Question put, That the amendment be made.

I beg to move amendment 6, in clause 104, page 89, line 14, after “(2)(f)” insert “, (g)”

This amendment ensures that subsection (3) of this clause (which clarifies what “relevant content” in particular paragraphs of subsection (2) refers to in relation to different kinds of services) applies to the reference to “relevant content” in subsection (2)(g) of this clause.

This technical amendment will ensure that the same definition of “relevant content” used in subsection (2) is used in subsection (3).

Amendment 6 agreed to.

Clause 104, as amended, ordered to stand part of the Bill.

Clauses 105 and 106 ordered to stand part of the Bill.

Clause 107

OFCOM’s guidance about functions under this Chapter

Question proposed, That the clause stand part of the Bill.

Labour welcomes clause 107, which requires Ofcom to issue guidance setting out the circumstances in which it could require a service provider in scope of the power to use technology to identify CSEA and/or terrorism content. It is undeniably important that Ofcom will have the discretion to decide on the exact content of the guidance, which it must keep under review and publish.

We also welcome the fact that Ofcom must have regard to its guidance when exercising these powers. Of course, it is also important that the Information Commissioner is included and consulted in the process. Ofcom has a duty to continually review its guidance, which is fundamental to the Bill’s success.

We also welcome clause 108. Indeed, the reporting of Ofcom is an area that my hon. Friend the Member for Batley and Spen will touch on when we come to new clause 25. It is right that Ofcom will have a statutory duty to lay an annual report in this place, but we feel it should ultimately go further. That is a conversation for another day, however, so we broadly welcome clause 108 and have not sought to amend it directly at this stage.

Clause 109 ensures that the definitions of “terrorism content” and “child sexual exploitation and abuse content” used in chapter 5 are the same as those used in part 3. Labour supports the clause and we have not sought to amend it.

I beg your pardon; I am trying to do too many things at once. I call Kirsty Blackman.

Thank you very much, Sir Roger. I do not envy you in this role, which cannot be easy, particularly with a Bill that is 190-odd clauses long.

I have a quick question for the Minister about the timelines in relation to the guidance and the commitment that Ofcom gave to producing a road map before this coming summer. When is that guidance likely to be produced? Does that road map relate to the guidance in this clause, as well as the guidance in other clauses? If the Minister does not know the answer, I have no problem with receiving an answer at a later time. Does the road map include this guidance as well as other guidance that Ofcom may or may not be publishing at some point in the future?

I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 points out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers the interpretations.

The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which in evidence it committed to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. In order to avoid me giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.

It will be fair to say that the Committee’s feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.

Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.

Question put and agreed to.

Clause 107 accordingly ordered to stand part of the Bill.

Clauses 108 and 109 ordered to stand part of the Bill.

Clause 110

Provisional notice of contravention

Question proposed, That the clause stand part of the Bill.

I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.

The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline has been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved and that is why we welcome the provisions outlined in the clause.

I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.

Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, where an organisation that contravenes can then be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes and will be fined, and fined heavily.

I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.

That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.

At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?

I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.

There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. It also finds revenue and funding a little hand to mouth.

Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.

This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.

First, on the substance of the clause, as the shadow Minister said, the process of providing a provisional notice of contravention gives the subject company a fair chance to respond and put its case, before the full enforcement powers are brought down on its head, and that is of course only reasonable, given how strong and severe these powers are. I am glad there is once again agreement between the two parties.

I would like to turn now to the points raised by my right hon. Friend the Member for Basingstoke, who, as ever, has made a very thoughtful contribution to our proceedings. Let me start by answering her question as to what the Bill says about where fines that are levied will go. We can discover the answer to that question in paragraph 8 of schedule 12, which appears at the bottom of page 206 and the top of page 207—in the unlikely event that Members had not memorised that. If they look at that provision, they will see that the Bill as drafted provides that fines that are levied under the powers provided in it and that are paid to Ofcom get paid over to the Consolidated Fund, which is essentially general Treasury resources. That is where the money goes under the Bill as drafted.

My right hon. Friend asks whether some of the funds could be, essentially, hypothecated and diverted directly to pay victims. At the moment, the Government are dealing with victims, or pay for services supporting victims, not just via legislation—the victims Bill—but via expenditure that, I think, is managed by the Ministry of Justice to support victims and organisations working with victims in a number of ways. I believe that the amount earmarked for this financial year is in excess of £300 million, which is funded just via the general spending review. That is the situation as it is today.

I am happy to ask colleagues in Government the question that my right hon. Friend raises. It is really a matter for the Treasury, so I am happy to pass her idea on to it. But I anticipate a couple of responses coming from the Treasury in return. I would anticipate it first saying that allocating money to a particular purpose, including victims, is something that it likes to do via spending reviews, where it can balance all the demands on Government revenue, viewed in the round.

Secondly, it might say that the fine income is very uncertain; we do not know what it will be. One year it could be nothing; the next year it could be billions and billions of pounds. It depends on the behaviour of these social media firms. In fact, if the Bill does its job and they comply with the duties as we want and expect them to, the fines could be zero, because the firms do what they are supposed to. Conversely, if they misbehave, as they have been doing until now, the fines could be enormous. If we rely on hypothecation of these fines as a source for funding victim services, it might be that, in a particular year, we discover that there is no income, because no fines have been levied.

I was anticipating the Treasury’s response as I made those points to the Committee, but since my right hon. Friend spoke with such eloquence, and given her great experience in Government, I shall put her idea to Treasury colleagues. I will happily revert to her when its response is forthcoming, although I have tried to anticipate a couple of points that the Treasury might make.

Question put and agreed to.

Clause 110 accordingly ordered to stand part of the Bill.

Clause 111

Requirements enforceable by OFCOM against providers of regulated services

I beg to move amendment 53, in clause 111, page 94, line 24, at end insert—

“Section 136(7C)

Code of practice on access to data”

This amendment is linked to Amendment 52.

With this it will be convenient to discuss amendment 52, in clause 136, page 118, line 6, at end insert—

“(7A) Following the publication of the report, OFCOM must produce a code of practice on access to data setting out measures with which regulated services are required to comply.

“(7B) The code of practice must set out steps regulated services are required to take to facilitate access to data by persons carrying out independent research.

(7C) Regulated services must comply with any measures in the code of practice.”

This amendment would require Ofcom to produce a code of practice on access to data.

Labour welcomes this important clause, which lists the enforceable requirements. Failure to comply with those requirements can trigger enforcement action. However, the provisions could go further, so we urge the Minister to consider our important amendments.

Amendments 52 and 53 make it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment. We cannot rely solely on Ofcom to act as problems arise, when new issues could be spotted early by experts elsewhere. The entire regime depends on how bright a light we can shine into the black box of the tech companies, but only minimal data can be accessed.

The amendments would require Ofcom simply to produce a code of practice on access to data. We have already heard that without independent researchers accessing data on relevant harm, the platforms have no real accountability for how they tackle online harms. Civil society and researchers work hard to identify online harms from limited data sources, which can be taken away by the platforms if they choose. Labour feels that the Bill must require platforms, in a timely manner, to share data with pre-vetted independent researchers and academics. The EU’s Digital Services Act does that, so will the Minister confirm why such a provision is missing from this supposed world-leading Bill?

Clause 136 gives Ofcom two years to assess whether access to data is required, and it “may”, but not “must”, publish guidance on how its approach to data access might work. The process is far too slow and, ultimately, puts the UK behind the EU, whose legislation makes data access requests possible immediately. Amendment 52 would change the “may” to “must”, and would ultimately require Ofcom to explore how access to data works, not if it should happen in the first place.

Frances Haugen’s evidence highlighted quite how shadowy a significant number of the platforms are. Does the hon. Member agree that that hammers home the need for independent researchers to access as much detail as possible so that we can ensure that the Bill is working?

I agree 100%. The testimony of Frances Haugen, the Facebook whistleblower, highlighted the fact that expert researchers and academics will need to examine the data and look at what is happening behind social media platforms if we are to ensure that the Bill is truly fit for purpose and world leading. That process should be carried out as quickly as possible, and Ofcom must also be encouraged to publish guidance on how access to data will work.

Ultimately, the amendments make a simple point: civil society and researchers should be able to access data, so why will the Minister not let them? The Bill should empower independently verified researchers and civil society to request tech companies’ data. Ofcom should be required to publish guidance as soon as possible—within months, not years—on how data may be accessed. That safety check would hold companies to account and make the internet a safer and less divisive space for everyone.

The process would not be hard or commercially ruinous, as the platforms claim. The EU has already implemented it through its Digital Services Act, which opens up the secrets of tech companies’ data to Governments, academia and civil society in order to protect internet users. If we do not have that data, researchers based in the EU will be ahead of those in the UK. Without more insight to enable policymaking, quality research and harm analysis, regulatory intervention in the UK will stagnate. What is more, without such data, we will not know Instagram’s true impact on teen mental health, nor the reality of violence against women and girls online or the risks to our national security.

We propose amending the Bill to accelerate data sharing provisions while mandating Ofcom to produce guidance on how civil society and researchers can access data, not just on whether they should. As I said, that should happen within months, not years. The provisions should be followed by a code of practice, as outlined in the amendment, to ensure that platforms do not duck and dive in their adherence to transparency requirements. A code of practice would help to standardise data sharing in a way that serves platforms and researchers.

The changes would mean that tech companies can no longer hide in the shadows. As Frances Haugen said of the platforms in her evidence a few weeks ago:

“The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 188, Q320.]

I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.

Researchers’ access to information is covered in clause 136, which the amendments seek to amend. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. When just talking about it casually, it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and what confidentiality obligations may apply to them.

This is all sorted in the health environment because of the personal data involved—there is no data more personal than health data—and a trusted and safe environment has been created for researchers to access personal data.

This data is a little different—the two domains do not directly correspond. In the health area, there has been litigation—an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules—so even in that long-established area, there is uncertainty and recent, or perhaps even current, litigation.

We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.

The Minister has committed to Ofcom being fully resourced to do what it needs to do under the Bill, but he has spoken about time constraints. If Ofcom were to receive 25,000 risk assessments, for example, there simply would not be enough people to go through them. Does he agree that, in cases in which Ofcom is struggling to manage the volume of data and to do the level of assessment required, it may be helpful to augment that work with the use of independent researchers? I am not asking him to commit to that, but to consider the benefits.

Yes, I would agree that bona fide academic independent researchers do have something to offer and to add in this area. The more we have highly intelligent, experienced and creative people looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. It is an area that does not exist already—at the moment, there is no concept of independent researchers getting access to the innards of social media companies’ data vaults—so we need to make sure that it is done in the right way, which is why it is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. I hope we end up in the same place—well, the same place, but a better place. The process may be slightly slower, but we may also end up in a better place for the consideration and thought that will have to be given.

I appreciate where the Minister is coming from. It seems that he wants to back the amendment, so I am struggling to see why he will not, especially given that the DSA—the EU’s new legislation—is already doing this. We know that the current wording in the Bill is far too woolly. If providers can get away with it, they will, which is why we need to compel them, so that we are able to access this data. We need to put that on the face of the Bill. I wish that we did not have to do so, but we all wish that we did not have to have this legislation in the first place. Unless we put it in the Bill, however, the social media platforms will carry on regardless, and the internet will not be a safe place for children and adults in the UK. That is why I will push amendment 53 to a vote.

Question put, That the amendment be made.

I beg to move amendment 56, in clause 111, page 94, line 24, at end insert—

“Section [Supply chain risk assessment duties]

Supply chain risk assessments”

This amendment is linked to NC11.

With this it will be convenient to discuss new clause 11—Supply chain risk assessment duties—

“(1) This section sets out duties to assess risks arising in a provider’s supply chain, which apply to all Part 3 services.

(2) A duty to carry out a suitable and sufficient assessment of the risk of harm arising to persons employed by contractors of the provider, where the role of such persons is to moderate content on the service.

(3) A duty to keep the risk assessment up to date.

(4) Where any change is proposed to any contract for the moderation of content on the service, a duty to carry out a further suitable and sufficient risk assessment.

(5) In this section, the ‘risk of harm’ includes any risks arising from—

(a) exposure to harmful content; and

(b) a lack of training, counselling or support.”

This new clause introduces a duty to assess the risk of harm in the supply chain.

We know that human content moderation is the foundation of all content moderation for major platforms. It is the most important resource for making platforms safe. Relying on AI alone is an ineffective and risky way to moderate content, so platforms have to rely on humans to make judgment calls about context and nuance. I pay tribute to all human moderators for keeping us all safe by having to look at some of the most horrendous and graphic content.

The content moderation reviews carried out by humans, often at impossible speeds, are used to classify content to train algorithms that are then used to automatically moderate exponentially more content. Human moderators can be, and often are, exploited by human resource processes that do not disclose the trauma inherent in the work or properly support them in their dangerous tasks. There is little oversight of this work, as it is done largely through a network of contracted companies that do not disclose their expectations for staff or the support and training provided to them. The contractors are “off book” from the platforms and operate at arm’s length from the services they are supporting, and they are hidden by a chain of unaccountable companies. This creates a hazardous supply chain for the safety processes that platforms claim will protect users in the UK and around the world.

Not all online abuse in the UK happens in English, and women of many cultures and backgrounds in the UK are subject to horrific abuse that is not in the English language. The amendment would make all victim groups in the UK much safer.

To make the internet safer, it is imperative to better support human content moderators and regulate the supply chain for their work. It is an obvious but overlooked point that content moderators are users of a platform, but they are also the most vulnerable group of users, as they are the front line of defence in sifting out harmful content. Their sole job is to watch gruesome, traumatising and harmful content so that we do not have to. The Bill has a duty to protect the most vulnerable users, but it cannot do so if their existence is not even acknowledged.

Many reports in the media have described the lack of clarity about, and the exploitative nature of, the hiring process. Just yesterday, I had the immense privilege of meeting Daniel Motaung, the Facebook whistleblower from Kenya who has described the graphic and horrendous content that he was required to watch to keep us all safe, including live beheadings and children being sexually exploited. Members of the Committee cannot even imagine what that man has had to endure, and I commend him for his bravery in speaking out and standing up for his rights. He has also been extremely exploited by Facebook and the third party company by which he was employed. He was paid the equivalent of $2 an hour for doing that work, whereas human moderators in the US were paid roughly $18 an hour—again, nowhere near enough for what they had to endure.

In one instance, a Meta content moderator working for a contractor was not informed during his interview that the job would require regular viewing of disturbing content that could lead to mental health problems. After he accepted the role, the contractor asked him to sign a non-disclosure agreement, and only then did they reveal to him the exact type of content that he would be working with daily. That moderator—similar to many moderators in the US, Ireland and other locations—was diagnosed with post-traumatic stress disorder due to his work.

One former counsellor for a content moderator contractor

“witnessed managers repeatedly rejecting content moderators’ requests for breaks, citing productivity pressures.”

They also reported that managers

“regularly rejected counsellors’ requests to let content moderators take ‘wellness breaks’ during the day, because of the impact it would have on productivity.”

Other moderators in the US were allocated just nine minutes a day of “wellness time”, which many needed to use to go to the bathroom. In some cases, the wellness coaches that the contractors provide do not have any clinical psychological counselling credentials, and would recommend “karaoke or painting” after shifts of watching suicides and other traumatic content.

Oversight is required to ensure that human resources processes clearly identify the role and provide content descriptions, as well as information on possible occupational hazards. Currently, the conditions of the work are unregulated and rely on the business relationship between two parties focused on the bottom line. Platforms do not release any due diligence on the employment conditions of those contractors, if they conduct it at all. If there is to be any meaningful oversight of the risks inherent in the content moderation supply chain, it is imperative to mandate transparency around the conditions for content moderators in contracted entities. As long as that relationship is self-regulated, the wellness of human moderators will be at risk. That is why we urge the Minister to support this important amendment and new clause: there is a human element to all this. We urge him to do the right thing.

I thank the hon. Member for Pontypridd for laying out her case in some detail, though nowhere near the level of detail that these people have to experience while providing moderation. She has given a very good explanation of why she is asking for the amendment and new clause to be included in the Bill. Concerns are consistently being raised, particularly by the Labour party, about the impact on the staff members who have to deal with this content. I do not think the significance of this issue for those individuals can be overstated. If we intend the Bill to have the maximum potential impact and reduce harm to the highest number of people possible, it makes eminent sense to accept this amendment and new clause.

There is a comparison with other areas in which we place similar requirements on other companies. The Government require companies that provide annual reports to undertake an assessment in those reports of whether their supply chain uses child labour or unpaid labour, or whether their factories are safe for people to work in—if they are making clothes, for example. It would not be an overly onerous request if we were to widen those requirements to take account of the fact that so many of these social media companies are subjecting individuals to trauma that results in them experiencing PTSD and having to go through a lengthy recovery process, if they ever recover. We have comparable legislation, and that is not too much for us to ask. Unpaid labour, or people being paid very little in other countries, is not that different from what social media companies are requiring of their moderators, particularly those working outside the UK and the US in countries where there are less stringent rules on working conditions. I cannot see a reason for the Minister to reject the provision of this additional safety for employees who are doing an incredibly important job that we need them to be doing, in circumstances where their employer is not taking any account of their wellbeing.

As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.

I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.

In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:

“You do not have the figures, so you cannot tell me.”

Richard Earley replied:

“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”

Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.

The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:

“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]

But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.

There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

Platforms, in particular those supporting user-to-user generated content, employ those services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”

One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is not within the ambit of the regulator because it is part of a supply chain presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY, supported by an entirely separate company, would argue that it was providing a business-to-business service that is not user-generated content but content designed and delivered at arm’s length and provided to the user-to-user service to deploy for its users.

I suggest that dealing with this issue would involve a timely, costly and unhelpful legal process during which systems were not being effectively regulated—the same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel’s case—and complex contract law was invoked.

We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for or on the company’s behalf is found culpable for specific actions. These issues on supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.

Thank you. Clause 111 sets out and defines the “enforceable requirements” in this chapter—the duties that Ofcom is able to enforce against. Those are set out clearly in the table at subsection (2) and the requirements listed in subsection (3).

The amendment speaks to a different topic. It seeks to impose or police standards for people employed as subcontractors of the various companies that are in scope of the Bill, for example people that Facebook contracts; the shadow Minister, the hon. Member for Pontypridd, gave the example of the gentleman from Kenya she met yesterday. I understand the point she makes and I accept that there are people in those supply chains who are not well treated, who suffer PTSD and who have to do extraordinarily difficult tasks. I do not dispute at all the problems she has referenced. However, the Government do not feel that the Bill is the right place to address those issues, for a couple of reasons.

First, in relation to people who are employed in the UK, we have existing UK employment and health and safety laws. We do not want to duplicate or cut across those. I realise that they relate only to people employed in the UK, but if we passed the amendment as drafted, it would apply to people in the UK as much as it would apply to people in Kenya.

Secondly, the amendment would effectively require Ofcom to start paying regard to employment conditions in Kenya, among other places—indeed, potentially any country in the world—and it is fair to say that that sits substantially outside Ofcom’s area of expertise as a telecoms and communications regulator. That is the second reason why the amendment is problematic.

The third reason is more one of principle. The purpose of the Bill is to keep users safe online. While I understand the reasonable premise for the amendment, it seeks essentially to regulate working conditions in potentially any country in the world. I am just not sure that it is appropriate for an online safety Bill to seek to regulate global working conditions. Facebook, a US company, was referenced, but only 10% of its activity—very roughly speaking—is in the UK. The shadow Minister gave the example of Kenyan subcontractors. Compelling though her case was, I am not sure it is appropriate that UK legislation on online safety should seek to regulate the Kenyan subcontractor of a United States company.

The Government of Kenya can set their own employment regulations and President Biden’s Government can impose obligations on American companies. For us, via a UK online safety Bill, to seek to regulate working conditions in Kenya goes a long way beyond the bounds of what we are trying to do, particularly when we take into account that Ofcom is a telecommunications and communications regulator. To expect it to regulate working conditions anywhere in the world is asking quite a lot.

I accept that a real issue is being raised. There is definitely a problem, and the shadow Minister and the hon. Member for Aberdeen North are right to raise it, but for the three principal reasons that I set out, I suggest that the Bill is not the place to address these important issues.

The Minister mentions workers in the UK. I am a proud member of the Labour party and a proud trade unionist; we have strong protections for workers in the UK. There is a reason why Facebook and some of these other platforms, which are incredibly exploitative, will not have human moderators in the UK looking at this content: they know they would be compelled to treat them a hell of a lot better than they treat the workers around the world that they are exploiting, as they do in Kenya, Dublin and the US.

To me, the amendment speaks to the heart of the Bill. This is an online safety Bill that aims to keep the most vulnerable users safe online. People around the world are looking at content that is created here in the UK and having to moderate it; we are effectively shipping our trash to other countries and other people to deal with it. That is not acceptable. We have the opportunity here to keep everybody safe from looking at this incredibly harmful content. We have a duty to protect those who are looking at content created in the UK in order to keep us safe. We cannot let those people down. The amendment and new clause 11 give us the opportunity to do that. We want to make the Bill world leading. We want the UK to stand up for those people. I urge the Minister to do the right thing and back the amendment.

The Minister has not commented on the problem I raised of the contracted firm in the supply chain not being covered by the regulations under the Bill—the problem of Twitter and the GIFs, whereby the GIFs exist and are used on Twitter, but Twitter says, “We’re not responsible for them; it’s that firm over there.” That is the same thing, and new clause 11 would cover both.

I am answering slightly off the cuff, but I think the point the hon. Lady is raising—about where some potentially offensive or illegal content is produced on one service and then propagated or made available by another—is one we debated a few days ago. I think the hon. Member for Aberdeen North raised that question, last week or possibly the week before. I cannot immediately turn to the relevant clause—it will be in Hansard, in our early discussions about the beginning of the Bill—but I think the Bill makes it clear that where content is accessed through another platform, which is the example that the hon. Member for Worsley and Eccles South just gave, the platform through which the content is made available is within the scope of the Bill.

Question put, That the amendment be made.

Clause 111 ordered to stand part of the Bill.

Clause 112

Confirmation decisions

Question proposed, That the clause stand part of the Bill.

We support clause 112, which gives Ofcom the power to issue a confirmation decision if, having followed the required process—for example, in clause 110—its final decision is that a regulated service has breached an enforceable requirement. As we know, this will set out Ofcom’s final decision and explain whether Ofcom requires the recipient of the notice to take any specific steps and/or pay a financial penalty. Labour believes that this level of scrutiny and accountability is vital to an Online Safety Bill that is truly fit for purpose, and we support clause 112 in its entirety.

We also support the principles of clause 113, which outlines the steps that a person may be required to take either to come into compliance or to remedy the breach that has been committed. Subsection (5) in particular is vital, as it outlines how Ofcom can require immediate action when the breach has involved an information duty. We hope this will be a positive step forward in ensuring true accountability of big tech companies, so we are happy to support the clause unamended.

It is right and proper that Ofcom has powers when a regulated provider has failed to carry out an illegal content or children’s risk assessment properly or at all, and when it has identified a risk of serious harm that the regulated provider is not effectively mitigating or managing. As we have repeatedly heard, risk assessments are the very backbone of the Bill, so it is right and proper that Ofcom is able to force a company to take measures to comply in the event of previously failing to act.

Children’s access assessments, which are covered by clause 115, are a crucial component of the Bill. Where Ofcom finds that a regulated provider has failed to properly carry out an assessment, it is vital that it has the power and legislative standing to force the company to do more. We also appreciate the inclusion of a three-month timeframe, which would ensure that, in the event of a provider re-doing the assessment, it would at least be completed within a specific—and small—timeframe.

While we recognise that the use of proactive technologies may come with small issues, Labour ultimately feels that clause 116 is balanced and fair, as it establishes that Ofcom may require the use of proactive technology only on content that is communicated publicly. It is fair that content in the public domain is subject to those important safety checks. It is also right that under subsection (7), Ofcom may set a requirement forcing services to review the kind of technology being used. That is a welcome step that will ensure that platforms face a level of scrutiny that has certainly been missing so far.

Labour welcomes and is pleased to support clause 117, which allows Ofcom to impose financial penalties in its confirmation decision. That is something that Labour has long called for, as we believe that financial penalties of this nature will go some way towards improving best practice in the online space and deterring bad actors more widely.

The shadow Minister has set out the provisions in the clauses, and I am grateful for her support. In essence, clauses 112 to 117 set out the processes around confirmation decisions and make provisions to ensure that those are effective and can be operated in a reasonable and fair way. The clauses speak largely for themselves, so I am not sure that I have anything substantive to add.

Question put and agreed to.

Clause 112 accordingly ordered to stand part of the Bill.

Clauses 113 to 117 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Dean Russell.)

Adjourned till Tuesday 21 June at twenty-five minutes past Nine o’clock.

Written evidence reported to the House

OSB77 Twitter

Public Order Bill (Fifth sitting)

The Committee consisted of the following Members:

Chairs: Peter Dowd, †David Mundell

† Anderson, Lee (Ashfield) (Con)

† Bridgen, Andrew (North West Leicestershire) (Con)

† Chamberlain, Wendy (North East Fife) (LD)

† Cunningham, Alex (Stockton North) (Lab)

† Doyle-Price, Jackie (Thurrock) (Con)

† Elmore, Chris (Ogmore) (Lab)

† Elphicke, Mrs Natalie (Dover) (Con)

† Hunt, Tom (Ipswich) (Con)

† Huq, Dr Rupa (Ealing Central and Acton) (Lab)

† Jones, Sarah (Croydon Central) (Lab)

† Longhi, Marco (Dudley North) (Con)

† McCarthy, Kerry (Bristol East) (Lab)

McLaughlin, Anne (Glasgow North East) (SNP)

† Malthouse, Kit (Minister for Crime and Policing)

† Mann, Scott (North Cornwall) (Con)

† Mohindra, Mr Gagan (South West Hertfordshire) (Con)

† Vickers, Matt (Stockton South) (Con)

Anne-Marie Griffiths, Sarah Thatcher, Committee Clerks

† attended the Committee

Public Bill Committee

Thursday 16 June 2022


[David Mundell in the Chair]

Public Order Bill

Order. Before we begin, I have a few preliminary reminders for the Committee. Please switch electronic devices to silent. No food or drink is permitted during the sittings of this Committee, except for the water provided. Hansard colleagues would be grateful if Members could email their speaking notes to Hansard or, alternatively, pass on their written speaking notes to the Hansard colleague in the room.

Clause 6

Powers to stop and search on suspicion

I beg to move amendment 25, in clause 6, page 8, line 23, at end insert—

“(ha) an offence under section (Offence of causing serious disruption by tunnelling) of that Act (offence of causing serious disruption by tunnelling);

(hb) an offence under section (Offence of causing serious disruption by being present in a tunnel) of that Act (offence of causing serious disruption by being present in a tunnel)”.

This amendment applies the stop-and-search powers in section 1 of the Police and Criminal Evidence Act 1984 to an offence relating to tunnelling under the new clause inserted by NC5 or NC6.

With this it will be convenient to discuss the following:

Government amendment 26.

Government new clause 5— Offence of causing serious disruption by tunnelling.

Government new clause 6— Offence of causing serious disruption by being present in a tunnel.

It is a great pleasure to serve under your wise guidance, Mr Mundell, for our contemplation of this legislation today. The amendments make it clear that the protest tactic of building tunnels in order to disrupt legitimate activity while endangering the protesters themselves and the police and emergency services who respond will not be tolerated. The Committee heard last week how HS2 had been targeted on multiple occasions by people building tunnels that have caused enormous cost to the project, with three removal operations alone costing in excess of £10 million.

Even more recently, we have seen protesters from Just Stop Oil engaging in this dangerous and reckless activity at sites in Essex and Warwickshire. Aside from the costs, however, it is the risk of a fatality at one of the sites that concerns us most. Whatever hon. Members think about the merits of a particular cause and the right to protest, we can all agree that such an utterly reckless practice must not be allowed to continue.

Although the individuals may be willing to put themselves at risk, it is not acceptable that they endanger those who are called upon to remove them and repair the damage inflicted. The tunnels are often structurally unsound and poorly ventilated. In addition, the protesters resist removal, increasing the risks for those we ask to enforce the law. While removing protesters from the Euston Square tunnel, for example, HS2 reported that a protester removed part of the shoring, causing a tunnel to collapse on a contractor.

New clause 5 therefore creates a new offence of creating a tunnel, which will be committed when an individual causes serious disruption by creating a tunnel. Their action must cause, or be capable of causing, serious disruption to an organisation or two or more individuals—as we have seen in earlier clauses in the Bill—and the person must intend the tunnel to have a consequence or be reckless as to the consequence. To deter a committed cohort of protest tunnellers, the clause enables a maximum sentence of three years’ imprisonment and/or a fine. The clause also includes a reasonable excuse exemption, as have previous clauses.

New clause 6 is designed to cover those who occupy a tunnel as well as those who constructed it in the first place. They will be liable to a similar penalty of up to three years’ imprisonment and/or a fine. The threshold of serious disruption for this offence will be the same as in new clause 5. For both clauses, the tunnel has been defined as any excavation, whether or not it leads to a destination, that is large enough to permit the passage of an individual. We have also included in scope any extension or enlargement of existing natural or artificial excavations. The breadth of the definition will ensure that all stages of this dangerous tactic will be captured.

Government amendments 25 and 26 extend the Bill’s suspicion-based and suspicion-less stop and search powers to include equipment that may be used for creating or being present in a tunnel. It is clear that the police need powers to tackle tunnels proactively before they occur. Those two amendments, alongside new clause 7, which we will debate later, will allow the police to take the necessary preventive action against those they believe may be intending to tunnel, protecting the public from serious disruption.

Finally, the level of sentences for these new offences reflects the level of harm that tunnelling can cause. Not only do they cause significant disruption and cost millions of pounds to clean up, as we heard, but they place protesters and, critically, emergency workers at extraordinary risk of serious injury or death. We therefore think it is completely proportionate that the maximum sentences for these offences are as high as I have set out, for the reasons that I have set out.

It is a pleasure to serve under your chairmanship again today, Mr Mundell.

We move on this morning to powers on stop and search. In this group, the Government are making changes, including to clause 6, through two amendments and two new clauses that deal with tunnelling, which follows the evidence we heard from HS2 about problems that were seen at its sites. It is interesting to note in the news today that an absolutely stunning Anglo-Saxon burial site has just been discovered on the HS2 route—140 people were buried with an amazing array of items. That is tangential, but interesting.

No, we cannot, as the Minister says. Government amendments 25 and 26 apply the stop-and-search powers of clauses 6 and 7 to the new offences related to tunnelling that are included in Government new clauses 5 and 6. These amendments will make it a criminal offence to cause serious disruption by creating and occupying tunnels; going equipped to create tunnels will also be criminalised. The changes include the proposed new maximum sentence, as the Minister said, of three years’ imprisonment and an unlimited fine.

I think we can all agree again today that the digging of these tunnels is incredibly disruptive and dangerous, and obviously hugely costly. As the Government’s note says, they are filled with lethal levels of carbon monoxide and carbon dioxide and the tunnels can become death traps, not just for those inside them and members of the public but for those who are required to undertake rescue operations.

HS2’s written evidence gives a clear picture of the danger and disruption, including:

“delay costs, policing, local authority costs, or the additional security costs to maintain a safe and secure compound once protestors have been removed. For a typical tunnel removal operation, HS2 Ltd employs specialists in soil composition, mine rescue, drone operation, health and safety, and paramedics. Protestors are either unaware of the danger of the situation they put themselves in, or have absolute faith in HS2 Ltd’s ability to extract them safely. The risk of a fatality occurring during a tunnelling protest is significant.

Protestors rely on HS2 Ltd’s contractors to monitor air quality, supply air and to remove human waste from the tunnels…During the Euston eviction operation, a protestor removed shoring that caused a tunnel to collapse on a rescue contractor. Whilst the latter incident caused only minor injury, the ongoing threat to the lives of HS2’s staff and protestors is clearly in evidence.

Air quality is often poor inside make-shift tunnels and sometimes…deadly. Deadly levels of carbon monoxide and dioxide were found in tunnels at Small Dean, for example, and the removal team had to provide an air supply to avoid the occupants being overcome and experiencing breathing difficulties. The provision of a constant air supply is not always possible as some ground conditions mean that there is a risk of further instability and risk of collapse being created if the soil is dried out by the provision of air. Tunnels can be extremely deep and are often inadequately shored creating a very real risk of collapse”.

Nobody has the right to put other people’s lives in danger with this kind of dangerous act. As we heard, the removal operation following tunnelling by protesters at Small Dean in Buckinghamshire in 2021 added more than £4 million to the cost of HS2.

The act of digging a tunnel by a group such as Just Stop Oil or those at HS2 in Euston is already a criminal act—we have had this conversation already. Like most of the offences introduced in this Bill, tunnelling is already covered by existing offences. Aggravated trespass with a prison sentence of three months and criminal damage with a prison sentence of up to 10 years could both apply here.

The hon. Lady has raised the issue of the aggravated trespass offence on a number of occasions as a charge that can be used, so I asked my team to look at why aggravated trespass is not necessarily ideal. What we have found is that in a number of situations, not least with HS2, defendants charged with aggravated trespass claim in court that they are disrupting unlawful activity. That shifts the burden of proof on to, in this case, HS2 to prove that what it was doing was lawful. For example, at the Euston Square Gardens tunnel, aggravated trespass was used, and HS2 was required to present to the court what work was being carried out on the land at the time the protesters were in the tunnel and show it was lawful. The case was dismissed by the judge on the grounds that no construction was being carried out on the land at the time. This failed to recognise that HS2 could not start substantive work on the land because protesters were in the tunnel. This specific offence will cover that.

I am sure the hon. Lady also recognises that a tunnel may cross between different ownerships of land and between public and private land. That legal complexity causes a problem. While I understand that she is cleaving to aggravated trespass in many of her oppositions to these clauses, actually, this issue of the protesters being able to reverse the burden of proof is hugely problematic. That is what we are seeking to address.

I thank the Minister for that substantial intervention. I would answer with the words of the police themselves on that very point. The National Police Chiefs’ Council lead in this area said of the Government’s plans to make it an offence to cause serious disruption by tunnelling—or be present in a tunnel or equipped for tunnelling—that:

“Whilst forces have experienced tunnelling in recent operations, we do not believe that a specific offence around tunnelling will add anything above and beyond our current available powers.”

I think that is really significant. The police have not asked for this offence, and they do not believe it is necessary at all. They believe the existing powers they have are enough to deal with these protests. This is a point we keep coming back to. We have talked through this. I will not read it out again, but I was looking for my list of all the other offences people can be charged with in different circumstances. The police have a raft of powers and say themselves that in this case they do not need these powers. They have broad catch-all ones such as breach of the peace and very specific ones with options for long custodial sentences to deal with and manage protests that are disruptive. Two key issues come up time and again with these new offences: they are either going to be difficult for the police to put into practice, or they will make no difference to the time it takes to deal with the disruption.

Sorry, I should have been clear in what I said earlier. I heard the evidence given by the National Police Chiefs’ Council lead. The problem is not necessarily the police’s ability to remove and charge those individuals. The problems, as I outlined in the example I used, come in the courts. The current suite of offences being used gives wriggle room for protesters to make this claim and reverse the burden of proof. I am sure the hon. Lady will agree that what happened at Euston Square was very dangerous, and I hope she agrees that an offence was committed, but at Euston Square they were able to avoid punishment for what they did by using this technicality.

I will say two things. First, there is a raft of powers, not least injunctions. HS2 has used injunctions successfully and is currently applying for this whole-route injunction. We will see what comes of that. The second point is an interesting one that we can debate further another time. It is that the courts take different views according to what people are protesting and where. They are more sympathetic to people who are protesting the thing they are against than they are when people are disrupting the public more widely. That is why they have sent people to prison for blocking motorways and have taken a different view on things like the Colston statue.

There is an interesting point about how the courts interpret these things, but I think all these issues come into play when looking at this. We do not believe it is going to make any difference to the time it takes to deal with the disruption, which is important, because that is a core part of the problem itself. Sadly, we do not think it will make the protest removal teams safer when trying to get protesters out. We do not think it will be a deterrent to those repeat offenders we have talked a lot about or that it will speed up the complex and time-consuming removal process.

I speak with some experience on the matter because I was a tunneller; I worked underground in coalmines in Nottinghamshire and Derbyshire for many years. It is a dangerous, dirty and horrible life-risking job, so I would welcome any measure that acts as a deterrent—it is a drastic measure. Does the hon. Member not agree that we should be doing everything in our power to stop these people doing this?

I agree with the hon. Gentleman’s frustration, but I listen to the police when we look at what they need. They are saying that this will not help them. I would listen to them, and I would look at the existing powers. I want to read some more of the written evidence from the National Police Chiefs’ Council lead on public order and public safety, who states:

“A specific offence would likely not change how these are operationally handled as whatever the offence the practical safety considerations of dealing with people in tunnels would remain. There is current legislation, such as that contained in the Criminal Damage Act 1971, that creates offences of damaging property and having articles to damage property. With the associated powers of search these allow the Police to find articles or equipment intended to cause damage. An additional significant concern is that any specific offence relating to tunnelling would apply to private land. This again could place a significant responsibility on policing. We ask that if considered that this offence is restricted to public places.”

That was the NPCC highlighting a few concerns it has with the plans.

Clause 6 and new clause 5 seem to apply to tunnelling everywhere except

“to the extent that it is in or under a dwelling”,

so any offence to do with tunnelling applies to private land, unless it is under a dwelling—essentially, a place where people live. Take the example of protests taking place against a particular farmer for growing a crop in a private field that protesters oppose, or for another matter. If the protesters tunnel under the private field, which could cause disruption and is annoying for the farmer, but does not destroy the crops, what should happen? There are some complications in terms of the police concerns, which we need to bring to light here.

Chris Noble said in his oral evidence:

“this probably goes to the core of one of the key issues that police are keen to discuss within the Committee—the vast majority of that work is done by the landowners and private companies that are skilled and experienced within this work. While I have some dedicated resources allocated to that at present, if that responsibility was to significantly shift to policing, it would cost me… in the region of £80,000 a day to resource that. It would need significant officer resources, which clearly would need to come from elsewhere”.

That is crucial.

He said:

“The key… is not so much even, necessarily, an offence around tunnelling, because we may well have powers that, broadly speaking, exist to deal with it—we are keen to develop that conversation. The challenge is in preventing it in the first place, and then in how we can work with industry and landowners”

so we can

“potentially remove individuals more quickly.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 12, Q14.]

The challenge is how to prevent tunnelling. The new powers replicate powers the police already have, and we agree with the NPCC on a lot of their concerns.

The NPCC also raised concerns about the responsibility that the new offences will place on police. The Bill has drawn out a bit of conflict between the police and private companies, which is interesting. John Groves from HS2 said:

“Certainly, there is frustration from my team on the ground that the police are not more direct with some of the protesters”.––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 23, Q43.]

Then we have the police asking the Government to consider restricting this offence to public places. Surely the intention of Government legislation like this is to make the lives of the police and private companies building infrastructure easier. It is perhaps problematic when complications are raised on both sides. We need to be mindful of the position that this may put the police in, blurring the lines between public and private as we understand them. Policing of protests is called public order policing for a reason: it is usually about protests happening on public land.

I understand the argument that the hon. Lady is making, but I think we have accepted the principle that what these people are doing is not protesting. They are effectively committing a crime, and it is a well-established principle that regardless of whether a crime—for example, a burglary—is committed on public or private land, the police will apprehend, prosecute and investigate. Unless the hon. Lady is saying that tunnelling is a legitimate protest—notwithstanding the dangerous things that we have all talked about, and the cost—I do not understand her argument. Secondly, it is worth bearing in mind that regardless of whether the cost falls on HS2 or the police, it is falling on the taxpayer.

The point I was trying to make was to echo the concerns that the police have expressed about the expectation on them to go and do things on private land, the cost associated with that, and the need to deal with that issue. To reiterate, they have said that they think there are already suitable powers for them to stop people when they are committing a criminal act, which we agree tunnelling is. They have said they do not need this extra power. There is also criminal damage, which carries a sentence of up to 10 years in prison, so there are different forms of offences that we can look to.

With regard to the new powers, there is also the issue of training. According to the Police Foundation, over the seven years up to 2017-18, 33 forces reduced their budgeted spending on training in real terms by a greater percentage than their overall reduction in spending. Some 40% of police officers say they did not receive the necessary training to do their job, so I am concerned that many things in the Bill, particularly the new clauses, need to go along with properly resourced training to make sure that people understand and know what the new powers are. We have talked about the complexities of introducing new laws and expecting the police to understand them all many times before, not least with all the covid legislation.

I thank my hon. Friend for mentioning that, because it is something that has been bothering me. As I have said before, I was with the police in the operation centre when they were looking at protests in Bristol. Part of the briefing before protests involves telling the police what offences might be committed, what to look for and so on. We have a plethora of offences, and they have to make judgments on whether something is a serious disruption. The more complex it is, the more difficult it will be for the police to know what they are supposed to do when they are out on the streets in a very difficult situation.

I thank my hon. Friend for that perfect point. This is the challenge that policing has, and we have seen it with the recruitment of new officers as well. We need to make sure that everybody has the right training and understands the legal routes that they can use, and piling new and complex legislation on top of what we think is satisfactory legislation is problematic.

Having listened carefully to the hon. Lady, I have become more concerned about the complexity of the current situation that the police find themselves in. Is tunnelling okay if it is under a field because someone does not like genetically modified crops? What if the tunnelling is to do with something that will happen in the future, such as HS2? It seems to me that the Bill is a very clear piece of legislation that will address the public order issues that exist today. We will know that tunnelling is criminal, and it will be stopped under the Bill. I, too, have been in control rooms dealing with public order issues down in Dover, and it will make the police’s job easier to have the kind of clarity that the Bill will bring.

I refer back to the fact that the police themselves do not share the hon. Lady’s view. In this case, what they are saying is perfectly sensible. I do not think anybody is saying that we want people to be tunnelling in dangerous situations and putting people’s lives at risk; nobody wants that. Everybody agrees that there should be criminal sanctions. That is not the point.

Moving to deterrents and whether this measure would act as one, companies like HS2 hope that it will. It said many times in evidence that it was not an expert on the legal side, but that it hoped the measures would be a deterrent. HS2’s written evidence refers to how it is pursuing the route-wide civil injunction. It reads:

“Whilst, if granted, it is hoped that the route-wide injunction will significantly reduce disruption to the project caused by trespass and obstruction of access, it is unlikely to eliminate the problem.”

HS2 also writes that civil injunctions

“serve as a relatively effective deterrent to unlawful (in the civil legal sense) activity by some groups of protestors”.

We will talk about injunctions later, but as HS2 says, it is a relatively effective deterrent—if not also expensive.

The Government will take ages to implement more offences. My hon. Friend the Member for Stockton North made a speech on Tuesday about the court backlog. If we are adding new and complex criminal offences, maybe we need to sort the court backlog and the record 708 days it takes on average from offence to completion of a case. That is an extraordinarily long period of time. The longest delay from offence to completion was in Bournemouth, which recorded waits of 23 months in 2021.

I will conclude my remarks at this point by reiterating that we think tunnelling is very dangerous and that it is a difficult issue. There are existing laws in place, and we do not think that these measures are the answer. Therefore, we are not entirely convinced by the Government’s arguments today.

Amendment 25 agreed to.

Question proposed, That the clause, as amended, stand part of the Bill.

It is clear that police need the powers to proactively prevent criminal protest activity before it occurs. The hon. Lady has put great store by the evidence of the National Police Chiefs’ Council. She will recall it specifically saying that the ability to stop and search people in and around protests would be helpful, and in its report on the policing of protests, Her Majesty’s Inspectorate of Constabulary and Fire and Rescue Services argued that stop-and-search powers would improve the police’s ability to prevent serious disruption.

Clause 6 extends existing suspicion-led stop-and-search powers to a range of protest-related offences. Police officers will have the power to stop and search anyone they reasonably suspect is carrying items that could be used for locking on, obstruction of major transport works, interference with key infrastructure, public nuisance, obstruction of the highway or the new offences of tunnelling and being present in a tunnel, which have been tabled as Government amendments to the Bill. Existing safeguards, including statutory codes of practice, body-worn video to increase accountability and extensive data collection will continue to apply to ensure that the police use stop and search in an effective and proportionate manner.

While I understand the concerns that have been shared about the expansion of stop and search widely in society, it is clear that these powers are required to allow the police to take the necessary action to prevent the small minority of determined protesters causing serious disruption. I commend the clause to the Committee.

Clause 6 amends section 1 of the Police and Criminal Evidence Act 1984—PACE, as we call it—to allow a constable to stop and search a person or vehicle if they have reasonable grounds for suspecting that they will find an article made, adapted or intended for use in the course of or in connection with a range of offences listed in the Bill. The exercise of stop-and-search powers under section 1 of PACE is subject to PACE code of practice A, which will be updated to reflect the extension of the section 1 powers. This gives the police wide-ranging powers to stop and search anyone in the vicinity of a protest, such as shoppers passing a protest against a library closure. In the words of Liberty:

“This amendment constitutes a mass expansion of police powers through the creation of protest-specific stop and search. This is in spite of the fact that there is no consensus among the police that protest-specific stop and search is necessary or desirable.”

It is worth being clear about what stop and search is used for now. It is so intrusive because it is used for very serious offences. Police stop and search for drugs, weapons, knives and guns. We know that it can be a useful tool and has the potential to stop murder, serious violence and acts of terror. While we do not disagree with the premise of stop and search and recognise that it can be very helpful—I am sure we have all had conversations with both police and communities who talk of its benefits—the clauses in the Bill are a big expansion of powers.

My hon. Friend will recollect that when she and I worked on the Police, Crime, Sentencing and Courts Act 2022, many issues were raised about the disproportionate effect that that legislation would have on young black people. The same applies here. What comments would she make about how, yet again, we will see a disproportionate effect on people of ethnic minorities?

As always, my hon. Friend makes a good point. I will come on to talk about that in my later remarks.

Lord Kennedy, in the Lords, said:

“the Government are mirroring laws that currently exist for serious violence and knife crime.”

He went on to say that

“these measures apply to peaceful protesters, not people carrying knives or causing violence.”—[Official Report, House of Lords, 24 November 2021; Vol. 816, c. 992-993.]

Matt Parr, Her Majesty’s inspector, said that current suspicionless stop and search powers

“are intended to be used by the police to combat serious violence and the carriage of ‘dangerous instruments or offensive weapons’. Using a similar suspicion-less power to target peaceful protesters, who may cause serious (but non-violent) disruption, is a significantly different proposition. Given the potential ‘chilling effect’ on freedom of assembly and expression in terms of discouraging people from attending protests where they may be stopped and searched, we would expect any new suspicion-less powers to be subject to very careful scrutiny by the courts.”

In the same document, it was said that

“police officers highlighted operational difficulties in the targeted use of the power. Others were also concerned over the proportionality of any search as well as the potentially intrusive nature when looking for small items.

One officer reflected that the proposal had ‘complications’ – for instance, whether an otherwise innocuous item was really intended to be used to lock-on. He said that having a tube of superglue in your pocket, or chain and padlock that you intend to use to lock your bike, ‘doesn’t prove intent and presents difficulties’.”

Concern about that has been expressed in Bristol. There are a lot of cyclists in Bristol and many who would be carrying bike locks around with them. College Green is the area where people tend to congregate if there is going to be a march or a protest. However, there would be an awful lot of people in that area who might well be carrying things that, if the police wanted to be difficult, might put them under suspicion. Does my hon. Friend share my concern? [Interruption.] I do not quite know how it works if I am intervening. I am intervening on my shadow Minister, not the Minister.

The Minister will have the opportunity to have his say at the end of this discussion.

That is absolutely right, and it is one of our issues with the Bill in general and this clause in particular. The powers are being made so broad that it makes it difficult for the police to interpret them in a meaningful way. If somebody is searching for a knife, drugs or a gun, they know if they have found it. It is a criminal offence there and then. It gets more complicated when stop and search is extended to somebody who may or may not be peacefully attending a protest but who still could be stopped under the new powers.

Surely if someone were using their bicycle to travel to a protest, when they got to the protest they would have already got off their bicycle and used the chain to secure it in place. They would therefore arrive at the protest without the cycle lock.

They might be pushing their bicycle through the centre of the protest and their bicycle lock would be on their bicycle. That would be covered under the Bill. The lunacy of that is in the legislation, not our interpretation of it. It is a fact.

Does the hon. Lady really believe that our police are that daft that they would arrest somebody for carrying a lock when they are on their push bike going to a protest or wherever else? Does she really believe that?

I do not believe that our police are daft at all. I am a big champion of our police and a supporter of everything that they are trying to do. The point is that if someone goes to a protest and is carrying an item such as a bike lock, they could be stopped by the police, and that will have a chilling effect on protesters—not on the protesters we have been talking about who are about to lock on, who glue their hands to things and do need to be arrested and charged for the disruption that they cause, but on anybody else who wants to attend a peaceful protest. We are slipping from a society in which peaceful protest is a right and something that we encourage to one in which we want everybody to think twice before they go on a protest. I do not think we want to be that kind of country.

To give one example, a few years ago there was a protest in Bristol that involved people blocking the road by sitting and laying their bicycles down in it. That would potentially mean that they would have bike locks on them and could be subject to stop and search, would it not?

My hon. Friend is right. I urge colleagues to read the powers in clause 6. They are very clear and broad.

When Her Majesty’s inspectorate of constabulary and fire and rescue services consulted police on the Home Office’s proposal for a new stop-and-search power, one officer said that

“a little inconvenience is more acceptable than a police state.”

That was a police officer speaking. HMICFRS went on to state that it agreed with that sentiment.

As I have said already, stop and search is a useful tool. It is important in preventing crime. But it is an invasive power and can be counterproductive and undermine the legitimacy of and trust in policing if it is not used correctly. Rightly, it is designed to be used to prevent the most serious crime—knife crime or drug dealing—and the police themselves have recognised serious concerns about disproportionality, and that those who are black are much more likely to be stopped and searched than those who are white.

A lot of the suggestions coming from the shadow Minister seem to be predicated on the basis that the police do not know what they are doing and that they are completely devoid of any sort of common sense. We all have to acknowledge that no one is perfect. The police will not be perfect, the law cannot be perfect and we are certainly not perfect. We are trying to give the police the widest possible tools that they can have to prevent the public from being disrupted to the extent we have seen so far. It is about the application of common sense and it seems to me that everything that is coming from the Opposition is about trying to stop that happening and effectively sending out a message that they are not on the side of ordinary citizens.

I completely disagree. I am absolutely on the side of ordinary citizens, and the evidence I am referring to comes from the police, not direct from me. I am quoting police officers who took part in the consultation back when Matt Parr did his report, and I am raising organisations’ concerns. The police have talked about the disproportionate nature of stop and search; this is not me speaking, but them. Let me quote the recent Independent Office for Police Conduct report on the matter:

“Stop and search is a legitimate policing tactic…The powers have been described as an important tool in dealing with knife crime and drugs, in particular. However, its disproportionate use against people from a Black, Asian, or other minority ethnic background, particularly young Black men, has been a concern for many years and it remains one of the most contentious policing powers.”

Unlike when the Minister was in the Mayor’s office—stop and search went down in every year for which the Prime Minister was Mayor of London—we are debating this against the backdrop of a significant increase in the use of stop and search. In the year ending March 2021, the use of stop and search increased by 24%.

For the sake of accuracy, when I was Deputy Mayor for policing, stop and search increased. The hon. Lady is quite right that it decreased in the second half of the Mayor’s eight-year term. By then, we had got on top of the number of knife crime murders that were happening across London, not least in her constituency—although she was not the Member of Parliament then.

I want to address the issue of disproportionality. No one would deny that when stop and search is used for violence, there is disproportionality, particularly in London although not uniformly across the country. However, we are talking about stop and search in protest situations. For those numbers to show up in stop and search relies on the population in the vicinity of a protest being disproportionately reflected demographically. I worry that in their desire to undermine the policy, the Opposition are conflating the two. There is no reason why people showing up to an Extinction Rebellion protest should be stopped and searched disproportionately compared with their demographic background, unless half the people who show up to the protest happen to be from a minority background. We would hope that the stop and search numbers would reflect the population coming to the protest.

The Opposition seem to think that the country is filled with police officers just waiting for their moment to stop and search us, or just looking for an opportunity to be difficult. The hon. Member for Bristol East spoke about the police wanting to be difficult, as if they ever want to be difficult. That indicates a lack of trust in the ability of our police to exercise, as my hon. Friend the Member for Dudley North said, exactly the kind of discretion that we ask them to use every day on the streets, whether in a protest environment or not. I know that the hon. Member for North East Fife has great experience of the fact that we rely on our police officers to use their discretion and judgment. In these circumstances, we are talking about suspicion-led stop and search. There have to be legitimate reasons why the police would stop and search somebody.

I would be worried if the Minister were not considering these issues. Disproportionality means that if somebody is from a different race—in this case, particularly if they are black—they are more likely to be stopped and searched than they would be if they were white. It has nothing to do with the make-up of criminals; it is to do with disproportionality. The report by the NPCC and the College of Policing—I am sure the Minister has read it—talks at great length about the problem of disproportionality and how it needs to be tackled. In previous conversations in the Police, Crime, Sentencing and Courts Bill Committee, the Opposition have said that we need to get those things right before we expand powers. The police would agree that there is a big problem to be fixed.

I would characterise Opposition parties’ arguments in this Committee as seeking clarity to help the police and the legal system. Our role as legislators is to provide that clarity. The hon. Member for Bristol East highlighted in the evidence session last week that people arrested in relation to the destruction of the Colston statue were acquitted. We are asking for clarity in legislation, to enable the police to make the right decisions and be supported in them, and to encourage the courts to follow through.

I agree. This is about clarity in law to enable the police to do their job. The Government are introducing sweeping and increasingly wide-ranging powers to cover things that stop and search has not historically been used for, and the Opposition think that is wrong.

I want to pursue this issue of disproportionality, because it is incredibly important. Disproportionality increases when there is suspicion-less stop and search, but it very much exists in suspicion-led stop and search. The very reputable Home Affairs Committee published a report last year entitled “The Macpherson Report: Twenty-two years on”, which set out that statistics covering the year to 31 March 2020 showed that ethnic disproportionality in stop and search is worse now than it was 22 years ago. Black people were seven times more likely to be stopped and searched than white people—that is for suspicion-led stop and search—up from five times more likely in 1998. The disproportionality is even starker in no-suspicion searches, where black people were 18 times more likely than white people to be stopped.

There are many case examples in the report from the Independent Office for Police Conduct—a very serious body—looking at stop and search. It found examples where stop and search had not been done correctly, and I am sure the Minister has read about that. The report notes that

“the stop and search records of one officer showed that 79% of their stops and searches under Section 23 of the Misuse of Drugs Act since 2015 involved individuals from a Black, Asian, or other minority ethnic background. In comparison, demographic information from the 2011 census showed that only 43% of the residents in the area covered by the station where the officer was based were from a Black, Asian, or other minority ethnic background.”

It also says:

“Stop and search is often the most confrontational encounter an individual will have with the police. When a search is not carried out professionally and with sensitivity, complainants have told us of the lasting effect it can have, making them feel victimised, humiliated, and violated. And when the individual being stopped is a…child who may subsequently experience repeated stops and searches throughout their lifetime, the cumulative impact can be significant.”

The police are of course able to use reasonable force to carry out stop and search, which introduces the use of Tasers, firearms, batons and handcuffs.

The Home Office’s equality impact assessment on the expansion of stop and search says that

“this would risk having a negative effect on a part of the community where trust and confidence levels are relatively low.”

That is the Home Office’s own assessment of the expansion of these powers.

In our evidence session, we heard from Sir Peter Fahy, who said:

“you have to take into account absolutely the feelings of your local community. I would say that on things like this extension of stop and search, for me there would need to be a well-documented community impact assessment, where the police worked with other agencies and community groups to assess what the impact is going to be.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 59, Q121.]

I am realistic about our chances of winning votes in this room. If we do not win the argument today, I ask the Minister to listen to the voices who say, “If you are introducing this, you have to do it right. You have to make sure that there is a proper impact assessment, that people are involved, that people are trained and that the whole raft of measures that the NPCC and the College of Policing are looking at are put in place to tackle some of the problems.” Those measures have been well set out over the past couple of months, and they should be introduced before changes are made. I urge the Minister to think about trialling the changes, as other measures have been in the past.

I move on to the difficulties in implementing the proposals. On stop and search, Chief Superintendent Phil Dolby said that

“it is still a key point of discussion and, sometimes, contention. We have the community coming in and scrutinising how we have used it. They watch our body-worn video of what we tried to do. We have even got youth versions of that…I do not know how you would do the same kind of thing with protest. I think there is something that needs to be done there. There is best practice advice on how to conduct stop and search, and I think there is potentially some real thinking if those go ahead to start with that position as opposed to learning those lessons as we go along.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 59, Q120.]

In the new race action plan, the NPCC has committed to really looking at what stop and search does. The plan says that, across the country:

“Chief constables will identify and address disproportionality in the use of stop and search, particularly in relation to drugs and the searches of children. This will be achieved by having robust accountability and learning processes based on scrutiny and supervision.”

It has committed to reviewing the use of the smell of cannabis as grounds for stop and search, because that increases disproportionality, as well as the use of Taser, section 60, intimate searches, standardised recording practice—I could go on. The breadth of the NPCC’s commitments reflects its concern about the issue.

Our concern is that the breadth of the Bill’s drafting means potentially endless cases. The powers are so broad: there is an endless list of objects that could be made, adapted or intended for use in the course of, or in connection with, the offences listed, so we are heading for problems. We have already talked about bike locks, but posters, placards, flyers, banners and glue could all potentially fall under this clause. Arguably, the police could have reasonable suspicion to subject any person in the street to a search. How would they establish whether a person was intending to use a poster or a placard in a protest in a way that would be considered illegal? The Government are seriously asking the police to search people on suspicion of carrying a tambourine, when they already have a power to search anyone who is intentionally or recklessly causing public nuisance.

The NPCC’s written evidence says that

“Although currently the wording around ‘intentionally or recklessly causing a public nuisance’ is open to interpretation and would require additional guidance to prevent the onus and risk being placed on an individual officer when deciding to carry out a search.”

If this clause becomes part of the Bill, another ask of the Minister will therefore be to make sure that we have proper guidance. The NPCC continues:

“It would however, evidenced by experiences from forces, be difficult for an individual officer to have the overall picture necessary to make such a decision.”

The police say that the wording is too broad for police officers to interpret without problems occurring. Is the Minister comfortable about the fact that the police have those concerns, and what measures can he put in place to address them?

I conclude by referring back to what Sir Peter Fahy said:

“You do not start with the heaviest. You work up to it, and that then maintains the confidence in your legality and proportionality.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 62, Q122.]

That is how we do British policing. Those are the Peelian principles: de-escalation, communication and negotiation. We in the Labour party think that this clause takes us in the wrong direction.

Question put, That the clause, as amended, stand part of the Bill.

Clause 6, as amended, ordered to stand part of the Bill.

Clause 7

Powers to stop and search without suspicion

Amendment made: 26, clause 7, page 8, line 40, at end insert—

“(iiia) an offence under section (Offence of causing serious disruption by tunnelling) (offence of causing serious disruption by tunnelling);

(iiib) an offence under section (Offence of causing serious disruption by being present in a tunnel) (offence of causing serious disruption by being present in a tunnel);”—(Kit Malthouse.)

This amendment applies the stop and search powers in clause 7 of the Bill to an offence relating to tunnelling under the new clause inserted by NC5 or NC6.

Question proposed, That the clause, as amended, stand part of the Bill.

Clause 7 builds on the Government’s plans to give the police the powers they need to prevent serious disruption at protests by introducing suspicion-less stop and search powers. The hon. Member for Croydon Central referred in her previous speech to both suspicion-led and suspicion-less stop and search.

Although the extension of suspicion-based stop and search powers, provided for by clause 6, will help the police to manage disruptive protests more effectively, it is not always possible in high-pressure, fast-paced protest environments for officers to form reasonable suspicion that individuals may be about to commit an offence. Clause 7 therefore introduces a suspicion-less stop and search power for the offences covered under clause 6.

If an officer of the rank of inspector or above believes that any of the specified offences may be committed in their police area and that individuals are carrying prohibited objects for the commission of those offences, officers may stop and search individuals and vehicles within the area specified by the senior officer, whether or not they suspect those individuals are carrying prohibited objects. If such items are found, the police may seize them.

These powers are modelled on existing suspicion-less stop and search powers available under section 60 of the Criminal Justice and Public Order Act 1994. The powers are well understood by the police, and emulating them prevents confusion between the powers and the complication of officers’ training. As with section 60, powers under clause 7 may not last longer than 24 hours unless an officer of superintendent rank or higher deems it necessary to extend them by a further 24 hours. Such an extension may happen only if senior police officers deem it necessary to prevent the offences in scope from being carried out or to prevent prohibited objects from being carried.

The hon. Lady criticised both suspicion-led and suspicion-less stop and search, and I hope I can allay some of her concerns. As with all stop and search powers, we believe, as she does, that no one should be stopped based on a protected characteristic, and there are safeguards to ensure these powers are used proportionately. This point was emphasised by Her Majesty’s inspector in the recent report on the policing of protests, in which he recognised that

“the proposed new power has the clear potential to improve police efficiency and effectiveness”

in managing protests, so long as they are

“subject to strong and effective safeguards”.

As the hon. Lady knows, we intend to amend PACE code A. We regularly review safeguards, and we now collect more data on stop and search than ever before. That data is posted online, enabling police and crime commissioners and others to hold forces to account. It is also important that communities hold PCCs to account through the electoral process, as I am sure she would agree.

We have responded to the “Inclusive Britain” report by saying that we intend to enhance the safeguards through the development of a national framework for scrutiny of stop and search by local communities, and through the consideration of any unnecessary barriers to the increased use of body-worn video. We also asked the College of Policing to update its stop and search guidance to ensure fair and proportionate use. The updated guidance, which is available to all forces, was published in July 2020 and provides best practice examples of community engagement and security. HMICFRS continues to inspect regularly on stop and search.

It is slightly worrying how the Minister talks about this differently from his own police. The NPCC and the College of Policing talk about it in a very different way. They say that stop and search is an important tool—on which we all agree—but that its implementation is disproportionate and lots of work needs to be done to fix that. The Minister seems to be saying that it does not need to be fixed. Perhaps he should talk to the NPCC, the College of Policing and those who put that report together to ensure that they are on the same page as him.

Notwithstanding the hon. Lady’s patronising tone, I speak to the National Police Chiefs’ Council and senior police officers all the time. In fact, I have lived the stop and search journey for the last 14 years. I have probably spent more time than most talking to people in communities that are affected by violence and where stop and search is regularly utilised about its challenges and its efficacy in protecting people.

I repeat what I have said in the House: I have often been challenged during those 14 years on the disproportionality in the use of stop and search, but I have never been challenged on the disproportionality in the people who are killed with knives. No one has ever said to me that it is a total disgrace that the vast majority of those people are young black men. I would welcome that challenge and a proper set of solutions to that problem.

That is a completely unreasonable distinction to make. I have challenged the number of young black men who have been murdered in my constituency many, many times. Indeed, that is why I set up the all-party parliamentary group on knife crime and why I have worked on that exact issue ever since I entered Parliament. The two things are not comparable. Just because most victims of knife crime murders happen to be young black men in London, that does not mean that the majority of black people are criminals.

No, but I am saying that the two are not connected, and we cannot connect them. The victims are often young black people—I find that as awful as anybody else would, and I have campaigned to do something about it—but that is not the point. The point is that stop and search is disproportionate not because of the nature of crimes, and not because of the victims of crimes, but because it is disproportionate.

I understand what the hon. Lady is saying, but there are complicated reasons why stop and search is disproportionate. Some of them are to do with geography, some with offence types, and some with the way that section 60 is used. I do not think that it is entirely cultural within the police.

There are other disproportionalities of concern. On cannabis possession in London, for example, which the hon. Lady mentioned, there is a strange disproportionality that does not, in my experience, reflect the pattern of cannabis use in London. We need to pay some attention to that. Having said that, I do not necessarily think that that problem and the solutions to it should be a barrier to using the stop-and-search power.

We heard clearly from the National Police Chiefs’ Council’s lead for public order that the use of stop and search—both suspicion-led and, in a fast-moving protest situation, suspicion-less—would be useful and enable police to get ahead of and prevent some of those offences. Indeed, I think I remember him saying that if police had those powers, it would result in less of an infringement on the rights of protesters. We therefore believe that the case has been made.

I will spend a bit of time on clauses 6 and 7, as they are the two important chunks that address suspicion-led and suspicion-less stop and search. The further stop-and-search clauses contain additional but less significant provisions.

Clause 7 addresses peaceful protest as if it were a social ill akin to knife crime, terrorism, serious organised crime or other situations in which people are stopped and searched. Section 1 of the Police and Criminal Evidence Act already allows officers to stop and search those whom they have reasonable grounds to suspect possess stolen or prohibited articles. For the purposes of section 1, prohibited articles include any item that has been made or adapted to be used to cause criminal damage. That would cover most of the scenarios that the Government are worried about.

The issue is that lock-ons, which we have debated and agreed have caused significant problems, are infrequent compared with protests as a whole. There might be a very large protest of 100,000 people, with 10 people or fewer trying to do something disruptive or illegal. That does not make the entire protest illegal; it makes those protesters unlawful. Our concern about the even broader extension of the powers, and the Bill more widely, is that we are not criminalising the criminals; we risk criminalising the vast majority of the people who want to protest and have their say on the issues of the day.

I am sure Matt Parr must be pleased, because we talk about him so much in Committee. The Minister is absolutely right that he agreed that the power could be a useful tool, but he listed a lot of concern in his report about how it would be implemented:

“Current suspicion-less stop and search powers for weapons…are intended to be used by the police to combat serious violence and the carriage of ‘dangerous instruments or offensive weapons’. Using a similar suspicion-less power to target peaceful protesters, who may cause serious (but non-violent) disruption, is a significantly different proposition. Given the potential ‘chilling effect’ on freedom of assembly and expression in terms of discouraging people from attending protests where they may be stopped and searched, we would expect any new suspicion-less powers to be subject to very careful scrutiny by the courts.

Such powers could have a disproportionate impact on people from black, Asian and other minority ethnic groups. We have repeatedly raised concerns about the police’s disproportionate use of stop and search in previous inspection reports…If and when contemplating the use of such powers in future, forces will need to carefully consider the demographic composition of the protest groups concerned. The importance of this issue should not be underestimated.

We would wish to see appropriate legal thresholds and authority levels set for authorising the use of the power, and the use of such powers monitored in a similar way to existing stop and search powers…When a person is stopped and searched, they may make an application for a written statement that they were searched. We would also wish to see high standards of training, vigilance and caution in the use of such a power”.

It is a well-used expression, but this is using a hammer to crack a nut. We do not want all the peaceful protesters to be hammered by the legislation when they are not doing anything unlawful.

The hon. Lady made a point moments ago that she has the unfortunate situation of BAME members of her community being killed because of knife crime. We are ignoring an important statistic, which is the fact that very often, people who come to harm or die because of knife crime do so as a result of the knife they have brought themselves. I hear what she is saying, but the measure is about saving lives and saving people from harm. I come back to the point that we are trying to have a common-sense approach that will save lives. If that has such a chilling effect on people attending so-called protests, then I wonder whether there is a balance that we need to consider. Which is more important, the saving of lives or the potential disruption to people’s willingness or want to participate in demonstrations or protests?

I do not think that anyone is arguing that we should not have stop-and-search powers for knife crime. Absolutely, in a lot of knife crime cases, who the victim or the perpetrator is depends on whoever happens to win the fight at the time. That is very difficult to deal with, but it is not relevant to this argument, which is about giving the police disproportionate powers to deal with a situation that they already have powers to deal with, in the meantime potentially criminalising people who would not have been, and should not be, criminalised.

The concerns about disproportionality exist for suspicion-less stop and search far more than for suspicion-led stop and search. The more ambiguity and the greater lack of evidence there is for who should be stopped, the more the disproportionality increases. This is something that the former Prime Minister, the right hon. Member for Maidenhead (Mrs May), was very interested in when she was Home Secretary. She insisted that stop and search be intelligence-led, and there was an improvement on her watch in the proportion of people who were found to be carrying something illegal. I think the figure at the moment is that one in 100 stop and searches for knives under section 60 leads to the discovery of a knife. We absolutely want to find that knife, but 99 stop and searches is a lot of police time and resources, and there are other ways to gather intelligence and solve crime.

I want to stress how many organisations are concerned about the powers. We have been very lucky to have people give evidence and write to us about their concerns. Organisations believe that the powers are incompatible with article 11 of the ECHR and article 21 of the international covenant on civil and political rights, as they relate to freedom of peaceful assembly. During the debate on the Police, Crime, Sentencing and Courts Bill in the Lords, Lord Carlile compared the powers with the use of stop-and-search powers under the Terrorism Act. He noted that:

“The Terrorism Act stop and search power is there for the prevention of actual acts of actual terrorism which kill actual people.

The dilution of without-suspicion stop and search powers is a menacing and dangerous measure.”—[Official Report, House of Lords, 17 January 2022; Vol. 817, c. 1435.]

In a similar way, Liberty has noted that stop and search without suspicion has normally been used

“in the context of crimes that will potentially kill many, many people.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 75, Q145.]

Lord Carlile concluded that the power

“is disproportionate, and the Government should think twice about it.”—[Official Report, House of Lords, 17 January 2022; Vol. 817, c. 1435.]

In its oral evidence, Amnesty noted that

“the proposal fails the test of lawfulness…the confiscation powers that go behind the stop-and-search powers around the locking-on offence capture an enormously broad range of items that an officer could argue might be capable of causing an offence. You have so many caveats that you will get into a situation where an ordinary person could have no idea why they were stopped, or why somebody might be taking an item off them that was completely lawful—everything from string to a bit of glue. It fails on that basic principle of lawfulness, which I think is incredibly problematic.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 75-76, Q145.]

Bodies and individuals—including HMICFRS, the College of Policing, former police chiefs and the right hon. Member for Maidenhead—have highlighted issues and broad concerns about suspicion-less stop and search. I say to the Minister that a whole raft of work is being done by the NPCC and the College of Policing, and that should be done before we try to extend such extreme powers to the police without putting in place any measures to stop the disproportionality.

I will leave it there. We have the same view on clause 7 as we did on clause 6: we do not think it is necessary or proportionate. We think that it will criminalise potentially innocent protesters and that the Government should think again.

Question put, That the clause, as amended, stand part of the Bill.

Clause 7, as amended, ordered to stand part of the Bill.

Clause 8

Further provisions about authorisations and directions under section 7

Question proposed, That the clause stand part of the Bill.

The clause makes further provision as to how police officers should authorise the aforementioned stop and search. It extends to the British Transport Police. It is self-explanatory.

Amendment 8, tabled by the hon. Member for Glasgow North East, is supported by me and the shadow Home Secretary, my right hon. Friend the Member for Normanton, Pontefract and Castleford (Yvette Cooper), and we believe the clause should be struck from the Bill.

Question put and agreed to.

Clause 8 accordingly ordered to stand part of the Bill.

Clause 9

Further provisions about searches under section 7

Question proposed, That the clause stand part of the Bill.

The clause provides that anyone searched or who has their vehicle searched under the new suspicion-less stop-and-search powers is entitled to apply for a written statement from the police confirming that they have been searched. That is in line with the existing stop-and-search powers, and a number of forces will allow a person to do that electronically. It also allows the Home Secretary to make regulations, subject to the negative resolution procedure, governing the retention, keeping and disposal of prohibited objects seized by the police under these powers.

We agree with amendment 9, tabled by the hon. Member for Glasgow North East, and we would leave out the clause.

Question put and agreed to.

Clause 9 accordingly ordered to stand part of the Bill.

Clause 10

Offence relating to section 7

Question proposed, That the clause stand part of the Bill.

Anyone who intentionally obstructs a constable exercising suspicion-less stop-and-search powers under clause 7 commits an offence, with a maximum penalty of one month’s imprisonment or a level 3 fine. That is in line with other stop-and-search powers.

We support amendment 10, tabled by the hon. Member for Glasgow North East, and we would leave out the clause. We do not support the measure. Liberty has suggested that a consequence of the offence is that it could be used to target legal observers who may be stopped and searched on their way to a protest for carrying items such as bus cards or for wearing an identifiable yellow bib. There are legitimate concerns that should be considered, so we do not support the clause.

Question put and agreed to.

Clause 10 accordingly ordered to stand part of the Bill.

Clause 11

Processions, assemblies and one-person protests: delegation of functions

Question proposed, That the clause stand part of the Bill.

The clause reflects a request from the Metropolitan Police regarding the delegation of powers of authorisation, such that an assistant commissioner in the Metropolitan Police can delegate the authorisation powers to a commander. That would be different from other forces in the rest of the UK, but it seems a sensible and proportionate measure, given the Met's differential rank structure.

We have no issues with the clause. To quote Matt Parr in the evidence session:

“That strikes me as entirely pragmatic. If you look at the Met, the real expertise in public order tends to be at commander rank, rather than above, where people get a bit more generalist. The deep professional experts in London, in my experience, are the commanders. That strikes me as perfectly sensible.”––[Official Report, Public Order Public Bill Committee, 13 June 2022; c. 56-57, Q117.]

We agree.

Question put and agreed to.

Clause 11 accordingly ordered to stand part of the Bill.

Clause 12

Serious disruption prevention order made on conviction

I beg to move amendment 38, in clause 12, page 12, line 16, leave out

“on the balance of probabilities”

and insert “beyond reasonable doubt”.

This amendment would raise the burden of proof for imposing a serious disruption prevention order to the criminal standard.

With this it will be convenient to discuss amendment 39, in clause 12, page 12, line 21, leave out

“on the balance of probabilities”

and insert “beyond reasonable doubt”.

This amendment would raise the burden of proof for imposing a serious disruption prevention order to the criminal standard.

The purpose of these amendments is to raise the burden of proof in relation to SDPOs to the criminal standard, rather than the balance of probabilities. Simply put, there is a reason why we use a higher bar for crimes that result in people being fined or losing their liberty, and the risks are the same here. One condition of an SDPO could be that someone has to wear an electronic monitor and have their every movement tracked. Given the impact on day-to-day life, it is not acceptable that that could be imposed just because the evidence suggests that the offence is more likely than not to have been committed. Justice requires that people are given due process, and it is vastly inappropriate for a low standard of proof to be used when we are, effectively, taking away someone’s rights and restricting their movements. I think this measure shows that we are slipping into a concerning state of affairs, and that is why my amendments suggest that the situation should be rectified.

I also want to talk about keeping trust with the public, and I am thinking of Peter Fahy’s comments last week about the challenges of dealing with protests. Our concern with the legislation is that when the police fail to deal with things effectively, they are seen as incompetent, and that risks public trust. For the public to have trust, they must feel that punishments are fairly applied. We heard a lot in the evidence sessions last week about the importance of policing by consent. That is something that I am passionate about as a former police officer, and it is what makes British policing unique. It is a fundamental principle enshrined in our justice system, and to maintain this consent and to further trust, people must know that sanctions are applied fairly.

I do not wish to add to what the hon. Lady has said, other than to say that we agree with the amendments.

The amendments, I am afraid, are a deliberate attempt to water down the courts’ ability to place an SDPO on those who are intent on repeatedly disrupting the lives of others, as we have talked about a lot during our consideration of the Bill. Amendments 38 and 39 attempt to raise the burden of proof required for SDPOs from

“on the balance of probabilities”

to “beyond reasonable doubt”, in effect requiring the criminal rather than the civil standard of proof. Amendment 38 raises the burden of proof required when considering whether an offence constitutes a protest-related offence for the purpose of making a serious disruption prevention order. Amendment 39 does the same when a court considers whether a person has engaged, in the last five years, in previous behaviour that would qualify them for an SDPO.

The amendments would make it more challenging for a court to place an SDPO on prolific activists who engage in criminal or unjustifiable behaviour. As this is a court order, I see no issue with requiring the civil burden of proof. The Opposition have shown much enthusiasm for injunctions, which operate to a civil burden of proof, and the same burden would be required here. For the avoidance of doubt, for someone to be convicted for breaching an SDPO, the criminal burden of proof would apply.

I want to query the Minister’s use of the phrase “unjustifiable behaviour”. What would that cover?

We have discussed the range of offences that offenders commit. In presenting the requirement for this order to a court, the police would have to make a case that a series of offences had occurred, or indeed that serious disruption had been caused by the individuals’ behaviour, to warrant this order. We will come on to the substance of those matters, and we can debate it at that point. For the reasons I have given, we do not agree with the amendment, and we hope that the hon. Member will withdraw it.

This is a probing amendment to get the Government’s view on the matter. The Minister has made it clear that he thinks the civil burden is appropriate at this time, so I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Ordered, That further consideration be now adjourned. —(Scott Mann.)

Adjourned till this day at Two o’clock.

Public Order Bill (Sixth sitting)

The Committee consisted of the following Members:

Chairs: Peter Dowd, †David Mundell

Anderson, Lee (Ashfield) (Con)

† Bridgen, Andrew (North West Leicestershire) (Con)

† Chamberlain, Wendy (North East Fife) (LD)

† Cunningham, Alex (Stockton North) (Lab)

† Doyle-Price, Jackie (Thurrock) (Con)

† Elmore, Chris (Ogmore) (Lab)

† Elphicke, Mrs Natalie (Dover) (Con)

† Hunt, Tom (Ipswich) (Con)

† Huq, Dr Rupa (Ealing Central and Acton) (Lab)

† Jones, Sarah (Croydon Central) (Lab)

Longhi, Marco (Dudley North) (Con)

† McCarthy, Kerry (Bristol East) (Lab)

McLaughlin, Anne (Glasgow North East) (SNP)

† Malthouse, Kit (Minister for Crime and Policing)

† Mann, Scott (North Cornwall) (Con)

† Mohindra, Mr Gagan (South West Hertfordshire) (Con)

† Vickers, Matt (Stockton South) (Con)

Anne-Marie Griffiths, Sarah Thatcher, Committee Clerks

† attended the Committee

Public Bill Committee

Thursday 16 June 2022


[David Mundell in the Chair]

Public Order Bill

Clause 12

Serious disruption prevention order made on conviction

Question proposed, That the clause stand part of the Bill.

Clause 12 will protect the British public from the small minority of protesters who are determined to repeatedly inflict disruption on those who simply wish to go about their daily lives. In 2021, approximately 170 Insulate Britain protesters were arrested about 980 times for obstructing motorways. That means that each protester was arrested on average nearly six times, on separate occasions. It is clear that something needs to be done to prevent these people from returning time and time again to ruin the daily life of the wider public, and to stop them cocking a snook at our justice system.

We have heard, and no doubt will hear more, criticism of serious disruption prevention orders, but there is one big misconception that I want to address: the claim that SDPOs ban protests. Critics have referred to the report by Her Majesty’s inspectorate of constabulary and fire and rescue services about the policing of protest, which found protest banning orders to be incompatible with human rights legislation, and we heard that during our evidence day. But the clue is in the name: HMICFRS considered orders that sought to outright ban people from protesting. SDPOs only enable the independent judiciary to place necessary and proportionate conditions on people to prevent them from engaging in criminal acts of protest and causing serious disruption time and time again. Those conditions could include curfews or electronic monitoring. Most importantly, they will be for the courts, not Government, to decide.

Under this clause, an SDPO can be imposed on a person convicted of a protest-related offence where, in the past five years, that person has been convicted of another offence or has committed other specified protest-related behaviour. A breach of an order will be a criminal offence, punishable by an unlimited fine, six months’ imprisonment, or both. An SDPO can be made if the court is satisfied, on the balance of probabilities, that the person has, on two or more occasions, been convicted of a protest-related offence; has been found in contempt of court for a protest-related breach of an injunction; has caused or contributed to a protest-related criminal offence or breach of an injunction; or has carried out, or caused or contributed to the carrying out by another person of, protest-related activities that resulted, or were likely to result, in serious disruption.

Along with the stop-and-search measures, these measures provide pre-emptive powers for the police. Officers will be able to interrupt and arrest those who breach the conditions of their SDPO before they have the opportunity to commit another disruptive act. SDPOs mirror many characteristics of injunctions, which the Opposition parties have been so keen for us and others to use. I urge that clause 12 stand part of the Bill.

A raft of clauses relate to serious disruption prevention orders, but clauses 12 and 13 are the most significant, so I will focus my attention on them. The shadow Home Secretary, my right hon. Friend the Member for Normanton, Pontefract and Castleford (Yvette Cooper), and I put our names to amendment 12, which would have left out the entirety of clause 12.

The clause, as we know, creates a new civil order—the serious disruption prevention order. These orders can be imposed on individuals who have a previous conviction for a protest-related offence and who have participated in another protest within a five-year period. There is a very broad list of conditions that may be met, including that the offender has been convicted of another protest-related offence; has been found in contempt of court for a protest-related breach of an injunction; has carried out activities related to a protest that resulted, or were likely to result, in serious disruption to two or more individuals or to an organisation; has caused or contributed to any other person committing a protest-related offence or protest-related breach of an injunction; or has caused or contributed to the carrying out by any other person of activities related to a protest that resulted, or were likely to result, in serious disruption to two or more individuals or to an organisation. That means that someone can be given an order if they have one previous protest-related offence and just contribute to another person’s activities, which were likely to result in serious disruption to only two people. As in so much of the Bill, that is a low threshold for such a restriction on someone’s rights.

Serious disruption prevention orders can last anywhere from a week to two years, with the potential to be renewed indefinitely. They can ban individuals from protesting, associating with certain people at certain times, and using the internet in certain ways. Those subject to the orders might have to report to certain places at certain times, and even be electronically monitored. If they fail to fulfil one of the requirements without a reasonable excuse, provide the police with false information, or violate a prohibition in the SDPO, they will have committed a crime. The consequence is a maximum of 51 weeks’ imprisonment, a fine, or both.

When we debated these clauses previously, we had, as the Minister referred to, a conversation about protest banning orders and the work that has gone into looking at them. In the evidence session, the Minister said of SDPOs that

“this measure is a conditional order, which may place restrictions or conditions on somebody’s ability to operate in a protest environment.”

However, the restrictions are significantly broader than just being prevented from attending protests. Martha Spurrier from Liberty pointed out that

“the serious disruption prevention orders have the capacity to be absolute bans in the same way as the protest banning orders…under judicial supervision—but…to a low standard of proof.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 69, Q131.]

Again, the Government are extending to peaceful protest powers that we would normally make available just for serious violence and terrorism.

Perhaps I can reiterate the point that I made, because I am interested in the hon. Lady’s view, although I know we want to get through a lot this afternoon. Other than, for example, the condition of electronic monitoring, which we will come to, what would be the difference between an injunction, on which she is so keen and which could be used as a complete ban on attending any protest, and an SDPO, which has many more safety measures around it?

I do not think that an SDPO has much more safety around it. The conditions under which someone can get an order—which I have just read out—include that they have caused, or contributed to, the carrying out by any other person of activities related to a protest that resulted in, or were likely to result in, serious disruption to two or more individuals. Conditions could be put on people and, if those people were deemed to have not adhered to them, new conditions could continue indefinitely, or people could go to prison or be fined. There is a specific condition that is put on an individual, with a very broad and legally difficult to identify range of conditions that would then be possible. It is different.

Police officers themselves, whom we turn to so often, said that an SDPO is

“a severe restriction on a person’s rights to protest and in reality, is unworkable”.

It is worth reflecting on what the inspectorate said about protest banning orders:

“We agree with the police and Home Office that such orders would neither be compatible with human rights legislation nor create an effective deterrent. All things considered, legislation creating protest banning orders would be legally very problematic because, however many safeguards might be put in place, a banning order would completely remove an individual’s right to attend a protest. It is difficult to envisage a case where less intrusive measures could not be taken to address the risk that an individual poses, and where a court would therefore accept that it was proportionate to impose a banning order.”

The inspectorate’s report also said:

“This proposal essentially takes away a person’s right to protest and…we believe it unlikely the measure would work as hoped.”

In the evidence sessions, the National Police Chiefs’ Council protest lead said:

“unless we knew the exact circumstances of the individual it would be hard to say how exactly the orders could be justified.”––[Official Report, Public Order Public Bill Committee, 9 June 2022; c. 15, Q23.]

Senior officers noted that protest banning orders would

“unnecessarily curtail people’s democratic right to protest”

and be

“a massive civil rights infringement”.

In the words of Liberty, the orders are

“an unprecedented and highly draconian measure that stand to extinguish named individuals’ fundamental right to protest as well as their ability to participate in a political community. They will also have the effect of subjecting individuals and wider communities to intrusive surveillance.”

It is worth digging down a little into the detail of these prevention orders. For example, would buying a lock, paint or superglue, observing a protest from afar or holding a banner be enough to contribute to a protest-related offence? As the noble Lord Paddick noted at Report stage of the Police, Crime, Sentencing and Courts Bill, when these measures were first introduced,

“you do not even have to have been to a protest to be banned from future ones.”—[Official Report, House of Lords, 17 January 2022; Vol. 817, c. 1439.]

That is where we are.

Restrictions imposed via a serious disruption prevention order are not necessarily directed at preventing anything criminal, but at preventing the facilitation of non-criminal protest-related activities, which could include sharing songs or chants, flag designs or just some information about where protests are being held. Underpinning our concerns is the wide and diffuse definition of serious disruption, and the power of the Secretary of State to redefine it.

For those given an SDPO, there are a wide set of requirements and prohibitions, which, again, might interfere with rights to respect for private and family life and to freedom of thought, belief and religion, expression, and assembly. Individuals might be prevented from associating with particular people or community members. They might not be able to possess locks, paint or glue. Crucially, they would not be allowed to participate in protests. They might also not be allowed to worship—the Quakers see direct action as a crucial part of their faith. Although there is a safeguard in the Bill, it does not match up to the overreach that the clauses represent.

The enforcement of an SDPO is also potentially problematic. Let us take electronic monitoring. There is the potential for 24/7 GPS tracking under the Bill. We are unclear whether that is proportionate for the undefined prevention of serious disruption.

Failing to comply with an SDPO could result in a maximum of 51 weeks in prison, a fine, or both, but none of the breaches is criminal without an SDPO. The clause criminalises potentially normal activities. When we consider that there is no limit to the number of times that an SDPO can be renewed by the court, we risk people being pushed into a cycle of criminalisation and indefinite periods of not being able to protest or associate with people, look on the internet or take part in other normal parts of life.

For something that places really serious restrictions on a person’s liberty, the court can make an SDPO if it is satisfied

“on the balance of probabilities that the current offence is a protest-related offence”,

rather than that being beyond reasonable doubt. That is the civil standard of proof. SDPOs on conviction can be made on the basis of lower-quality evidence.

I am conscious of the point the hon. Lady is making about the infringement of people’s liberties. Will she accept that this is not a novel concept and in fact happens already? For example, she will remember the incident where anti-lockdown protesters chased and harassed a journalist outside Downing Street. When that happened, those protesters got a fine and unpaid work, but the judge also banned them from attending near Parliament and in Whitehall for 18 months as part of the condition of their punishment. This concept is not a novel one. In many ways, codifying this seems a sensible thing to do, rather than leaving it entirely to judicial discretion.