Public Bill Committees

Debated on Thursday 9 June 2022

Online Safety Bill (Seventh sitting)

The Committee consisted of the following Members:

Chairs: Sir Roger Gale, † Christina Rees

† Ansell, Caroline (Eastbourne) (Con)

† Bailey, Shaun (West Bromwich West) (Con)

† Blackman, Kirsty (Aberdeen North) (SNP)

† Carden, Dan (Liverpool, Walton) (Lab)

† Davies-Jones, Alex (Pontypridd) (Lab)

† Double, Steve (St Austell and Newquay) (Con)

† Fletcher, Nick (Don Valley) (Con)

† Holden, Mr Richard (North West Durham) (Con)

† Keeley, Barbara (Worsley and Eccles South) (Lab)

† Leadbeater, Kim (Batley and Spen) (Lab)

† Miller, Dame Maria (Basingstoke) (Con)

† Mishra, Navendu (Stockport) (Lab)

† Moore, Damien (Southport) (Con)

† Nicolson, John (Ochil and South Perthshire) (SNP)

† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

† Russell, Dean (Watford) (Con)

† Stevenson, Jane (Wolverhampton North East) (Con)

Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks

† attended the Committee

Public Bill Committee

Thursday 9 June 2022

(Morning)

[Christina Rees in the Chair]

Online Safety Bill

We are now sitting in public and proceedings are being broadcast. Please switch electronic devices to silent. Tea and coffee are not allowed during sittings. I have no objections to Members taking their jackets off—it is very warm in this room.

Clause 17

Duty about content reporting

Question proposed, That the clause stand part of the Bill.

Good morning, Ms Rees. It is a pleasure to serve once again under your chairmanship. I wondered whether the shadow Minister, the hon. Member for Pontypridd, wanted to speak first—I am always happy to follow her, if she would prefer that.

I do my best.

Clauses 17 and 27 have similar effects, the former applying to user-to-user services and the latter to search services. They set out an obligation on the companies to put in place effective and accessible content reporting mechanisms, so that users can report issues. The clauses will ensure that service providers are made aware of illegal and harmful content on their sites. In relation to priority illegal content, the companies must proactively prevent it in the first place, but in the other areas, they may respond reactively as well.

The clause will ensure that anyone who wants to report illegal or harmful content can do so in a quick and reasonable way. We are ensuring that everyone who needs to do that will be able to do so, so the facility will be open to those who are affected by the content but who are not themselves users of the site. For example, that might be non-users who are the subject of the content, such as a victim of revenge pornography, or non-users who are members of a specific group with certain characteristics targeted by the content, such as a member of the Jewish community reporting antisemitic content. There is also facility for parents and other adults with caring responsibility for children, and adults caring for another adult, to report content. Clause 27 sets out similar duties in relation to search. I commend the clauses to the Committee.

I will talk about this later, when we come to a subsequent clause to which I have tabled some amendments—I should have tabled some to this clause, but unfortunately missed the chance to do so.

I appreciate the Minister laying out why he has designated the people covered by this clause; my concern is that “affected” is not wide enough. My logic is that, on the strength of these provisions, I might not be able to report racist content that I come across on Twitter if I am not the subject of that content—if I am not a member of a group that is the subject of the content or if I am not caring for someone who is the subject of it.

I appreciate what the Minister is trying to do, and I get the logic behind it, but I think the clause unintentionally excludes some people who would have a reasonable right to expect to be able to make reports in this instance. That is why I tabled amendments 78 and 79 to clause 28, about search functions, but those proposals would have worked reasonably for this clause as well. I do not expect a positive answer from the Minister today, but perhaps he could give consideration to my concern. My later amendments would change “affected person” to “any other person”. That would allow anyone to make a report, because if something is illegal content, it is illegal content. It does not matter who makes the report, and it should not matter that I am not a member of the group of people targeted by the content.

I report things all the time, particularly on Twitter, and a significant amount of it is nothing to do with me. It is not stuff aimed at me; it is aimed at others. I expect that a number of the platforms will continue to allow reporting for people who are outwith the affected group, but I do not want to be less able to report than I am currently, and that would be the case for many people who see concerning content on the internet.

The hon. Lady is making a really important point. One stark example that comes to my mind is when English footballers suffered horrific racist abuse following the penalty shootout at the Euros last summer. Hundreds of thousands of people reported the abuse that they were suffering to the social media platforms on their behalf, in an outcry of solidarity and support, and it would be a shame if people were prevented from doing that.

I absolutely agree. I certainly do not think I am suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or any platform that starts up after this legislation comes into force, could limit the ability to report on the basis of these clauses.

Good morning, Ms Rees.

It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.

Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.

It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.

One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few if any routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The revenge porn helpline reported that its case load more than doubled between 2019 and 2020.

These clauses should mean that people can easily report content that they consider to be either illegal, or harmful to children, if it is hosted on a site likely to be accessed by children, or, if it is hosted on a category 1 platform, harmful to adults. However, the Minister needs to clarify how these service complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?

In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action can take place. The hon. Member for Aberdeen North has just told us how often she reports on various platforms, but what action has taken place? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.

Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.

Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to be changed. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.

I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill was that it would allow users and affected persons, rather than “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.

Will the Minister also clarify the use of human judgment in these decisions? Many algorithms are not taking down some content at the moment, so I would be grateful if he clarified that there is a need for platforms to provide a genuine human judgment on whether content is harmful.

I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?

That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.

Let me try to address some of the questions that have been raised in this short debate, starting with the question that the hon. Member for Aberdeen North quite rightly asked at the beginning. She posed the question, “What if somebody who is not an affected person encountered some content and wanted to report it?” For example, she might encounter some racist content on Twitter or elsewhere and would want to be able to report it, even though she is not herself the target of it or necessarily a member of the group affected. I can also offer the reassurance that my hon. Friend the Member for Wolverhampton North East asked for.

The answer is to be found in clause 17(2), which refers to

“A duty to operate a service using systems and processes that allow users and”—

I stress “and”—“affected persons”. As such, the duty to offer content reporting is to users and affected persons, so if the hon. Member for Aberdeen North was a user of Twitter but was not herself an affected person, she would still be able to report content in her capacity as a user. I hope that provides clarification.

I appreciate that. That is key, and I am glad that this is wider than just users of the site. However, taking Reddit as an example, I am not signed up to that site, but I could easily stumble across content on it that was racist in nature. This clause would mean that I could not report that content unless I signed up to Reddit, because I would not be an affected person or a user of that site.

I thank the hon. Lady for her clarificatory question. I can confirm that in order to be a user of a service, she would not necessarily have to sign up to it. The simple act of browsing that service, of looking at Reddit—not, I confess, an activity that I participate in regularly—regardless of whether or not the hon. Lady has an account with it, makes her a user of that service, and in that capacity she would be able to make a content report under clause 17(2) even if she were not an affected person. I hope that clears up the question in a definitive manner.

The hon. Lady asked in her second speech about the accessibility of the complaints procedure for children. That is strictly a matter for clause 18, which is the next clause, but I will quickly answer her question. Clause 18 contains provisions that explicitly require the complaints process to be accessible. Subsection (2)(c) states that the complaints procedure has to be

“easy to access, easy to use (including by children) and transparent”,

so the statutory obligation that she requested is there in clause 18.

Can the Minister explain the logic in having that phrasing for the complaints procedure but not for the content-reporting procedure? Surely it would also make sense for the content reporting procedure to use the phrasing

“easy to access, easy to use (including by children) and transparent.”

There is in clause 17(2)

“a duty to operate a service that allows users and affected persons to easily report content which they consider to be content of a…kind specified below”,

which, of course, includes services likely to be accessed by children, under subsection (4). The words “easily report” are present in clause 17(2).

I will move on to the question of children reporting more generally, which the shadow Minister raised as well. Clearly, a parent or anyone with responsibility for a child has the ability to make a report, but it is also worth mentioning the power in clauses 140 to 142 to make super-complaints, which the NSPCC strongly welcomed in its evidence. An organisation that represents a particular group—an obvious example is the NSPCC representing children, but it would apply to loads of other groups—has the ability to make super-complaints to Ofcom on behalf of those users, if it feels they are not being well treated by a platform. A combination of the parent or carer being able to make individual complaints, and the super-complaint facility, means that the points raised by Members are catered for. I commend the clause to the Committee.

Question put and agreed to.

Clause 17 accordingly ordered to stand part of the Bill.

Clause 18

Duties about complaints procedures

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to discuss the following:

Amendment 78, in clause 28, page 28, line 28, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider to be illegal.

Amendment 79, in clause 28, page 28, line 30, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider not to comply with sections 24, 27 or 29.

Clause 28 stand part.

New clause 1—Report on redress for individual complaints

“(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under—

(a) section 18; and

(b) section 28.

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services and regulated search services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services and search services.

(3) The report must be laid before Parliament within six months of the commencement of this Act.”

I will speak to new clause 1. Although duties about complaints procedures are welcome, it has been pointed out that service providers’ user complaints processes are often obscure and difficult to navigate—that is the world we are in at the moment. The lack of any external complaints option for individuals who seek redress is worrying.

The Minister has just talked about the super-complaints mechanism—which we will come to later in proceedings—to allow eligible entities to make complaints to Ofcom about a single regulated service if that complaint is of particular importance or affects a particularly large number of service users or members of the public. Those conditions are constraints on the super-complaints process, however.

An individual who felt that they had been failed by a service’s complaints system would have no source of redress. Without redress for individual complaints once internal mechanisms have been exhausted, victims of online abuse could be left with no further options, consumer protections could be compromised, and freedom of expression could be impinged upon for people who felt that their content had been unfairly removed.

Various solutions have been proposed. The Joint Committee recommended the introduction of an online safety ombudsman to consider complaints for which recourse to internal routes of redress had not resulted in resolution and the failure to address risk had led to significant and demonstrable harm. Such a mechanism would give people an additional body through which to appeal decisions after they had come to the end of a service provider’s internal process. Of course, we as hon. Members are all familiar with the ombudsman services that we already have.

Concerns have been raised about the level of complaints such an ombudsman could receive. However, as the Joint Committee noted, complaints would be received only once the service’s internal complaints procedure had been exhausted, as is the case for complaints to Ofcom about the BBC. The new clause seeks to ensure that we find the best possible solution to the problem. There needs to be a last resort for users who have suffered serious harm on services. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact on individuals.

I rise to contribute to the stand part debate on clauses 18 and 28. It was interesting, though, to hear the debate on clause 17, because it is right to ask how the complaints services will be judged. Will they work in practice? When we start to look at how to ensure that the legislation works in all eventualities, we need to ensure that we have some backstops for when the system does not work as it should.

It is welcome that there will be clear duties on providers to have operational complaints procedures—complaints procedures that work in practice. As we all know, many of them do not at the moment. As a result, we have a loss of faith in the system, and that is not going to be changed overnight by a piece of legislation. For years, people have been reporting things—in some cases, very serious criminal activity—that have not been acted on. Consumers—people who use these platforms—are not going to change their mind overnight and suddenly start trusting these organisations to take their complaints seriously. With that in mind, I hope that the Minister listened to the points I made on Second Reading about how to give extra support to victims of crimes or people who have experienced things that should not have happened online, and will look at putting in place the right level of support.

The hon. Member for Worsley and Eccles South talked about the idea of an ombudsman; it may well be that one should be in place to deal with situations where complaints are not dealt with through the normal processes. I am also quite taken by some of the evidence we received about third-party complaints processes by other organisations. We heard a bit about the revenge porn helpline, which was set up a few years ago when we first recognised in law that revenge pornography was a crime. The Bill creates a lot more victims of crime and recognises them as victims, but we are not yet hearing clearly how the support systems will adequately help that massively increased number of victims to get the help they need.

I will probably talk in more detail about this issue when we reach clause 70, which provides an opportunity to look at the—unfortunately—probably vast fines that Ofcom will be imposing on organisations and how we might earmark some of that money specifically for victim support, whether by funding an ombudsman or helping amazing organisations such as the revenge porn helpline to expand their services.

We must address this issue now, in this Bill. If we do not, all those fines will go immediately into the coffers of the Treasury without passing “Go”, and we will not be able to take some of that money to help those victims directly. I am sure the Government absolutely intend to use some of the money to help victims, but that decision would be at the mercy of the Treasury. Perhaps we do not want that; perhaps we want to make it cleaner and easier and have the money put straight into a fund that can be used directly for people who have been victims of crime or injustice or things that fall foul of the Bill.

I hope that the Minister will listen to that and use this opportunity, as we do in other areas, to directly passport fines for specific victim support. He will know that there are other examples of that that he can look at.

As the right hon. Member for Basingstoke has mentioned the revenge porn helpline, I will mention the NSPCC’s Report Remove tool for children. It does exactly the same thing, but for younger people—the revenge porn helpline is specifically only for adults. Both those tools together cover the whole gamut, which is massively helpful.

The right hon. Lady’s suggestion about the hypothecation of fines is a very good one. I was speaking to the NSPCC yesterday, and one of the issues that we were discussing was super-complaints. Although super-complaints are great and I am very glad that they are included in the Bill, the reality is that some of the third-sector organisations that are likely to be undertaking super-complaints are charitable organisations that are not particularly well funded. Given how few people work for some of those organisations and the amazing amount of work they do, if some of the money from fines could support not just victims but the initial procedure for those organisations to make super-complaints, it would be very helpful. That is, of course, if the Minister does not agree with the suggestion of creating a user advocacy panel, which would fulfil some of that role and make that support for the charitable organisations less necessary—although I am never going to argue against support for charities: if the Minister wants to hypothecate it in that way, that would be fantastic.

I tabled amendments 78 and 79, but the statement the Minister made about the definition of users gives me a significant level of comfort about the way that people will be able to access a complaints procedure. I am terribly disappointed that the Minister is not a regular Reddit user. I am not, either, but I am well aware of what Reddit entails. I have no desire to sign up to Reddit, but knowing that even browsing the site I would be considered a user and therefore able to report any illegal content I saw, is massively helpful. On that basis, I am comfortable not moving amendments 78 and 79.

On the suggestion of an ombudsman—I am looking at new clause 1—it feels like there is a significant gap here. There are ombudsman services in place for many other areas, where people can put in a complaint and then go to an ombudsman should they feel that it has not been appropriately addressed. As a parliamentarian, I find that a significant number of my constituents come to me seeking support to go to the ombudsman for whatever area it is in which they feel their complaint has not been appropriately dealt with. We see a significant number of issues caused by social media companies, in particular, not taking complaints seriously, not dealing with complaints and, in some cases, leaving illegal content up. Particularly in the initial stages of implementation—in the first few years, before companies catch up and are able to follow the rules put in place by the Bill and Ofcom—a second-tier complaints system that is removed from the social media companies would make things so much better than they are now. It would provide an additional layer of support to people who are looking to make complaints.

I am sure the hon. Lady will agree with me that it is not either/or—it is probably both. Ultimately, she is right that an ombudsman would be there to help deal with what I think will be a lag in implementation, but if someone is a victim of online intimate image abuse, in particular, they want the material taken down immediately, so we need to have organisations such as those that we have both mentioned there to help on the spot. It has to be both, has it not?

I completely agree. Both those helplines do very good work, and they are absolutely necessary. I would strongly support their continuation in addition to an ombudsman-type service. Although I am saying that the need for an ombudsman would likely be higher in the initial bedding-in years, it will not go away—we will still need one. With NHS complaints, the system has been in place for a long time, and it works pretty well in the majority of cases, but there are still cases it gets wrong. Even if the social media companies behave in a good way and have proper complaints procedures, there will still be instances of them getting it wrong. There will still be a need for a higher level. I therefore urge the Minister to consider including new clause 1 in the Bill.

It is a pleasure to see you in the Chair, Ms Rees, and to make my first contribution in Committee—it will be a brief one. It is great to follow the hon. Member for Aberdeen North, and I listened intently to my right hon. Friend the Member for Basingstoke, from whom I have learned so much having sat with her in numerous Committees over the past two years.

I will speak to clause 18 stand part, in particular on the requirements of the technical specifications that the companies will need to use to ensure that they fulfil the duties under the clause. The point, which has been articulated well by numerous Members, is that we can place such a duty on service providers, but we must also ensure that the technical specifications in their systems allow them to follow through and deliver on it.

I sat in horror during the previous sitting as I listened to the hon. Member for Pontypridd talking about the horrendous abuse that she has to experience on Twitter. What that goes to show is that, if the intention of this clause and the Bill are to be fulfilled, we must ensure that the companies enable themselves to have the specifications in their systems on the ground to deliver the requirements of the Bill. That might mean that the secondary legislation is slightly more prescriptive about what those systems look like.

It is all well and good us passing primary legislation in this place to try to control matters, but my fear is that if those companies do not have systems such that they can follow through, there is a real risk that what we want will not materialise. As we proceed through the Bill, there will be mechanisms to ensure that that risk is mitigated, but the point that I am trying to make to my hon. Friend the Minister is that we should ensure that we are on top of this, and that companies have the technical specifications in their complaints procedures to meet the requirements under clause 18.

We must ensure that we do not allow the excuse, “Oh, well, we’re a bit behind the times on this.” I know that later clauses seek to deal with that, but it is important that we do not simply fall back on excuses. We must embed a culture that allows the provisions of the clause to be realised. I appeal to the Minister to ensure that we deal with that and embed a culture that looks at striding forward to deal with complaints procedures, and that these companies have the technical capabilities on the ground so that they can deal with these things swiftly and in the right way. Ultimately, as my right hon. Friend the Member for Basingstoke said, it is all well and good us making these laws, but it is vital that we ensure that they can be applied.

Let me address some of the issues raised in the debate. First, everyone in the House recognises the enormous problem at the moment with large social media firms receiving reports about harmful and even illegal content that they just flagrantly ignore. The purpose of the clause, and indeed of the whole Bill and its enforcement architecture, is to ensure that those large social media firms no longer ignore illegal and harmful content when they are notified about it. We agree unanimously on the importance of doing that.

The requirement for those firms to take the proper steps is set out in clause 18(2)(b), at the very top of page 18 —it is rather depressing that we are on only the 18th of a couple of hundred pages. That paragraph creates a statutory duty for a social media platform to take “appropriate action”—those are the key words. If the platform is notified of a piece of illegal content, or content that is harmful to children, or of content that it should take down under its own terms and conditions if harmful to adults, then it must do so. If it fails to do so, Ofcom will have the enforcement powers available to it to compel—ultimately, escalating to a fine of up to 10% of global revenue or even service disconnection.

Let me develop the point before I give way. Our first line of defence is Ofcom enforcing the clause, but we have a couple of layers of additional defence. One of those is the super-complaints mechanism, which I have mentioned before. If a particular group of people, represented by a body such as the NSPCC, feel that their legitimate complaints are being infringed systemically by the social media platform, and that Ofcom is failing to take the appropriate action, they can raise that as a super-complaint to ensure that the matter is dealt with.

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

I wanted to ask specifically about the resourcing of Ofcom, given the abilities that it will have under this clause. Will Ofcom have enough resource to be able to be that secondary line of defence?

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

This is the point about individual redress again: by talking about super-complaints, the Minister seems to be agreeing that it is not there. As I said earlier, for super-complaints to be made to Ofcom, the issue has to be of particular importance or to impact a particularly large number of users, but that does not help the individual. We know how much individuals are damaged; there must be a system of external redress. The point about internal complaints systems is that we know that they are not very good, and we require a big culture change to change them, but unless there is some mechanism thereafter, I cannot see how we are giving the individual any redress—it is certainly not through the super-complaints procedure.

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

I will in a moment. The volume of complaints that gets generated is vast. The way that we will fix this is not by having an external policeman to enforce on individual complaints, but by ensuring that the systems and processes are set up correctly to deal with problems at this large scale. [Interruption.] The shadow Minister, the hon. Member for Pontypridd, laughs, but it is a question of practicality. The way we will make the internet safe is to make sure that the systems and processes are in place and effective. Ofcom will ensure that that happens. That will protect everyone, not just those who raise individual complaints with an ombudsman.

I can see that there is substantial demand to comment, so I shall start by giving way to my right hon. Friend the Member for Basingstoke.

The Minister is doing an excellent job explaining the complex nature of the Bill. Ultimately, however, as he and I know, it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it. If my hon. Friend looks back at his comments, he will see that that is exactly the point he was making. Although it is possibly not necessary with this clause, I think he needs to give some assurances that later in the Bill he will look at hypothecating some of the money to be generated from fines to address the issues of individual constituents, who on a daily basis are suffering at the hands of the social media companies. I apologise for the length of my intervention.

It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.

The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.

That is the power in the Bill. It deals at a systems and processes level, it deals on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.

I associate myself with the comments of the right hon. Member for Basingstoke. Surely, if we are saying that this is such a huge problem, that is an argument for greater stringency and having an ombudsman. We cannot say that this is just about systems. Of course it is about systems, but online harms—we have heard some powerful examples of this—are about individuals, and we have to provide redress and support for the damage that online harms do to them. We have to look at systemic issues, as the Minister is rightly doing, but we also have to look at individual cases. The idea of an ombudsman and greater support for charities and those who can support victims of online crime, as mentioned by the hon. Member for Aberdeen North, is really important.

I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.

I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.

Does the Minister not think this is going to work? He is creating this systems and processes approach, which he suggests will reduce the thousands of complaints—complaints will be made and complaints procedures will be followed. Surely, if it is going to work, in 10 years’ time we are going to need an ombudsman to adjudicate on the individual complaints that go wrong. If this works in the way he suggests, we will not have tens of millions of complaints, as we do now, but an ombudsman would provide individual redress. I get what he is arguing, but I do not know why he is not arguing for both things, because having both would provide the very best level of support.

I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.

Let me finish the point. It is not a bad idea to review it and see how it is working in practice. Clause 149 already requires a review to take place between two and five years after Royal Assent. For the reasons that have been set out, it is pretty clear from this debate that we would expect the review to include precisely that question. If we had an ombudsman on day one, before the systems and processes had had a chance to have their effect, I fear that the ombudsman would be overwhelmed with millions of individual issues. The solution lies in fixing the problem systemically.

I wanted to reiterate the point that the hon. Member for Aberdeen North made, which the Minister has not answered. If he has such faith that the systems and processes will be changed and controlled by Ofcom as a result of the Bill, why is he so reluctant to put in an ombudsman? It will not be overwhelmed with complaints if the systems and processes work, and therefore protect victims. We have already waited far too long for the Bill, and now he says that we need to wait two to four years for a review, and even longer to implement an ombudsman to protect victims. Why will he not just put this in the Bill now to keep them safe?

Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.

I wonder whether Members would be reassured if companies were required to have a mechanism by which users could register their dissatisfaction, to enable an ombudsman, or perhaps Ofcom, to gauge the volume of dissatisfaction and bring some kind of group claim against the company. Is that a possibility?

Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.

When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.

We have not tabled an amendment to this clause. What we have done, however—we are debating it now—is to table a new clause to require a report on redress for individual complaints. The Minister talks about clause 149 and a process that will not kick in for between two and five years, but we have a horrendous problem at the moment. I and various others have described the situation as the wild west, and very many people—thousands, if not millions, of individuals—are being failed very badly. I do not see why he is resisting our proposal for a report within six months of the commencement of the Act, which would enable us to start to see at that stage, not two to five years down the road, how these systems—he is putting a lot of faith in them—were turning out. I think that is a very sound idea, and it would help us to move forward.

The third line of defence—the super-complaint process—is available immediately, as I set out a moment ago. In relation to new clause 1, which the hon. Lady mentioned a moment ago, I think six months is very soon for a Bill of this magnitude. The two-to-five-year timetable under the existing review mechanism in clause 149 is appropriate.

Although we are not debating clause 149, I hope, Ms Rees, that you will forgive me for speaking about it for a moment. If Members turn to pages 125 and 126 and look at the matters covered by the review, they will see that they are extraordinarily comprehensive. In effect, the review covers the implementation of all aspects of the Bill, including the need to minimise the harms to individuals and the enforcement and information-gathering powers. It covers everything that Committee members would want to be reviewed. No doubt as we go through the Bill we will have, as we often do in Bill Committee proceedings, a number of occasions on which somebody tables an amendment to require a review of x, y or z. This is the second such occasion so far, I think, and there may be others. It is much better to have a comprehensive review, as the Bill does via the provisions in clause 149.

Question put and agreed to.

Clause 18 accordingly ordered to stand part of the Bill.

Clause 19

Duties about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

Clause 19, on user-to-user services, and its associated clause 29, which relates to search services, specify a number of duties in relation to freedom of expression and privacy. In carrying out their safety duties, in-scope companies will be required by clause 19(2) to have regard to the importance of protecting users’ freedom of expression and privacy.

Let me pause for a moment on this issue. There has been some external commentary about the Bill’s impact on freedom of expression. We have already seen, via our discussion of a previous clause, that there is nothing in the Bill that compels the censorship of speech that is legal and not harmful to children. I put on the record again the fact that nothing in the Bill requires the censorship of legal speech that poses no harm to children.

We are going even further than that. As far as I am aware, for the first time ever there will be a duty on social media companies, via clause 19(2), to have regard to freedom of speech. There is currently no legal duty at all on platforms to have regard to freedom of speech. The clause establishes, for the first time, an obligation to have regard to freedom of speech. It is critical that not only Committee members but others more widely who consider the Bill should bear that carefully in mind. Besides that, the clause speaks to the right to privacy. Existing laws already speak to that, but the clause puts it in this Bill as well. Both duties are extremely important.

In addition, category 1 service providers—the really big ones—will need proactively to assess the impact of their policies on freedom of expression and privacy. I hope all Committee members will strongly welcome the important provisions I have outlined.

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is a fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike with the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set expectations about whether risk assessments are of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient or that reach an inappropriate or unreasonable assessment of how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that end-to-end encryption is important for free expression, and privacy could justify any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative connotations.

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

I confirm what the hon. Lady has just said. In response to the hon. Member for Worsley and Eccles South, it is important to say that the duty in clause 19 is “to have regard”, which simply means that a balancing exercise must be performed. It is not determinative; it is not as if the rights in the clause trump everything else. They simply have to be taken into account when making decisions.

To repeat what we discussed on Tuesday, I can explicitly and absolutely confirm to the hon. Member for Aberdeen North that in my view and the Government’s, concerns about freedom of expression or privacy should not trump platforms’ ability to scan for child sexual exploitation and abuse images or protect children. It is our view that there is nothing more important than protecting children from exploitation and sexual abuse.

We may discuss this further when we come to clause 103, which develops the theme a little. It is also worth saying that Ofcom will be able to look at the risk assessments and, if it feels that they are not of an adequate standard, take that up with the companies concerned. We should recognise that the duty to have regard to freedom of expression is not something that currently exists. It is a significant step forward, in my view, and I commend clauses 19 and 29 to the Committee.

I have been contacted by a number of people about this clause, and they have serious concerns about the “have regard” statement. The Christian Institute said that it was

“promised ‘considerably stronger protections for free speech’, but the Bill does not deliver. Internet companies will be under ‘a duty to have regard to the importance of’ protecting free speech,”

but a “have regard” duty

“has no weight behind it. It is perfectly possible to…have regard to something…and then ignore it in practice.”

The “have regard” duty is not strong enough, and it is a real concern for a lot of people out there. Protecting children is absolutely imperative, but there are serious concerns when it comes to freedom of speech. Can the Minister address them for me?

As I have said, at the moment there is nothing at all. Platforms such as Facebook can and do arbitrarily censor content with little if any regard for freedom of speech. Some platforms have effectively cancelled Donald Trump while allowing the Russian state to propagate shocking disinformation about the Russian invasion of Ukraine, so there is real inconsistency and a lack of respect for freedom of speech. This at least establishes something where currently there is nothing. We can debate whether “have regard to” is strong enough. We have heard the other point of view from the other side of the House, which expressed concern that it might be used to allow otherwise harmful content, so there are clearly arguments on both sides of the debate. The obligation to have regard does have some weight, because the issue cannot be completely ignored. I do not think it would be adequate to simply pay lip service to it and not give it any real regard, so I would not dismiss the legislation as drafted.

I would point to the clauses that we have recently discussed, such as clause 15, under which content of democratic importance—which includes debating current issues and not just stuff said by an MP or candidate—gets additional protection. Some of the content that my hon. Friend the Member for Don Valley referred to a second ago would probably also get protection under clause 14, under which content of democratic importance has to be taken into account when making decisions about taking down or removing particular accounts. I hope that provides some reassurance that this is a significant step forwards compared with where the internet is today.

I share the Minister’s sentiments about the Bill protecting free speech; we all want to protect that. He mentions some of the clauses we debated on Tuesday regarding democratic importance. Some would say that debating this Bill is of democratic importance. Since we started debating the Bill on Tuesday, and since I have mentioned some of the concerns raised by stakeholders and others about the journalistic exemption and, for example, Tommy Robinson, my Twitter mentions have been a complete sewer—as everyone can imagine. One tweet I received in the last two minutes states:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. Does the Minister agree that that is content of democratic importance, given we are debating this Bill, and that it should remain on Twitter?

That sounds like a very offensive tweet. Could the hon. Lady read it again? I didn’t quite catch it.

Yes:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. It goes on:

“this is a toxic combination of bloc vote grubbing and woke”

culture, and there is a lovely GIF to go with it.

I do not want to give an off-the-cuff assessment of an individual piece of content—not least because I am not a lawyer. It does not sound like it meets the threshold of illegality. It most certainly is offensive, and that sort of matter is one that Ofcom will set out in its codes of practice, but there is obviously a balance between freedom of speech and content that is harmful, which the codes of practice will delve into. I would be interested if the hon. Lady could report that to Twitter and then report back to the Committee on what action it takes.

At the moment, there is no legal obligation to do anything about it, which is precisely why this Bill is needed, but let us put it to the test.

Question put and agreed to.

Clause 19 accordingly ordered to stand part of the Bill.

Clause 20

Record-keeping and review duties

Question proposed, That the clause stand part of the Bill.

Record-keeping and review duties on in-scope services make up an important function of the regulatory regime that we are discussing today. Platforms will need to report all harms identified and the action taken in response to this, in line with regulation. The requirements to keep records of the action taken in response to harm will be vital in supporting the regulator to make effective decisions about regulatory breaches and whether company responses are sufficient. That will be particularly important to monitor platforms’ responses through risk assessments—an area where some charities are concerned that we will see under-reporting of harms to evade regulation.

Evidence of under-reporting can be seen in the various transparency reports that are currently being published voluntarily by sites, where we are not presented with the full picture and scale of harm and the action taken to address that harm is thus obscured.

As with other risk assessments, the provisions in clauses 20 and 30 could be strengthened through a requirement on in-scope services to publish their risk assessments. We have made that point many times. Greater transparency would allow researchers and civil society to track harms and hold services to account.

The shadow Minister has eloquently introduced the purpose and effect of the clause, so I shall not repeat what she has said. On her point about publication, I repeat the point that I made on Tuesday, which is that the transparency requirements—they are requirements, not options—set out in clause 64 oblige Ofcom to ensure the publication of appropriate information publicly in exactly the way she requests.

Question put and agreed to.

Clause 20 accordingly ordered to stand part of the Bill.

Clauses 21 to 24 ordered to stand part of the Bill.

Clause 25

Children’s risk assessment duties

Amendment proposed: 16, in clause 25, page 25, line 10, at end insert—

“(3A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.” —(Alex Davies-Jones.)

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

Clause 25 ordered to stand part of the Bill.

Clauses 26 to 30 ordered to stand part of the Bill.

Clause 31

Children’s access assessments

I call Kirsty Blackman to move amendment 22. [Interruption.] Sorry—my bad, as they say. I call Barbara Keeley to move amendment 22.

I beg to move amendment 22, in clause 31, page 31, line 17, leave out subsection (3).

This amendment removes the condition that applies a child use test to a service or part of a service.

With this it will be convenient to discuss the following:

Clause stand part.

Clause 32 stand part.

That schedule 3 be the Third schedule to the Bill.

Clause 33 stand part.

The purpose of the amendment is to remove the child use test from the children’s access assessment and to make sure that any service likely to be accessed by children is within the scope of the child safety duty. The amendment is supported by the NSPCC and other children’s charities.

Children require protection wherever they are online. I am sure that every Committee member believes that. The age-appropriate design code from the Information Commissioner’s Office requires all services that are likely to be accessed by children to provide high levels of data protection and privacy. Currently, the Bill will regulate only user-to-user and search services that have a significant number of child users or services for which children form a significant part of their user base. It will therefore not apply to all services that fall within the scope of the ICO’s code, creating a patchwork of regulation that could risk uncertainty, legal battles and unnecessary complexity. It might also create a perverse incentive for online services to stall the introduction of their child safety measures until Ofcom has the capacity to investigate and reach a determination on the categorisation of their sites.

The inclusion of a children’s access assessment in the Bill may result in lower standards of protection, with highly problematic services such as Telegram and OnlyFans able to claim that they are excluded from the child safety duties because children do not account for a significant proportion of their user base. However, evidence has shown that children have been able to access those platforms.

Other services will remain out of the scope of the Bill as currently drafted. They include harmful blogs that promote life-threatening behaviours, such as pro-anorexia sites with provider-generated rather than user-generated content; some of the most popular games among children that do not feature user-generated content but are linked to increasing gambling addiction among children, and through which some families have lost thousands of pounds; and other services with user-generated content that is harmful but does not affect an appreciable number of children. That risks dozens, hundreds or even thousands of children falling unprotected.

Parents have the reasonable expectation that, under the new regime introduced by the Bill, children will be protected wherever they are online. They cannot be expected to be aware of exemptions or distinctions between categories of service. They simply want their children to be protected and their rights upheld wherever they are.

As I say, children have the right to be protected from harmful content and activity by any platform that gives them access. That is why the child user condition in clause 31 should be deleted from the Bill. As I have said, the current drafting could leave problematic platforms out of scope if they were to claim that they did not have a significant number of child users. It should be assumed that platforms are within the scope of the child safety duties unless they can provide evidence that children cannot access their sites, for example through age verification tools.

Although clause 33 provides Ofcom with the power to determine that a platform is likely to be accessed by children, this will necessitate Ofcom acting on a company-by-company basis to bring problematic sites back into scope of the child safety duties. That will take considerable time, and it will delay children receiving protection. It would be simpler to remove the child user condition from clause 31, as I have argued.

It is welcome that schedule 3 specifies the timing of service providers’ risk assessments and children’s access assessments. Three months from the publication of Ofcom guidance to the completion of the service assessments is ample time. What is concerning, as we have heard from contributions this morning, is the long delay that children have already faced in gaining protections online. We know that the situation has become very bad.

As I understand it, the duties on Ofcom to provide the necessary guidance on risk assessments and children’s access assessments will come into force only on such a date as the Secretary of State may, by regulations, appoint, because the measure is not one of those listed in clause 193(1). That means that children and adults may continue to be exposed to harm for a significant further stretch of time. Can the Minister offer any clarification as to when Ofcom will be required to publish guidance? After the disappointing failure to implement part 3 of the Digital Economy Act 2017, what reassurances can the Minister offer that this regime will come into effect as soon as possible?

I would have been quite happy to move the amendment, but I do not think the Opposition would have been terribly pleased with me if I had stolen it. I have got my name on it, and I am keen to support it.

As I have said, I met the NSPCC yesterday, and we discussed how clause 31(3) might work, should the Minister decide to keep it in the Bill and not accept the amendment. There are a number of issues with the clause, which states that the child user condition is met if

“a significant number of children”

are users of the service, or if the service is

“likely to attract a significant number of users who are children”.

I do not understand how that could work. For example, a significant number of people who play Fortnite are adults, but a chunk of people who play it are kids. If some sort of invisible percentage threshold is applied in such circumstances, I do not know whether that threshold will be met. If only 20% of Fortnite users are kids, and that amounts only to half a million children, will that count as enough people to meet the child access assessment threshold?

Fortnite is huge, but an appropriate definition is even more necessary for very small platforms and services. With the very far-right sites that we have mentioned, it may be that only 0.5% of their users are children, and that may amount only to 2,000 children—a very small number. Surely, because of the risk of harm if children access these incredibly damaging and dangerous sites that groom people for terrorism, they should have a duty to meet the child access requirement threshold, if only so that we can tell them that they must have an age verification process—they must be able to say, “We know that none of our users are children because we have gone through an age verification process.” I am keen for children to be able to access the internet and meet their friends online, but I am keen for them to be excluded from these most damaging sites. I appreciate the action that the Government have taken in relation to pornographic content, but I do not think that this clause allows us to go far enough in stopping children accessing the most damaging content that is outwith pornographic content.

The other thing that I want to raise is about how the number of users will be calculated. The Minister made it very clear earlier on, and I thank him for doing so, that an individual does not have to be a registered user to be counted as a user of a site. People can be members of TikTok, for example, only if they are over 13. TikTok has some hoops in place—although they are not perfect—to ensure that its users are over 13, and to be fair, it does proactively remove users that it suspects are under 13, particularly if they are reported. That is a good move.

My child is sent links to TikTok videos through WhatsApp, however. He clicks on the links and is able to watch the videos, which will pop up in the WhatsApp mini-browser thing or in the Safari browser. He can watch the videos without signing up as a registered user of TikTok and without using the platform itself—the videos come through Safari, for example, rather than through the app. Does the Minister expect that platforms will count those people as users? I suggest that the majority of people who watch TikTok by those means are doing so because they do not have a TikTok account. Some will not have accounts because they are under 13 and are not allowed to by TikTok or by the parental controls on their phones.

My concern is that, if the Minister does not provide clarity on this point, platforms will count just the number of registered users, and will say, “It’s too difficult for us to look at the number of unregistered users, so in working out whether we meet the criteria, we are not even going to consider people who do not access our specific app or who are not registered users in some way, shape or form.” I have concerns about the operation of the provisions and about companies using that “get out of jail free” card. I genuinely believe that the majority of those who access TikTok other than through its platform are children and would meet the criteria. If the Minister is determined to keep subsection (3) and not accept the amendment, I feel that he should make it clear that those users must be included in the counting by any provider assessing whether it needs to fulfil the child safety duties.

I agree with the hon. Lady’s important point, which feeds into the broader question of volume versus risk—no matter how many children see something that causes harm and damage, one is one too many—and the categorisation of service providers into category 1, category 2A and category 2B. The depth of the risk is the problem, rather than the number of people who might be affected. The hon. Lady also alluded to age verification—I am sure we will come to that at some point—which is another can of worms. The important point, which she made well, is about volume versus risk. The point is not how many children see something; even if only a small number of children see something, the damage has been done.

I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.

I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.

I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention behind the provision is followed through, rather than being an intention that is not actually translated into law?

Colleagues have spoken eloquently to the purpose and effect of the various clauses and schedule 3—the stand part component of this group. On schedule 3, the shadow Minister, the hon. Member for Worsley and Eccles South, asked about timing. The Government share her desire to get this done as quickly as possible. In its evidence a couple of weeks ago, Ofcom said it would be publishing its road map before the summer, which would set out the timetable for moving all this forward. We agree that that is extremely important.

I turn to one or two questions that arose on amendment 22. As always, the hon. Member for Aberdeen North asked a number of very good questions. The first was whether the concept of a “significant number” applied to a number in absolute terms or a percentage of the people using a particular service, and which is looked at when assessing what is significant. The answer is that it can be either—either a large number in absolute terms, by reference to the population of the whole United Kingdom, or a percentage of those using the service. That is expressed in clause 31(4)(a). Members will note the “or” there. It can be a number in proportion to the total UK population or the proportion using a service. I hope that answers the hon. Member’s very good question.

My concern is about services that meet neither of those criteria—they do not meet the “significant number” criterion in percentage terms because, say, only 0.05% of their users are children, and they do not meet it in population terms, because they are a pretty small platform with, say, only 1,000 child users—but whose child users are at very high risk because of the nature of the platform or the service provided. My concern is for those at highest risk where neither criterion is met and the service does not have to bother conducting any sort of age verification or access requirements.

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

No, is the short answer. I was just mentioning in passing that there is that drafting issue.

On the principle, it is worth being very clear that, when it comes to content or matters that are illegal, that applies to all platforms, regardless of size, where children are at all at risk. In schedule 6, we set out a number of matters—child sexual exploitation and abuse, for example—as priority offences that all platforms have to protect children from proactively, regardless of scale.

Of course, anything to do with children that is illegal falls under the legal duties that we have discussed already. Anything that touches on illegality is covered, notwithstanding this clause, which deals with topics where the subject, act or content is not illegal. It is important to keep that in mind.

Other areas include gambling, which the shadow Minister mentioned. There is separate legislation—very strong legislation—that prohibits children from being involved in gambling. That stands independently of this Bill, so I hope that the Committee is assured—

The Minister has not addressed the points I raised. I specifically raised—he has not touched on this—harmful pro-anorexia blogs, which we know are dangerous but are not in scope, and games that children access that increase gambling addiction. He says that there is separate legislation for gambling addiction, but families have lost thousands of pounds through children playing games linked to gambling addiction. There are a number of other services that do not affect an appreciable number of children, and the drafting causes them to be out of scope.

There is no hard and fast rule about moving the Adjournment motion. It is up to the Government Whip.

I have a few more things to say, but I am happy to finish here if it is convenient.

Ordered, That the debate be now adjourned.—(Steve Double.)

Adjourned till this day at Two o’clock.

Online Safety Bill (Eighth sitting)

The Committee consisted of the following Members:

Chairs: Sir Roger Gale, † Christina Rees

† Ansell, Caroline (Eastbourne) (Con)

† Bailey, Shaun (West Bromwich West) (Con)

† Blackman, Kirsty (Aberdeen North) (SNP)

† Carden, Dan (Liverpool, Walton) (Lab)

† Davies-Jones, Alex (Pontypridd) (Lab)

† Double, Steve (St Austell and Newquay) (Con)

† Fletcher, Nick (Don Valley) (Con)

† Holden, Mr Richard (North West Durham) (Con)

† Keeley, Barbara (Worsley and Eccles South) (Lab)

† Leadbeater, Kim (Batley and Spen) (Lab)

† Miller, Dame Maria (Basingstoke) (Con)

† Mishra, Navendu (Stockport) (Lab)

Moore, Damien (Southport) (Con)

† Nicolson, John (Ochil and South Perthshire) (SNP)

† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

† Russell, Dean (Watford) (Con)

† Stevenson, Jane (Wolverhampton North East) (Con)

Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks

† attended the Committee

Public Bill Committee

Thursday 9 June 2022

(Afternoon)

[Christina Rees in the Chair]

Online Safety Bill

Clause 31

Children’s access assessments

Amendment proposed (this day): 22, in clause 31, page 31, line 17, leave out subsection (3).—(Barbara Keeley.)

This amendment removes the condition that applies a child use test to a service or part of a service.

Question again proposed, That the amendment be made.

I remind the Committee that with this we are discussing the following:

Clause stand part.

Clause 32 stand part.

That schedule 3 be the Third schedule to the Bill.

Clause 33 stand part.

When the sitting was suspended for lunch, I was concluding my remarks and saying that where children are the victim of illegal activity or illegal content, all of that is covered in other aspects of the Bill. For areas such as gambling, we have separate legislation that protects children. In relation to potentially harmful content, the reason there is a “significant number” test for the child user condition that we are debating is that, without it, platforms that either do not have any children accessing them or have nothing of any concern on them—such as a website about corporation tax—would have an unduly burdensome and disproportionate obligation placed on them. That is why there is the test—just to ensure that there is a degree of proportionality in these duties. We find similar qualifications in other legislation, including the way the age-appropriate design code works. Therefore, I respectfully resist the amendment.

Question put, That the amendment be made.

Clause 31 ordered to stand part of the Bill.

Clause 32 ordered to stand part of the Bill.

Schedule 3 agreed to.

Clause 33 ordered to stand part of the Bill.

Clause 34

Duties about fraudulent advertising: Category 1 services

With this it will be convenient to discuss the following:

Amendment 24, in clause 35, page 34, line 34, after “service” insert “that targets users”.

New clause 5—Duty to distinguish paid-for advertisements

“(1) A provider of a Category 2A service must operate the service using systems and processes designed to clearly distinguish to users of that service paid-for advertisements from all other content appearing in or via search results of the service.

(2) The systems and processes described under subsection (1)—

(a) must include clearly displaying the words “paid-for advertisement” next to any paid-for advertisement appearing in or via search results of the service, and

(b) may include measures such as but not limited to the application of colour schemes to paid-for advertisements appearing in or via search results of the service.

(3) The reference to paid-for advertisements appearing “in or via search results of a search service” does not include a reference to any advertisements appearing as a result of any subsequent interaction by a user with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend to the design, operation and use of a Category 2A service that hosts paid-for advertisements targeted at users of that service in the United Kingdom.

(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).

(7) For the meaning of “paid-for advertisement”, see section 189 (interpretation: general).”

New clause 6—Duty to verify advertisements

“(1) A provider of a Category 2A service must operate an advertisement verification process for any relevant advertisement appearing in or via search results of the service.

(2) In this section, “relevant advertisement” means any advertisement for a service or product to be designated in regulations made by the Secretary of State.

(3) The verification process under subsection (1) must include a requirement for advertisers to demonstrate that they are authorised by a UK regulatory body.

(4) In this section, “UK regulatory body” means a UK regulator responsible for the regulation of a particular service or product to be designated in regulations made by the Secretary of State.

(5) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).

(7) Regulations under this section shall be made by statutory instrument.

(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”

I begin by thanking my hon. Friend the Member for Washington and Sunderland West (Mrs Hodgson) for her work on drafting these amendments and others relating to this chapter, which I will speak to shortly. She has campaigned excellently over many years in her role as chair of the all-party parliamentary group on ticket abuse. I attended the most recent meeting of that group back in April to discuss what we need to see changed in the Bill to protect people from scams online. I am grateful to those who have supported the group and the anti-ticket touting campaign for their insights.

It is welcome that, after much flip-flopping, the Government have finally conceded to Labour’s calls and those of many campaign groups to include a broad duty to tackle fraudulent advertising on search engines through chapter 5 of part 3 of the Bill. We know that existing laws to protect consumers in the online world have failed to keep pace with the actors attempting to exploit them, and that is particularly true of scams and fraudulent advertisements.

Statistics show a steep increase in this type of crime in the online world, although those figures are likely to be a significant underestimate and do not capture the devastating emotional impact that scams have on their victims. The scale of the problem is large and it is growing.

The Financial Conduct Authority estimates that fraud costs the UK up to £190 billion a year, with 86% of that fraud committed online. We know those figures are increasing. The FCA more than doubled the number of scam warnings it issued between 2019 and 2020, while UK Finance data shows that there has been a significant rise in cases across all scam types as criminals adapt to targeting victims online. The pandemic, which led to a boom in internet shopping, created an environment ripe for exploitation. Reported incidents of scams and fraud have increased by 41% since before the pandemic, with one in 10 of us now victims of fraud.

Being scammed can cause serious psychological harm. Research by the Money and Mental Health Policy Institute suggests that three in 10 online scam victims felt depressed as a result of being scammed, while four in 10 said they felt stressed. Clearly, action to tackle the profound harms that result from fraudulent advertising is long overdue.

This Bill is an important opportunity but, as with other issues the Government are seeking to address, we need to see changes if it is to be successful. Amendments 23 and 24 are small and very simple, but would have a profound impact on the ability of the Bill to prevent online fraud from taking place and to protect UK users.

As currently drafted, the duties set out in clauses 34 and 35 for category 1 and 2A services extend only to the design, operation and use of a category 1 or 2A service in the United Kingdom. Our amendments would mean that the duties extended to the design, operation and use of a category 1 or 2A service that targets users in the United Kingdom. That change would make the Bill far more effective, because it would reduce the risk of a company based overseas being able to target UK consumers without any action being taken against them—being allowed to target the public fraudulently without fear of disruption.

That would be an important change, because paid-for advertisements function by the advertiser stating where in the world, by geographical location, they wish to target consumers. For instance, a company would be able to operate from Hong Kong and take out paid-for advertisements to target consumers just in one particular part of north London. The current wording of the Bill does not acknowledge the fact that internet services can operate from anywhere in the world and use international boundaries to circumvent UK legislation.

Other legislation has been successful in tackling scams across borders. I draw the Committee’s attention to the London Olympic Games and Paralympic Games Act 2006, which made it a crime to sell a ticket to the Olympics on the black market anywhere in the world, rather than simply in the UK where the games took place. I suggest that we should learn from the action taken to regulate the Olympics back in 2012 and implement the same approach through amendments 23 and 24.

New clause 5 was also tabled by my hon. Friend the Member for Washington and Sunderland West, who will be getting a lot of mentions this afternoon.

New clause 5 would tackle one of the reasons people become subject to fraud online by introducing a duty for search engines to ensure that all paid-for search advertisements should be made to look distinct from non-paid-for search results. When bad actors are looking to scam consumers, they often take out paid-for advertising on search results, so that they can give consumers the false impression that their websites are official and trustworthy.

Paid search results occur when companies pay a charge to have their site appear at the top of search results. This is valuable to them because it is likely to direct consumers towards their site. The new clause would stop scam websites buying their way to the top of a search result.

Let me outline some of the consequences of not distinguishing between paid-for and not-paid-for advertisements, because they can be awful. Earlier this year, anti-abortion groups targeted women who were searching online for a suitable abortion clinic. The groups paid for the women to have misleading adverts at the top of their search that directed them towards an anti-abortion centre rather than a clinic. One woman who knew that she wanted to have an abortion went on researching where she could have the procedure. Her search for a clinic on Google led her to an anti-abortion centre that she went on to contact and visit. That was because she trusted the top search results on Google, which were paid for. The fact that it was an advertisement was indicated only by the two letters “AD” appearing in very small font underneath the search headline and description.

Another example was reported by The Times last year. Google had been taking advertising money from scam websites selling premier league football tickets, even though the matches were taking place behind closed doors during lockdown. Because these advertisements appeared at the top of search results, it is entirely understandable that people looking for football tickets were deceived into believing that they would be able to attend the games, which led to them being scammed.

There have been similar problems with passport renewals. As colleagues will be very aware, people have been desperately trying to renew their passports amid long delays because of the backlog of cases. This is a target for fraudsters, who take out paid advertisements to offer people assistance with accessing passport renewal services and then scam them.

New clause 5 would end this practice by ensuring that search engines provide clear messaging to show that the user is looking at a paid-for advertisement, by stating that clearly and through other measures, such as a separate colour scheme. A duty to distinguish paid-for advertising is present in many other areas of advertising. For example, when we watch TV, there is no confusion between what is a programme and what is an advert; the same is true of radio advertising; and when someone is reading a newspaper or magazine, the line between journalism and the advertisements that fund the paper is unmistakable.

We cannot continue to have these discrepancies and be content with the internet being a wild west. Therefore, it is clear that advertising on search engines needs to be brought into line with advertising in other areas, with a requirement on search engines to distinguish clearly between paid-for and organic results.

New clause 6 is another new clause tabled by my hon. Friend the Member for Washington and Sunderland West. It would protect consumers from bad actors trying to exploit them online by placing a duty on search engines to verify adverts before they accept them. That would mean that, before their adverts were allowed to appear in a paid-for search result, companies would have to demonstrate that they were authorised by a UK regulatory body designated by the Secretary of State.

This methodology for preventing fraud is already in process for financial crime. Google only accepts financial services advertisements from companies that are authorised by the Financial Conduct Authority. This gives companies a further incentive to co-operate with regulators and it protects consumers by preventing companies that are well known for their nefarious activities from dominating search results and then misleading consumers. By extending this best practice to all advertisements, search engines would no longer be able to promote content that is fake or fraudulent after being paid to do so.

Without amending the Bill in this way, we risk missing an opportunity to tackle the many forms of scamming that people experience online, one of which is the world of online ticketing. In my role as shadow Minister for the arts and civil society, I have worked on this issue and been informed by the expertise of my hon. Friend the Member for Washington and Sunderland West.

In the meeting of the all-party parliamentary group on ticket abuse in April, we heard about the awful consequences of secondary ticket reselling practices. Ticket reselling websites, such as Viagogo, are rife with fraud. Large-scale ticket touts dominate the resale site, and Viagogo has a well-documented history of breaching consumer protection laws. Those breaches include a number of counts of fraud for selling non-existent tickets. Nevertheless, Viagogo continues to take out paid-for advertisements with Google and is continually able to take advantage of consumers by dominating search results and commanding false trust.

If new clause 6 is passed, then secondary ticketing websites such as Viagogo would have to be members of a regulatory body responsible for secondary ticketing, such as the Society of Ticket Agents and Retailers, or STAR. Viagogo would then have to comply with STAR standards for its business model to be successful.

I have used ticket touting as an example, but the repercussions of this change would be wider than that. Websites that sell holidays and flights, such as Skyscanner, would have to be a member of the relevant regulatory group, for example the Association of British Travel Agents. People would be able to go to football matches, art galleries and music festivals without fearing that they are getting ripped off or have been issued with fake tickets.

I will describe just a few examples of the poor situation we are in at the moment, to illustrate the need for change. The most heartbreaking one is of an elderly couple who bought two tickets from a secondary ticketing website to see their favourite artist, the late Leonard Cohen, to celebrate their 70th wedding anniversary. When the day came around and they arrived at the venue, they were turned away and told they had been sold fake tickets. The disappointment they must have felt would have been very hard to bear. In another instance, a British soldier serving overseas decided to buy his daughter concert tickets because he could not be with her on her birthday. When his daughter went along to the show, she was turned away at the door and told she could not enter because the tickets had been bought through a scam site and were invalid.

It is clear that the human impact of inaction is too great to ignore. Not only are victims scammed out of their money, but they go through intense stress and experience shame and humiliation. The Government have accepted the urgent need for action by following the advice of campaigners and the Joint Committee in including fraudulent advertising in the Bill, but more must be done if we are to prevent online fraud. By requiring search engines to verify advertisers before accepting their money, traders such as Viagogo will have an incentive to act responsibly and to comply with regulatory bodies.

I rise to agree with all the amendments in this group that have been tabled by the Opposition. I want to highlight a couple of additional groups who are particularly at risk in relation to fraudulent advertising. One of those is pensioners and people approaching pension age. Because of the pension freedoms that are in place, we have a lot of people making uninformed decisions about how best to deal with their pensions, and sometimes they are able to withdraw a significant amount of money in one go. For an awful lot of people, withdrawing that money and paying the tax on it leads to a major financial loss—never mind the next step that they may take, which is to provide the money to fraudsters.

For pensioners in particular, requiring adverts to be clearly different from other search results would make a positive difference. The other thing that we have to remember is that pensioners generally did not grow up online, and some of them struggle more to navigate the internet than some of us who are a bit younger.

I speak with some experience of this issue, because I had a constituent who was a pensioner and who was scammed out of £20,000—her life savings. Does my hon. Friend realise that it is sometimes possible to pressurise the banks into returning the money? In that particular case, I got the money back for my constituent by applying a great deal of pressure on the bank, and it is worth knowing that the banks are susceptible to a bit of publicity. That is perhaps worth bearing in mind, because it is a useful power that we have as Members of Parliament.

I thank my hon. Friend for his public service announcement. His constituent is incredibly lucky that my hon. Friend managed to act in that way and get the money back to her, because there are so many stories of people not managing to get their money back and losing their entire life savings as a result of scams. It is the case that not all those scams take place online—people can find scams in many other places—but we have the opportunity with the Bill to take action on scams that are found on the internet.

The other group I want to mention, and for whom highlighting advertising could make a positive difference, is people with learning disabilities. People with learning disabilities who use the internet may not understand the difference between adverts and search results, as the hon. Member for Worsley and Eccles South mentioned. They are a group who I would suggest are particularly susceptible to fraudulent advertising.

We are speaking a lot about search engines, but a lot of fraudulent advertising takes place on Facebook and so on. Compared with the majority of internet users, there is generally an older population on such sites, and the ability to tackle fraudulent advertising there is incredibly useful. We know that the sites can do it, because there are rules in place now around political advertising on Facebook, for example. We know that it is possible for them to take action; it is just that they have not yet taken proper action.

I am happy to support the amendments, but I am also glad that the Minister has put these measures in the Bill, because they will make a difference to so many of our constituents.

I thank the hon. Member for Aberdeen North for her latter remarks. We made an important addition to the Bill after listening to parliamentarians across the House and to the Joint Committee, which many people served on with distinction. I am delighted that we have been able to make that significant move. We have heard a lot about how fraudulent advertising can affect people terribly, particularly more vulnerable people, so that is an important addition.

Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already covered, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets, or the only target market. Given the reference to “target markets” in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.

New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government are going to work through the online advertising programme to tackle these sorts of issues, which are important. The shadow Minister is right to raise them, but they will be tackled holistically by the online advertising programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where those are best addressed.

New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.

We are going to press amendments 23 and 24 to a vote because they are very important. I cited the example of earlier legislation that considered it important, in relation to selling tickets, to include the wording “anywhere in the world”. We know that ticket abuses happen with organisations in different parts of the world.

The hon. Lady is perfectly entitled to press to a vote whatever amendments she sees fit, but in relation to amendments 23 and 24, the words she asks for,

“where the UK is a target market”,

are already in the Bill, in clause 3(5)(b), on page 3, which sets out the definitions at the start. I will allow the hon. Lady a moment to look at where it states:

“United Kingdom users form one of the target markets for the service”.

That applies to user-to-user and to search, so it is covered already.

The problem is that we are getting into the wording of the Bill. As with the child abuse clause that we discussed before lunch, there are limitations. Clause 3 states that a service has links with the United Kingdom if

“the service has a significant number of United Kingdom users”.

It does not matter if a person is one of 50, 100 or 1,000 people who get scammed by some organisation operating in another part of the world. The 2006 legislation dealing with the sale of Olympic tickets treated that as important, and we also believe it is important. We have to find a way of dealing with ticket touting and ticket abuse.

Turning to fraudulent advertising, I have given examples and been supported very well by the hon. Member for Aberdeen North. It is not right that vulnerable people are repeatedly taken in by search results, which is the case right now. The reason we have tabled all these amendments is that we are trying to protect vulnerable people, as with every other part of the Bill.

That is of course our objective as well, but let me just return to the question of the definitions. The hon. Lady is right that clause 3(5)(a) says

“a significant number of United Kingdom users”,

but paragraph (b) just says,

“United Kingdom users form one of the target markets”.

There is no significant number qualification in paragraph (b), and to put it beyond doubt, clause 166(1) makes it clear that service providers based outside the United Kingdom are within the scope of the Bill. To reiterate the point, where the UK is a target market, there is no size qualification: the service provider is in scope, even if it has only one user.

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to discuss the following:

Amendment 45, in clause 35, page 34, line 2, leave out subsection (1) and insert—

“(1) A provider of a Category 2A service must operate the service using proportionate systems and processes designed to—

(a) prevent individuals from encountering content consisting of fraudulent advertisements by means of the service;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.”

This amendment brings the fraudulent advertising provisions for Category 2A services in line with those for Category 1 services.

Government amendments 91 to 94.

Clause 35 stand part.

Amendment 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Clause 36 stand part.

I am aware that the Minister has reconsidered the clause and tabled a Government amendment that is also in this group, with the same purpose as our amendment 45. That is welcome, as there was previously no justifiable reason why the duties on category 1 services and category 2A services were misaligned.

All three of the duties on category 1 services introduced by clause 34 are necessary to address the harm caused by fraudulent and misleading online adverts. Service providers need to take proportionate but effective action to prevent those adverts from appearing or reappearing, and when they do appear, those service providers need to act quickly by swiftly taking them down. The duties on category 2A services were much weaker, requiring them only to minimise the risk of individuals encountering content consisting of fraudulent advertisements in or via search results of the service. There was no explicit reference to prevention, even though that is vital, or any explicit requirement to act quickly to take harmful adverts down.

That difference would have created an opportunity for fraudsters to exploit by focusing on platforms with lesser protections. It could have resulted in an increase in fraud enabled by paid-for advertising on search services, which would have undermined the aims of the Bill. I am glad that the Government have recognised this and will require the same proactive, preventative response to harmful ads from regulated search engines as is required from category 1 services.

I will now speak to amendment 44, which focuses on the loophole that exists with regard to harm resulting from exposure to fraudulent and misleading advertising for debt help and solutions. The debt advice charity StepChange told us that as many as 15% of people searching for StepChange and other debt advice charities online are routed away by deceptive adverts, resulting in a staggering 1.7 million click-throughs every year. These adverts impersonate the names and branding of the charities and make misleading claims about the services on offer. People exposed to these adverts will be people needing debt advice who will often be under intense emotional and financial pressure. They can therefore be very vulnerable to scammers who then push them towards unsuitable services for a fee.

Debt advice charities, including StepChange and the Money Advice Trust, have been working hard to tackle these impersonator ads. For instance, StepChange reported 72 adverts to the tech giants and regulators last year for misleading and harmful practices, only some of which the Advertising Standards Authority has issued rulings against. StepChange and the Money Advice Trust are keen to have the safeguards in place that are needed by the people who are most vulnerable to harm and exploitation, yet in the current drafting of the Bill harmful adverts on debt advice could slip through the net.

The conditions for an advert to be defined as fraudulent are set out in clause 34(3) for category 1 services and clause 35(3) for category 2A search services. Both clauses specify that an advert is fraudulent if it amounts to an offence set out in clause 36. Clause 36 lists a series of offences gathered from financial services legislation and the Fraud Act 2006.

Charities are concerned that fraudulent debt advice advertisements will not be captured by the offences set out in clause 36(2), which come from the Financial Services and Markets Act 2000 and relate to persons unauthorised by the Financial Conduct Authority carrying on an activity that is regulated under the Act. While providing debt counselling and debt adjusting are regulated activities, brokering debt solutions is not. Therefore, the offences listed in the Bill would not seem to capture the unregulated advertisers behind misleading adverts, including those that impersonate debt advice charities.

Furthermore, the explanatory notes for the offences taken from the Financial Services Act 2012 show that these offences appear to be intended to address financial market abuse, and so seem somewhat at a distance from the harm consumers face from fraudulent online ads for debt help services.

Clause 36(3) lists offences under the Fraud Act 2006. This could capture harmful advertisements for debt help and debt solutions, but it is not completely clear that these provisions capture, or best capture, the nature of unfair practice caused by misleading online adverts for debt solutions. The Government’s announcement on 8 March outlined that fraudulent paid-for online adverts would be included in this Bill. However, they drew a distinction between “fraudulent adverts”, to be covered by the Bill, and “misleading adverts”, which will be considered in the online advertising consultation. In reality, this dividing line is not clear cut, even where the Bill seeks to define “fraudulent adverts” in terms of offences in other legislation.

Amendment 44 seeks to align clause 36 offences better with important existing consumer protection legislation. It would insert further offences into clause 36, to include offences that are contained in part 3 of the existing Consumer Protection from Unfair Trading Regulations 2008. Those regulations are a key piece of consumer protection legislation. Part 3 of the regulations creates offences relating to misleading or aggressive practices. Most relevant here would be the regulation 9 offence of contravening the prohibition on “misleading actions”, which states that something is a misleading practice if it fulfils one of two conditions. The first is that it both contains “false information” and is likely to cause “the average consumer” to take a decision they would not otherwise have taken. The second is that it causes “confusion” with other products or trade names.

It has been pointed out that these regulations by themselves have not stopped vulnerable consumers being exposed to adverts of misleading debt solutions, despite the best efforts of regulators and charities to stop them. Adding offences under the consumer protection regulations to the Bill would finally close the net.

There should be no objection from the Government to this amendment. Through the consumer protection regulations, they have already recognised misleading commercial practices as an offence, including promotions that mislead consumers or create confusion over trade names. We therefore have a situation where harmful debt adverts meet the criteria for an offence under the consumer protection regulations, but might not meet the Fraud Act 2006 provisions in the Online Safety Bill. The amendment seeks to clarify and align the treatment of misleading debt adverts, which can be so harmful to people.

I admit that these amendments can get very technical, but it is important that I finish by talking about the impact of these scams on people’s lives. I want to talk about the experience of a woman who was recommended to StepChange’s debt advice services but clicked on a copycat debt ad from a firm masquerading as StepChange in the online search results. After entering her personal information into what she thought was a genuine website, the woman was pestered by phone calls into setting up an individual voluntary arrangement, or IVA, and made a series of payments worth £650 that were meant for her creditors. Sadly, it was only after contact from her bank, four months later, that the woman realised the debt firm she had clicked on was a scam.

The Bill offers a chance to establish an important principle. People should be able to have confidence that the links they click on are for reputable regulated advice services. People should not have to be constantly on their guard against scams and other misleading promotions found on social media websites and in top-of-the-page search results. Without this amendment and the others to this chapter, we cannot be sure that those outcomes will be achieved.

As we have heard already, these clauses are very important because they protect people from online fraudulent advertisements for the first time—something that the whole House quite rightly called for. As the shadow Minister said, the Government heard Parliament’s views on Second Reading, and the fact that the duties in clause 35 were not as strongly worded as those in clause 34 was recognised. The Government heard what Members said on Second Reading and tabled Government amendments 91 to 94, which make the duties on search firms in clause 35 as strong as those on user-to-user firms in clause 34. Opposition amendment 45 would essentially do the same thing, so I hope we can adopt Government amendments 91 to 94 without needing to move amendment 45. It would do exactly the same thing—we are in happy agreement on that point.

I listened carefully to what the shadow Minister said on amendment 44. The example she gave at the end of her speech—the poor lady who was induced into sending money, which she thought was being sent to pay off creditors but was, in fact, stolen—would, of course, be covered by the Bill as drafted, because it would count as an act of fraud.

The hon. Lady also talked about some other areas that were not fraud, such as unfair practices, misleading statements or statements that were confusing, which are clearly different from fraud. The purpose of clause 35 is to tackle fraud. Those other matters are, as she says, covered by the Consumer Protection from Unfair Trading Regulations 2008, which are overseen and administered by the Competition and Markets Authority. While matters to do with unfair, misleading or confusing content are serious—I do not seek to minimise their importance—they are overseen by a different regulator and, therefore, better handled by the CMA under its existing regulations.

If we introduce this extra offence to the list in clause 36, we would end up having a degree of regulatory overlap and confusion, because there would be two regulators involved. For that reason, and because those other matters—unfair, misleading and confusing advertisements—are different from fraud, I ask that the Opposition withdraw amendment 44 and, perhaps, take it up on another occasion when the CMA’s activities are in the scope of the debate.

No, we want to press this amendment to a vote. I have had further comment from the organisations that I quoted. They believe that we do need the amendment because it is important to stop harmful ads going up in the first place. They believe that strengthened provisions are needed for that. Guidance just puts the onus for protecting consumers on the other regulatory regimes that the Minister talked about. The view of organisations such as StepChange is that those regimes—the Advertising Standards Authority regime—are not particularly strong.

The regulatory framework for financial promotions is fragmented. FCA-regulated firms are clearly under much stronger obligations than those that fall outside FCA regulation. I believe that it would be better to accept the amendment, which would oblige search engines and social media giants to prevent harmful and deceptive ads from appearing in the first place. The Minister really needs to take on board the fact that in this patchwork, this fragmented world of different regulatory systems, some of the existing systems are clearly failing badly, and the strong view of expert organisations is that the amendment is necessary.

Question put and agreed to.

Clause 34 accordingly ordered to stand part of the Bill.

Clause 35

Duties about fraudulent advertising: Category 2A services

Amendments made: 91, in clause 35, page 34, line 3, leave out from “to” to end of line 5 and insert—

“(a) prevent individuals from encountering content consisting of fraudulent advertisements in or via search results of the service;

(b) if any such content may be encountered in or via search results of the service, minimise the length of time that that is the case;

(c) where the provider is alerted by a person to the fact that such content may be so encountered, or becomes aware of that fact in any other way, swiftly ensure that individuals are no longer able to encounter such content in or via search results of the service.”

This amendment alters the duty imposed on providers of Category 2A services relating to content consisting of fraudulent advertisements so that it is in line with the corresponding duty imposed on providers of Category 1 services by clause 34(1).

Amendment 92, in clause 35, page 34, line 16, leave out “reference” and insert “references”.

This amendment is consequential on Amendment 91.

Amendment 93, in clause 35, page 34, line 18, leave out “is a reference” and insert “are references”.

This amendment is consequential on Amendment 91.

Amendment 94, in clause 35, page 34, line 22, leave out

“does not include a reference”

and insert “do not include references”.—(Chris Philp.)

This amendment is consequential on Amendment 91.

Clause 35, as amended, ordered to stand part of the Bill.

Clause 36

Fraud etc offences

Amendment proposed: 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”—(Barbara Keeley.)

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Question put, That the amendment be made.

Clause 36 ordered to stand part of the Bill.

Clause 37

Codes of practice about duties

Amendment 96 has been tabled by Carla Lockhart, who is not on the Committee. Does anyone wish to move amendment 96? No.

I beg to move amendment 65, in clause 37, page 36, line 27, at end insert—

“(ia) organisations that campaign for the removal of animal abuse content, and”.

This amendment would add organisations campaigning for the removal of animal abuse content to the list of bodies Ofcom must consult.

With this it will be convenient to discuss the following:

Amendment 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Amendment 64, in schedule 4, page 177, line 4, at end insert “and

(vii) the systems and process are appropriate to detect cruelty towards humans and animals;”.

This amendment would ensure that ensuring systems and processes are appropriate to detect cruelty towards humans and animals is one of the online safety objectives for search services.

Amendment 60, in clause 52, page 49, line 5, at end insert—

“(e) an offence, not within paragraph (a), (b) or (c), of which the subject is an animal.”

This amendment brings offences to which animals are subject within the definition of illegal content.

Amendment 59, in schedule 7, page 185, line 39, at end insert—

“Animal Welfare

22A An offence under any of the following provisions of the Animal Welfare Act 2006—

(a) section 4 (unnecessary suffering);

(b) section 5 (mutilation);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (duty of person responsible for animal to ensure welfare).

22B An offence under any of the following provisions of the Animal Health and Welfare (Scotland) Act 2006—

(a) section 19 (unnecessary suffering);

(b) section 20 (mutilation);

(c) section 21 (cruel operations);

(d) section 22 (administration of poisons);

(e) section 23 (fighting);

(f) section 24 (ensuring welfare of animals).

22C An offence under any of the following provisions of the Welfare of Animals Act (Northern Ireland) 2011—

(a) section 4 (unnecessary suffering);

(b) section 5 (prohibited procedures);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (ensuring welfare of animals).

22D For the purpose of paragraphs 22A, 22B or 22C of this Schedule, the above offences are deemed to have taken place regardless of whether the offending conduct took place within the United Kingdom, if the offending conduct would have constituted an offence under the provisions contained within those paragraphs.”

This amendment adds certain animal welfare offences to the list of priority offences in Schedule 7.

Amendment 66, in clause 140, page 121, line 8, at end insert—

“(d) causing harm to any human or animal.”

This amendment ensures groups are able to make complaints regarding animal abuse videos.

Amendment 67, in clause 140, page 121, line 20, at end insert

“, or a particular group that campaigns for the removal of harmful online content towards humans and animals”.

This amendment makes groups campaigning against harmful content eligible to make supercomplaints.

It is, as ever, a pleasure to serve under your chairship, Ms Rees. Amendment 65 would add organisations campaigning for the removal of animal abuse content to the list of bodies that Ofcom must consult. As we all know, Ofcom must produce codes of practice that offer guidance on how regulated services can comply with their duties. Later in the Bill, clause 45 makes clear that if a company complies with the codes of practice, it will be deemed to have complied with the Bill in general. In addition, the duties for regulated services come into force at the same time as the codes of practice. That all makes what the codes say extremely important.

The absence of protections relating to animal abuse content is a real omission from the Bill. Colleagues will have seen the written evidence from Action for Primates, which neatly summarised the key issues on which Labour is hoping to see agreement from the Government. Given this omission, it is clear that the current draft of the Bill is not fit for tackling animal abuse, cruelty and violence, which is all too common online.

There are no explicit references to content that can be disturbing and distressing to those who view it—both children and adults. We now know that most animal cruelty content is produced specifically for sharing on social media, often for profit through the monetisation schemes offered by platforms such as YouTube. Examples include animals being beaten, set on fire, crushed or partially drowned; the mutilation and live burial of infant monkeys; a kitten intentionally being set on by a dog and another being stepped on and crushed to death; live and conscious octopuses being eaten; and animals being pitted against each other in staged fights.

Animals being deliberately placed into frightening or dangerous situations from which they cannot escape or are harmed before being “rescued” on camera is becoming increasingly popular on social media, too. For example, kittens and puppies are “rescued” from the clutches of a python. Such fake rescues not only cause immense suffering to animals, but are fraudulent because viewers are asked to donate towards the rescue and care of the animals. This cannot be allowed to continue.

Indeed, as part of its Cancel Out Cruelty campaign, the Royal Society for the Prevention of Cruelty to Animals conducted research, which found that in 2020 there were nearly 500 reports of animal cruelty on social media. That was more than twice the figure reported for 2019. The majority of these incidents appeared on Facebook. David Allen, head of prevention and education at the RSPCA, has spoken publicly about the issue, saying:

“Sadly, we have seen an increase in recent years in the number of incidents of animal cruelty being posted and shared on social media such as Facebook, Instagram, TikTok and Snapchat.”

I totally agree with the points that the hon. Lady is making. Does she agree that the way in which the Bill is structured means that illegal acts that are not designated as “priority illegal” will likely be put at the very end of companies’ to-do lists, and that they will focus considerably more effort on what they will call “priority illegal” content?

I completely agree with and welcome the hon. Gentleman’s contribution. It is a very valid point and one that we will explore further. It shows the necessity of this harm being classed as a priority harm in order that we protect animals, as well as people.

David Allen continued:

“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”

Although the Bill has a clear focus on protecting children, we must remember that the prevalence of cruelty to animals online has the potential to have a hugely negative impact on children who may be inadvertently seeing that content through everyday social media channels.

The hon. Lady knows that I am a great animal lover, and I obviously have concerns about children being exposed to these images. I am just wondering how she would differentiate between abusive images and the images that are there to raise awareness of certain situations that animals are in. I have seen many distressing posts about the Yulin dog meat festival and about beagles being used in laboratory experiments. How would she differentiate between images that are there to raise awareness of the plight of animals and the abusive ones?

I thank the hon. Lady for her contribution. Like me, she is a passionate campaigner for animal welfare. It was a pleasure to serve on the Committee that considered her Glue Traps (Offences) Act 2022, which I know the whole House was pleased to pass. She raises a very important point, and one that the Bill explores later with regard to other types of content, such as antisemitic and racist content, in terms of education, history and fact; that content would be dealt with in the same way. We are talking about where content is used as an educational and awareness-raising tool, compared with just images and videos of direct abuse.

To give hon. Members a real sense of the extent of the issue, I would like to share some findings from a recent survey of the RSPCA’s frontline officers. These are pretty shocking statistics, as I am sure Members will all agree. Eighty-one per cent. of RSPCA frontline officers think that more abuse is being caught on camera. Nearly half think that more cases are appearing on social media. One in five officers said that one of the main causes of cruelty to animals is people hurting animals just to make themselves more popular on social media. Some of the recent cruelty videos posted on social media include a video of a magpie being thrown across the road on Instagram in June 2021; a woman captured kicking her dog on TikTok in March 2021; a teenager being filmed kicking a dog, which was shared on WhatsApp in May 2021; and videos posted on Instagram of cockerels being forced to fight in March 2021.

I am sure that colleagues will be aware of the most recent high-profile case, which was when disturbing footage was posted online of footballer Kurt Zouma attacking his cat. There was, quite rightly, an outpouring of public anger and demands for justice. Footage uploaded to Snapchat on 6 February showed Zouma kicking his Bengal cat across a kitchen floor in front of his seven-year-old son. Zouma also threw a pair of shoes at his pet cat and slapped its head. In another video, he was heard saying:

“I swear I’ll kill it.”

In sentencing him following his guilty plea to two offences under the Animal Welfare Act 2006, district judge Susan Holdham described the incident as “disgraceful and reprehensible”. She added:

“You must be aware that others look up to you and many young people aspire to emulate you.”

What makes that case even more sad is the way in which the video was filmed and shared, making light of such cruelty. I am pleased that the case has now resulted in tougher penalties for filming animal abuse and posting it on social media, thanks to new guidelines from the Sentencing Council. The prosecutor in the Zouma case, Hazel Stevens, told the court:

“Since this footage was put in the public domain there has been a spate of people hitting cats and posting it on various social media sites.”

There have been many other such instances. Just a few months ago, the most abhorrent trend was occurring on TikTok: people were abusing cats, dogs and other animals to music and encouraging others to do the same. Police officers discovered a shocking 182 videos with graphic animal cruelty on mobile phones seized during an investigation. This sickening phenomenon is on the rise on social media platforms, provoking a glamorisation of the behaviour. The videos uncovered during the investigation showed dogs prompted to attack other animals such as cats, or used to hunt badgers, deer, rabbits and birds. Lancashire police began the investigation after someone witnessed two teenagers encouraging a dog to attack a cat on an estate in Burnley in March of last year. The cat, a pet named Gatsby, was rushed to the vet by its owners once they discovered what was going on, but unfortunately it was too late and Gatsby’s injuries were fatal. The photos and videos found on the boys’ phones led the police to discover more teenagers in the area who were involved in such cruel activities. The views and interactions that the graphic footage was attracting made it even more visible, as the platform was increasing traffic and boosting content when it received attention.

It should not have taken such a high-profile case of a professional footballer with a viral video to get this action taken. There are countless similar instances occurring day in, day out, and yet the platforms and authorities are not taking the necessary action to protect animals and people from harm, or to protect the young people who seek to emulate this behaviour.

I pay tribute to the hard work of campaigning groups such as the RSPCA, Action for Primates, Asia for Animals Coalition and many more, because they are the ones who have fought to keep animal rights at the forefront. The amendment seeks to ensure that such groups are given a voice at the table when Ofcom consults on its all-important codes of practice. That would be a small step towards reducing animal abuse content online, and I hope the Minister can see the merits in joining the cause.

I turn to amendment 60, which would bring offences to which animals are subject within the definition of illegal content, a point raised by the hon. Member for Ochil and South Perthshire. The Minister will recall the Animal Welfare (Sentencing) Act 2021, which received Royal Assent last year. Labour was pleased to see the Government finally taking action against those who commit animal cruelty offences offline. The maximum prison sentence for animal cruelty was increased from six months to five years, and the Government billed that move as taking a firmer approach to cases such as dog fighting, abuse of puppies and kittens, illegally cropping a dog’s ears and gross neglect of farm animals. Why, then, have the Government failed to include offences against animals within the scope of illegal content online? We want parity between the online and offline space, and that seems like a stark omission from the Bill.

Placing obligations on service providers to remove animal cruelty content should fall within both the spirit and the scope of the Bill. We all know that the scope of the Bill is to place duties on service providers to remove illegal and harmful content, placing particular emphasis on the exposure of children. Animal cruelty content is a depiction of illegality and also causes significant harm to children and adults.

If my inbox is anything to go by, all of us here today know what so many of our constituents up and down the country feel about animal abuse. It is one of the most popular topics that constituents contact me about. Today, the Minister has a choice to make about his Government’s commitment to preventing animal cruelty and keeping us all safe online. I hope he will see the merit in acknowledging the seriousness of animal abuse online.

Amendment 66 would ensure that groups were able to make complaints about animal abuse videos. Labour welcomes clause 140, as the ability to make super-complaints is a vital part of our democracy. However, as my hon. Friend the Member for Worsley and Eccles South and other Members have mentioned, the current definition of an “eligible entity” is far too loose. I have set out the reasons as to why the Government must go further to limit and prevent animal abuse content online. Amendment 66 would ensure that dangerous animal abuse content is a reasonable cause for a super-complaint to be pursued.

The shadow Minister raises important issues to do with animal cruelty. The whole House and our constituents feel extremely strongly about this issue, as we know. She set out some very powerful examples of how this terrible form of abuse takes place.

To some extent, the offences are in the Bill’s scope already. It covers, for example, extreme pornography. Given that the content described by the hon. Lady would inflict psychological harm on children, it is, to that extent, in scope.

The hon. Lady mentioned the Government’s wider activities to prevent animal cruelty. That work goes back a long time and includes the last Labour Government’s Animal Welfare Act 2006. She mentioned the more recent update to the criminal sentencing laws that increased by a factor of 10 the maximum sentence for cruelty to animals. It used to be six months and has now been increased to up to five years in prison.

In addition, just last year the Department for Environment, Food and Rural Affairs announced an action plan for animal welfare, which outlines a whole suite of activities that the Government are taking to protect animals in a number of different areas—sentience, international trade, farming, pets and wild animals. That action plan will be delivered through a broad programme of legislative and non-legislative work.

I mentioned some of the ways the Bill will assist with looking after animals. We are concerned to make sure that the Bill delivers its core intent: to protect children, to protect humans from illegal activity, and to stop the priority offences. Given that that is the objective, and given everything else I have just said about the other work that is going on—much of which is effective, as demonstrated by the prosecution of Kurt Zouma just a week or two ago—we do not feel able to accept the amendments as drafted. However, it is an area that I am sure is of concern to Members across the House, and now that the shadow Minister has raised the question, we will certainly give further thought to it.

On the basis of the Government’s existing work on animal welfare, the effect that the Bill as drafted will have in this area, and the fact that we will give this issue some further thought, I hope that the shadow Minister will let the matter rest for now.

I thank the Minister for agreeing to look at this issue further. However, we do see it as being within the scope of the Bill, and we have the opportunity to do something about it now, so we will be pressing these amendments to a vote. If you will allow me, Ms Rees, I would also like to pay tribute to the former Member of Parliament for Redcar, Anna Turley, who campaigned tirelessly on these issues when she was a Member of the House. We would like these amendments to be part of the Bill.

Question put, That the amendment be made.

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to discuss the following:

Clause 38 stand part.

That schedule 4 be the Fourth schedule to the Bill.

New clause 20—Use of proactive technology in private messaging: report

“(1) OFCOM must produce a report—

(a) examining the case for the use of proactive technology in private messaging where the aim is to identify CSEA content; and

(b) making recommendations as to whether or not proactive technology should be used in such cases.

(2) The report must be produced in consultation with organisations that have expertise and experience in tackling CSEA.

(3) The report must be published and laid before both Houses of Parliament within six months of this Act being passed.”

On clause 37, it is welcome that Ofcom will have to prepare and issue a code of practice for service providers with duties relating to illegal content in the form of terrorism or child sexual exploitation and abuse content. The introduction of compliance measures relating to fraudulent advertising is also very welcome. We do, however, have some important areas to amend, including the role of different expert groups in assisting Ofcom during its consultation process, which I have already outlined in relation to animal cruelty.

On clause 38, Labour supports the notion that Ofcom must have specific principles to adhere to when preparing the codes of practice, and of course, the Secretary of State must have oversight of those. However, as I will touch on as we proceed, Labour feels that far too much power is given to the Secretary of State of the day in establishing those codes.

Labour believes that schedule 4 is overwhelmingly loose in its language, and we have concerns about the ability of Ofcom—try as it might—to ensure that its codes of practice are both meaningful to service providers and in compliance with the Bill’s legislative requirements. Let me highlight the schedule’s broadness by quoting from it. Paragraph 4 states:

“The online safety objectives for regulated user-to-user services are as follows”.

I will move straight to paragraph 4(a)(iv), which says:

“there are adequate systems and processes to support United Kingdom users”.

Forgive me if I am missing something here, but surely an assessment of adequacy is too subjective for these important codes of practice. Moreover, the Bill seems to have failed to consider the wide-ranging differences that exist among so-called United Kingdom users. Once again, there is no reference to future-proofing against emerging technologies. I hope that the Minister will therefore elaborate on how he sees the codes of practice and their principles, objectives and content as fit for purpose. More broadly, it is remarkable that schedule 4 is both too broad in its definitions and too limiting in some areas—we might call it a Goldilocks schedule.

I turn to new clause 20. As we have discussed, a significant majority of online child abuse takes place in private messages. Research from the NSPCC shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people whom they have not met offline before. When children are contacted by someone they do not know, in nearly three quarters of cases that takes place by private message.

Schedule 4 introduces new restrictions on Ofcom’s ability to require a company to use proactive technology to identify or disrupt abuse in private messaging. That will likely restrict Ofcom’s ability to include in codes of practice widely used industry-standard tools such as PhotoDNA and CSAI Match, which detect known child abuse images, and artificial intelligence classifiers to detect self-generated images and grooming behaviour. That raises significant questions about whether the regulator can realistically produce codes of practice that respond to the nature and extent of the child abuse threat.

As it stands, the Bill will leave Ofcom unable to require companies to proactively use technology that can detect child abuse. Instead, Ofcom will be wholly reliant on the use of CSEA warning notices under clause 103, which will enable it to require the use of proactive technologies only where there is evidence that child abuse is already prevalent—in other words, where significant online harm has already occurred. That will necessitate the use of a laborious and resource-intensive process, with Ofcom having to build the evidence to issue CSEA warning notices company by company.

Those restrictions will mean that the Bill will be far less demanding than comparable international legislation in respect of the requirement on companies to proactively detect and remove online child abuse. So much for the Bill being world leading. For example, the EU child abuse legislative proposal published in May sets out clear and unambiguous requirements on companies to proactively scan for child abuse images and grooming behaviour on private messages.

If the regulator is unable to tackle online grooming sufficiently proactively, the impact will be disproportionately felt by girls. NSPCC data shows that an overwhelming majority of criminal offences target girls, with those aged 12 to 15 the most likely to be victims of online grooming. Girls were victims in 83% of offences where data was recorded. Labour recognises that once again there are difficulties between our fundamental right to privacy and the Bill’s intentions in keeping children safe. This probing new clause is designed to give the Government an opportunity to report on the effectiveness of their proposed approach.

Ultimately, the levels of grooming taking place on private messaging platforms are incredibly serious. I have two important testimonies that are worth placing on the record, both of which have been made anonymous to protect the victims but share the same sentiment. The first is from a girl aged 15. She said:

“I’m in a serious situation that I want to get out of. I’ve been chatting with this guy online who’s like twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to prove my trust to him, like doing video chats with my chest exposed.”

The second is from a boy aged 17. He said:

“I’ve got a fitness page on Instagram to document my progress but I get a lot of direct messages from weird people. One guy said he’d pay me a lot of money to do a private show for him. He now messages me almost every day asking for more explicit videos and I’m scared that if I don’t do what he says, then he will leak the footage and my life would be ruined”.

Those testimonies go to show how fundamentally important it is for an early assessment to be made of the effectiveness of the Government’s approach following the Bill gaining Royal Assent.

We all have concerns about the use of proactive technology in private messaging and its potential impact on personal privacy. End-to-end encryption offers both risks and benefits to the online environment, but the main concern is based on risk profiles. End-to-end encryption is particularly problematic on social networks because it is embedded in the broader functionality of the service, so all text, DMs, images and live chats could be encrypted. Consequently, its impact on detecting child abuse becomes even greater. There is an even greater risk with Meta threatening to bring in end-to-end encryption for all its services. If platforms cannot demonstrate that they can mitigate those risks to ensure a satisfactory risk profile, they should not be able to proceed with end-to-end encryption until satisfactory measures and mitigations are in place.

Tech companies have made significant efforts to frame this issue in the false binary that any legislation that impacts private messaging will damage end-to-end encryption and will mean that encryption will not work or is broken. That argument is completely false. A variety of novel technologies are emerging that could allow for continued CSAM scanning in encrypted environments while retaining the privacy benefits afforded by end-to-end encryption.

Apple, for example, has developed its NeuralHash technology, which allows for on-device scans for CSAM before a message is sent and encrypted. That client-side implementation—rather than server-side scanning—means that Apple does not learn anything about images that do not match the known CSAM database. Apple’s servers flag accounts that exceed a threshold number of images that match a known database of CSAM image hashes, so that Apple can provide relevant information to the National Center for Missing and Exploited Children. That process is secure and expressly designed to preserve user privacy.

Homomorphic encryption technology can perform image hashing on encrypted data without the need to decrypt the data. No identifying information can be extracted and no details about the encrypted image are revealed, but calculations can be performed on the encrypted data. Experts in hash scanning—including Professor Hany Farid of the University of California, Berkeley, who developed PhotoDNA—insist that scanning in end-to-end encrypted environments without damaging privacy will be possible if companies commit to providing the engineering resources to work on it.

To move beyond the argument that requiring proactive scanning for CSAM means breaking or damaging end-to-end encryption, amendments to the Bill could provide a powerful incentive for companies to invest in technology and engineering resources that will allow them to continue scanning while pressing ahead with end-to-end encryption, so that privacy is preserved but appropriate resources for and responses to online child sexual abuse can continue. It is highly unlikely that some companies will do that unless they have the explicit incentive to do so. Regulation can provide such an incentive, and I urge the Minister to make it possible.

It is a pleasure to follow the shadow Minister, who made some important points. I will focus on clause 37 stand part. I pay tribute to the Minister for his incredible work on the Bill, with which he clearly wants to stop harm occurring in the first place. We had a great debate on the matter of victim support. The Bill requires Ofcom to produce a number of codes of practice to help to achieve that important aim.

Clause 37 is clear: it requires codes of practice on illegal content and fraudulent advertising, as well as compliance with “the relevant duties”, and it is on that point that I hope the Minister can help me. Those codes will help Ofcom to take action when platforms do things that they should not, and will, I hope, provide a way for platforms to comply in the first place rather than falling foul of the rules.

How will the codes help platforms that are harbouring material or configuring their services in a way that might be explicitly or inadvertently promoting violence against women and girls? The Minister knows that women are disproportionately the targets of online abuse on social media or other platforms. The impact, which worries me as much as I am sure it worries him, is that women and girls are told to remove themselves from social media as a way to protect themselves against extremely abusive or harassing material. My concern is that the lack of a specific code to tackle those important issues might inadvertently mean that Ofcom and the platforms overlook them.

Would a violence against women and girls code of practice help to ensure that social media platforms were monitored by Ofcom for their work to prevent tech-facilitated violence against women and girls? A number of organisations think that it would, as does the Domestic Abuse Commissioner herself. Those organisations have drafted a violence against women and girls code of practice, which has been developed by an eminent group of specialists—the End Violence Against Women Coalition, Glitch, Carnegie UK Trust, the NSPCC, 5Rights, and Professors Clare McGlynn and Lorna Woods, both of whom gave evidence to us. They believe it should be mandatory for Ofcom to adopt a violence against women and girls code to ensure that this issue is taken seriously and that action is taken to prevent the risks in the first place. Clause 37 talks about codes, but it is not specific on that point, so can the Minister help us? Like the rest of the Committee, he wants to prevent women from experiencing these appalling acts online, and a code of practice could help us deal with that better.

The Government already recognise that women disproportionately experience the impact of online abuse, and they have a track record of acting. They were the first to outlaw revenge pornography, and they have introduced more laws since. I hope the Minister will put my mind at rest, and the minds of those who drew together the code that was issued late last month, by setting out how this will be undertaken by Ofcom. Will a code on this issue be pulled together, or will it be incorporated into the codes that are being developed? It is incredibly important for him to do that.

I absolutely agree with the points that have been made about the violence against women code of conduct. It is vital, and it would be a really important addition to the Bill. I associate myself with the shadow Minister’s comments, and am happy to stand alongside her.

I want to make a few comments about new clause 20 and some of the issues it raises. The new clause is incredibly important, and we need to take seriously the concerns that have been raised with us by the groups that advocate on behalf of children. They would not raise those concerns if they did not think the Bill was deficient in this area. They do not have spare people and cannot spend lots of time doing unnecessary things, so if they are raising concerns, those are very important things that will make a big difference.

I want to go a little further than what the new clause says and ask the Minister about future-proofing the Bill and ensuring that technologies can be used as they evolve. I am pretty sure that everybody agrees that there should be no space where it is safe to share child sexual exploitation and abuse, whether physical space or online space, private messaging or a more open forum. None of those places should be safe or legal. None should enable that to happen.

My particular thought about future-proofing is about the development of technologies that are able to recognise self-generated pictures, videos, livestreams and so on that have not already been categorised, do not have a hash number and are not easy for the current technologies to find. There are lots of people out there working hard to stamp out these images and videos online, and I have faith that they are developing new technologies that are able to recognise images, videos, messages and oral communications that cannot currently be recognised.

I agree wholeheartedly with the new clause: it is important that a report be produced within six months of the Bill being passed. It would be great if the Minister would commit to thinking about whether Ofcom will be able to require companies to implement new technologies that are developed, as well as the technologies that are currently available. I am not just talking about child sexual abuse images, material or videos; I am also talking about private messaging where grooming is happening. That is a separate thing that needs to be scanned for, but it is incredibly important.

Some of the stories relayed by the shadow Minister relate to conversations and grooming that happened in advance of the self-generated material being created. If there had been a proactive action to scan for grooming behaviour by those companies whose platforms the direct messaging was taking place on, then those young people would potentially have been in a safer place, because it could have been stopped in advance of that self-generated material being created. Surely, that should be the aim. It is good that we can tackle this after the event—it is good that we have something—but tackling it before it happens would be incredibly important.

Online sexual exploitation is a horrific crime, and we all want to see it ended for good. I have concerns about whether new clause 20 is saying we should open up all messaging—where is the consideration of privacy when the scanning is taking place? Forgive me, I do not know much about the technology that is available to scan for that content. I do have concerns that responsible users will suffer an infringement of their privacy, even when doing nothing of concern.

I do not know whether everybody draws the same distinction as me. For me the distinction is that, because it will be happening with proactive technology—technological means will be scanning those messages rather than humans—nobody will see the messages. Software will scan messages, and should there be anything that is illegal—should there be child sexual abuse material—that is what will be flagged and further action taken.

I am not sure whether the hon. Member for Wolverhampton North East heard my contribution, but this technology does exist, so it is possible. It is a false argument made by those who believe that impacting end-to-end encryption will limit people’s privacy. The technology does exist, and I named some that is able to scan without preventing the encryption of the data. It simply scans for those images and checks them against existing databases. It would have no impact on anybody’s right to privacy.

I thank the shadow Minister for her assistance with that intervention, which was incredibly helpful. I do not have concerns that anybody will be able to access that data. The only data that will be accessible is when the proactive technology identifies something that is illegal, so nobody can see any of the messages except for the artificial intelligence. When the AI recognises that something is abuse material, at that point the Bill specifies that it will go to the National Crime Agency if it is in relation to child abuse images.

My concern is that, at the point at which the data is sent to the National Crime Agency, it will be visible to human decision makers. I wonder whether that will stop parents sharing pictures of their babies in the bath. There are instances where people could get caught up in a very innocent situation that is deemed to be something more sinister by AI. However, I will take the advice of the hon. Member for Pontypridd and look into the technology.

In terms of the secondary processes that kick in after the AI has scanned the data, I assume it will be up to Ofcom and the provider to discuss what happens then. Once the AI identifies something, does it automatically get sent to the National Crime Agency, or does it go through a process of checking to ensure the AI has correctly identified something? I agree with what the Minister has reiterated on a number of occasions; if it is child sexual abuse material then I have no problem with somebody’s privacy being invaded in order for that to be taken to the relevant authorities and acted on.

I want to make one last point. The wording of new clause 20 is about a report on those proactive technologies. It is about requiring Ofcom to come up with and justify the use of those proactive technologies. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will be able to—but it will have to justify it in order to be able to require those companies to undertake that use.

My key point is about the future-proofing of this, ensuring that it is not just a one-off, and that, if Ofcom makes a designation about the use of proactive technologies, it is able to make a re-designation or future designation, should new proactive technologies come through, so that we can require those new proactive technologies to be used to identify things that we cannot identify with the current proactive technologies.

I want to associate myself with the comments of the right hon. Member for Basingstoke and the hon. Member for Aberdeen North, and to explore the intersection between the work we are doing to protect children and the violence against women and girls strategy. There is one group, girls, to whom both apply. We know that they are sadly one of the most vulnerable groups for online harm and abuse, and we must do everything we can to protect them. Having a belt and braces approach, with a code of conduct requirement for the violence against women and girls strategy, plus implementing new clause 20 on this technology that can protect girls in particular, although not exclusively, is a positive thing. Surely the more thorough we are in our preventive approach, rather than taking action after it is too late, the better.

I agree 100%. The case that the shadow Minister, the hon. Member for Pontypridd, made and the stories she highlighted about the shame that is felt show that we are not just talking about a one-off impact on people’s lives, but potentially years of going through those awful situations and then many years to recover, if they ever do, from the situations they have been through.

I do not think there is too much that we could do, too many codes of practice we could require or too many compliances we should have in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that this Bill is as fit for purpose as it can be and meets the Government’s aim of trying to make the internet a safe place for children and young people. Because of the additional risks that there are for girls in particular, we need additional protections in place for girls. That is why a number of us in this room are making that case.

This has been an important debate. I think there is unanimity on the objectives we are seeking to achieve, particularly protecting children from the risk of child sexual exploitation and abuse. As we have discussed two or three times already, we cannot allow end-to-end encryption to frustrate or prevent the protection of children.

I will talk about two or three of the issues that have arisen in the course of the debate. The first is new clause 20, a proposal requiring Ofcom to put together a report. I do not think that is strictly necessary, because the Bill already imposes a requirement to identify, assess and mitigate CSEA. There is no optionality here and no need to think about it; there is already a demand to prevent CSEA content, and Ofcom has to produce codes of practice explaining how it will do that. I think what is requested in new clause 20 is required already.

The hon. Member for Pontypridd mentioned the concern that Ofcom had to first of all prove that the CSEA risk existed. I think that might be a hangover from the previous draft of the Bill, where there was a requirement for the evidence to be “persistent and prevalent”—I think that might have been the phrase—which implied that Ofcom had to first prove that it existed before it could take action against it. So, for exactly the reason she mentioned, that it imposed a requirement to prove CSEA is there, we have changed the wording in the new version. Clause 103(1), at the top of page 87, instead of “persistent and prevalent”, now states “necessary and proportionate”. Therefore, if Ofcom simply considers something necessary, without needing to prove that it is persistent and prevalent—just if it thinks it is necessary—it can take the actions set out in that clause. For the reason that she mentioned, the change has been made already.

That brings me on to the powers in clause 103, which are extremely relevant—I apologise for speaking to that clause, Ms Rees, which we will come to later. That clause contains powers for Ofcom to direct the use of accredited technologies to ensure that CSEA is being scanned for. I have two points to make. First, on the question of whether the technology exists to scan inside an end-to-end encrypted environment, the advice that I have received so far is that, as the shadow Minister said, although it is getting close and is likely to be accomplished in the relatively near future, as of today it is not there. That is worth saying for the record.

Secondly, on the question of the hon. Member for Aberdeen North about whether that can keep up to date with future technology moves—an important question, because this technology will change almost month to month, and certainly year to year—in that context it is worth referring to the definition of “accredited” technology. If my memory is correct, that is to be found in clause 105(9) and (10), on page 90. In essence, those two subsections state that Ofcom may update accreditation whenever it feels that to be necessary—that can be at any time; it is not one-off. Indeed, Ofcom may appoint some other person or body to do the accreditation if it feels that it does not have the expertise itself. The concept of accredited technology is live; it can be updated the whole time.

Given that we are on the topic, however, we are still thinking—this is so important, and the hon. Member for Aberdeen North has rightly raised it two or three times—about whether there are ways to strengthen clause 103 further, to provide even more clear and powerful powers to act in this area. If we can think of ways to do that, or if anyone else can suggest one, we are receptive to that thinking. The reason—as I gave in answer to the hon. Lady two or three times—is that, as far as I am concerned, there can be no compromise when scanning for CSEA content.

We then come to the question of the risk assessments and the codes of practice, to ensure that all the relevant groups get covered and that no one gets forgotten—this brings me back to clause 37, you will be pleased to hear, Ms Rees. Subsection (3), which appears towards the bottom of page 35, states on lines 31 to 33:

“OFCOM must prepare and issue one or more codes of practice for providers of Part 3 services describing measures recommended for the purpose of compliance with the relevant duties”.

What are those relevant duties? The relevant duties are, mercifully, defined at the bottom of the following page, page 36, in subsection (10), which sets out what we mean, and the most important for protecting people are paragraphs (a), (b) and (c): anything that is illegal, anything that concerns the safety of children, and matters concerning the safety of adults, respectively. There is no risk that those very important topics can somehow get forgotten.

I hope that clarifies how the Bill operates. As I said, we are giving careful thought to finding ways—which I hope we can—to strengthen those powers in clause 103.

I think my hon. Friend’s list goes on to page 37, which means there would be a number of different relevant duties that would presumably then be subject to the ability to issue codes of practice. However, the point I was making in my earlier contribution is that this list does not include the issue of violence against women and girls. In looking at this exhaustive list that my hon. Friend has included in the Bill, I must ask whether he might inadvertently be excluding the opportunity for Ofcom to produce a code of practice on the issue of violence against women and girls. Having heard his earlier comments, I felt that he was slightly sympathetic to that idea.

Clearly, and as Members have pointed out, women and girls suffer disproportionately from abuse online; unfortunately, tragically and disgracefully, they are disproportionately victims of such abuse. The duties in the Bill obviously apply to everybody—men and women—but women will obviously disproportionately benefit, because they are disproportionately victims.

Obviously, where there are things that are particular to women, such as particular kinds of abuse that women suffer that men do not, or particular kinds of abuse that girls suffer that boys do not, then we would expect the codes of practice to address those kinds of abuse, because the Bill states that they must keep children safe, in clause 37(10)(b), and adults safe, in clause 37(10)(c). Obviously, women are adults and we would expect those particular issues that my right hon. Friend mentioned to get picked up by those measures.

My hon. Friend is giving me a chink of light there, in that subsection (10)(c) could actively mean that a code of practice that specifically dealt with violence against women and girls would be admissible as a result of that particular point. I had not really thought of it in that way—am I thinking about it correctly?

My right hon. Friend makes an interesting point. To avoid answering a complicated question off the cuff, perhaps I should write to her. However, I certainly see no prohibition in these words in the clause that would prevent Ofcom from writing a particular code of practice. I would interpret these words in that way, but I should probably come back to her in writing, just in case I am making a mistake.

As I say, I interpret those words as giving Ofcom the latitude, if it chose to do so, to have codes of practice that were specific. I would not see this clause as prescriptive, in the sense that if Ofcom wanted to produce a number of codes of practice under the heading of “adults”, it could do so. In fact, if we track back to clause 37(3), that says:

“OFCOM must prepare and issue one or more codes of practice”.

That would appear to admit the possibility that multiple codes of practice could be produced under each of the sub-headings, including in this case for adults and in the previous case for children. [Interruption.] I have also received some indication from officials that I was right in my assessment, so hopefully that is the confirmation that my right hon. Friend was looking for.

Question put and agreed to.

Clause 37 accordingly ordered to stand part of the Bill.

Clause 38 ordered to stand part of the Bill.

Schedule 4

Codes of practice under section 37: principles, objectives, content

Amendment proposed: 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.—(Alex Davies-Jones.)

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Question put, That the amendment be made.

Amendment proposed: 64, in schedule 4, page 177, line 4, at end insert “and

(vii) the systems and process are appropriate to detect cruelty towards humans and animals;”—(Alex Davies-Jones.)

This amendment would ensure that ensuring systems and processes are appropriate to detect cruelty towards humans and animals is one of the online safety objectives for search services.

Question put, That the amendment be made.

Schedule 4 agreed to.

Clause 39

Procedure for issuing codes of practice

Before we begin the next debate, does anyone wish to speak to Carla Lockhart’s amendment 97? If so, it will be debated as part of this group; otherwise, it will not be selected. The amendment is not selected.

I beg to move amendment 48, in clause 39, page 37, line 17, at beginning insert—

“(A1) OFCOM must prepare the draft codes of practice required under section 37 within the period of six months beginning with the day on which this Act is passed.”

This amendment requires Ofcom to prepare draft codes of practice within six months of the passing of the Act.

With this it will be convenient to discuss the following:

Clause stand part.

Clauses 42 to 47 stand part.

This is a mammoth part of the Bill, and I rise to speak to clause 39. Under the clause, Ofcom will submit a draft code of practice to the Secretary of State and, provided that the Secretary of State does not intend to issue a direction to Ofcom under clause 40, the Secretary of State would lay the draft code before Parliament. Labour’s main concern about the procedure for issuing codes of practice is that, without a deadline, they may not come into force for quite some time, and the online space needs addressing now. We have already waited far too long for the Government to bring forward the Bill. Parliamentary oversight is also fundamentally important, and the codes will have huge implications for the steps that service providers take, so it is vital that they are given due diligence at the earliest opportunity.

Amendment 48 would require Ofcom to prepare draft codes of practice within six months of the passing of the Act. This simple amendment would require Ofcom to bring forward these important codes of practice within an established time period—six months—after the Bill receives Royal Assent. Labour recognises the challenges ahead for Ofcom in both capacity and funding.

On this note, I must raise with the Minister something that I have raised previously. I find it most curious that his Department recently sought to hire an online safety regulator funding policy adviser. The job advert listed some of the key responsibilities:

“The post holder will support ministers during passage of the Online Safety Bill; secure the necessary funding for Ofcom and DCMS in order to set up the Online Safety regulator; and help implement and deliver a funding regime which is first of its kind in the UK.”

That raises worrying questions about how prepared Ofcom is for the huge task ahead. That being said, the Government have drafted the Bill in a way that brings codes of practice to its heart, so they cannot and should not be susceptible to delay.

The hon. Lady is very kind in giving way—I was twitching to stand up. On the preparedness of Ofcom and its resources, Ofcom was given about £88 million in last year’s spending review to cover this and the next financial year—2022-23 and 2023-24—so that it could get ready. Thereafter, Ofcom will fund itself by raising fees, and I believe that the policy adviser will most likely advise on supporting the work on future fees. That does not imply that there will be any delay, because the funding for this year and next year has already been provided by the Government.

I appreciate that intervention, but the Minister must be aware that if Ofcom has to fundraise itself, that raises questions about its future capability as a regulator and its funding and resource requirements. What will happen if it does not raise those funds?

The hon. Lady’s use of the word “fundraise” implies that Ofcom will be going around with a collection tin on a voluntary basis.

I will find the relevant clause in a moment. The Bill gives Ofcom the legal power to make the regulated companies pay fees to finance Ofcom’s regulatory work. It is not voluntary; it is compulsory.

I am grateful to the Minister for that clarification. Perhaps he should make that more obvious in the job requirements and responsibilities.

The fees requirements are in clauses 70 to 76, in particular clause 71, “Duty to pay fees”. The regulated companies have to pay the fees to Ofcom. It is not optional.

I am grateful to the Minister for that clarification.

The Government have drafted the Bill in a way that puts codes of practice at its heart, so they cannot and should not be susceptible to delay. We have heard from platforms and services that stress that the ambiguity of the requirements is causing concern. At least with a deadline for draft codes of practice, those that want to do the right thing will be able to get on with it in a timely manner.

The Age Verification Providers Association provided us with evidence in support of amendment 48 in advance of today’s sitting. The association agrees that early publication of the codes will set the pace for implementation, encouraging both the Secretary of State and Parliament to approve the codes swiftly. A case study it shared highlights delays in the system, which we fear will be replicated within the online space, too. Let me indulge Members with details of exactly how slow Ofcom’s recent record has been on delivering similar guidance required under the audio-visual media services directive.

The directive became UK law on 30 September 2020 and came into force on 1 November 2020. By 24 June 2021, Ofcom had issued a note as to which video sharing platforms were in scope. It took almost a year until, on 6 October 2021, Ofcom issued formal guidance on the measures.

In December 2021, Ofcom wrote to the verification service providers and

“signalled the beginning of a new phase of supervisory engagement”.

However, in March 2022 it announced that

“the information we collect will inform our Autumn 2022 VSP report, which intends to increase the public’s awareness of the measures platforms have in place to protect users from harm.”

There is still no indication that Ofcom intends to take enforcement action against the many VSPs that remain non-compliant with the directive. It is simply not good enough. I urge the Minister to carefully consider the aims of amendment 48 and to support it.

Labour supports the principles of clause 42. Ofcom must not drag out the process of publishing or amending the codes of practice. Labour also supports a level of transparency around the withdrawal of codes of practice, should that arise.

Labour also supports clause 43 and the principles of ensuring that Ofcom has a requirement to review its codes of practice. We do, however, have concerns over the Secretary of State’s powers in subsection (6). It is absolutely right that the Secretary of State of the day has the ability to make representations to Ofcom in order to prevent the disclosure of certain matters in the interests of national security, public safety or relations with the Government of a country outside the UK. However, I am keen to hear the Minister’s assurances about how well the Bill is drafted to prevent those powers from being used, shall we say, inappropriately. I hope he can address those concerns.

On clause 44, Ofcom should of course be able to propose minor amendments to its codes of practice. Labour does, however, have concerns about the assessment that Ofcom will have to make to ensure that the minor nature of changes will not require amendments to be laid before Parliament, as in subsection (1). As I have said previously, scrutiny must be at the heart of the Bill, so I am interested to hear from the Minister how exactly he will ensure that Ofcom is making appropriate decisions about what sorts of changes are allowed to circumvent parliamentary scrutiny. We cannot and must not get to a place where the Secretary of State, in agreeing to proposed amendments, actively prevents scrutiny from taking place. I am keen to hear assurances on that point from the Minister.

On clause 45, as I mentioned previously on amendment 65 to clause 37, as it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in subsection (1). However, providers could take alternative measures to comply, as outlined in subsection (5). Labour supports the clause in principle, but we are concerned that the definition of alternative measures is too broad. I would be grateful if the Minister could elaborate on his assessment of the instances in which a service provider may seek to comply via alternative measures. Surely the codes of practice should be, for want of a better phrase, best practice. None of us want to get into a position where service providers are circumnavigating their duties by taking the alternative measures route.

Again, Labour supports clause 46 in principle, but we feel that the provisions in subsection (1) could go further. We know that, historically, service providers have not always been transparent and forthcoming when compelled to be so by the courts. While we understand the reasoning behind subsection (3), we have broader concerns that service providers could, in theory, lean on their codes of practice as highlighting their best practice. I would be grateful if the Minister could address our concerns.

We support clause 47, which establishes that the duties in respect of which Ofcom must issue a code of practice under clause 37 will apply only once the first code of practice for that duty has come into force. However, we are concerned that this could mean that different duties will apply at different times, depending on when the relevant code for a particular duty comes into force. Will the Minister explain his assessment of how that will work in practice? We have concerns that drip feeding this information to service providers will cause further delay and confusion. In addition, will the Minister confirm how Ofcom will prioritise its codes of practice?

Lastly, we know that violence against women and girls has not a single mention in the Bill, which is an alarming and stark omission. Women and girls are disproportionately likely to be affected by online abuse and harassment. The Minister knows this—we all know this—and a number of us have spoken up on the issue on quite a few occasions. He also knows that online violence against women and girls is defined as including, but not limited to, intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive sexting and the creation and sharing of deepfake pornography.

The Minister will also know that Carnegie UK is working with the End Violence Against Women coalition to draw up what a code of practice to tackle violence against women and girls could look like. Why has that been left out of the redraft of the Bill? What consideration has the Minister given to including a code of this nature in the Bill? If the Minister is truly committed to tackling violence against women and girls, why will he not put that on the face of the Bill?

I have a quick question about timelines because I am slightly confused about the order in which everything will happen. It is unlikely that the Bill will have been through the full parliamentary process before the summer, yet Ofcom intends to publish information and guidance by the summer, even though some things, such as the codes of practice, will not come in until after the Bill has received Royal Assent. Will the Minister give a commitment that, whether or not the Bill has gone through the whole parliamentary process, Ofcom will be able to publish before the summer?

Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.

I have three short questions for the Minister about clause 40 and the Secretary of State’s powers of direction. Am I in order to cover that?

I will do my best to make sure that we come to it very quickly indeed, by being concise in my replies on this group of amendments.

On amendment 48, which seeks to get Ofcom to produce its codes of practice within six months, obviously we are unanimous in wanting that to be done as quickly as possible. However, Ofcom has to go through a number of steps in order to produce those codes of practice. For example, first we have to designate in secondary legislation the priority categories of content that is harmful to children and content that is harmful to adults, and then Ofcom has to go through a consultation exercise before it publishes the codes. It has in the past indicated that it expects that to be a 12-month, rather than a six-month, process. I am concerned that a hard, six-month deadline may be either impossible to meet or make Ofcom rush and do it in a bad way. I accept the need to get this done quickly, for all the obvious reasons, but we also want to make sure that it is done right. For those reasons, a hard, six-month deadline would not help us very much.

Why does the Minister believe that six months is out of scope? Does he think that Ofcom is not adequately resourced to meet that deadline and make it happen as soon as possible?

There are a number of steps to go through. Regardless of how well resourced Ofcom is and how fast it works, first, we have to designate the priority categories by secondary legislation, and there is a lead time for that. Secondly, Ofcom has to consult. Best practice suggests that consultations need to last for a certain period, because the consultation needs to be written, then it needs to open, and then the responses need to be analysed. Then, Ofcom obviously has to write the codes of practice. It might be counterproductive to set a deadline that tight.

There are quite a few different codes of practice to publish, and the hon. Lady asked about that. The ones listed in clause 47 will not all come out at the same time; they will be staggered and prioritised. Obviously, the ones that are most germane to safety, such as those on illegal content and children’s safety, will be done first. We would expect them to be done as a matter of extreme urgency.

I hope I have partly answered some of the questions that the hon. Member for Aberdeen North asked. The document to be published before the summer, which she asked about, is a road map. I understand it to be a sort of timetable that will set out the plan for doing everything we have just been debating—when the consultations will happen and when the codes of practice will be published. I guess we will get the road map in the next few weeks, if “before the summer” means before the summer recess. We will have all that set out for us, and then the formal process follows Royal Assent. I hope that answers the hon. Lady’s question.

There were one or two other questions from the hon. Member for Pontypridd. She asked whether a Secretary of State might misuse the power in clause 43(2)—a shocking suggestion, obviously. The power is only to request a review; it is nothing more sinister or onerous than that.

On clause 44, the hon. Lady asked what would happen if Ofcom and the Secretary of State between them—it would require both—conspired to allow through a change claiming it is minor when in fact it is not minor. First, it would require both of them to do that. It requires Ofcom to propose it and the Secretary of State to agree it, so I hope the fact that it is not the Secretary of State acting alone gives her some assurance. She asked what the redress is if both the Secretary of State and Ofcom misbehave, as it were. Well, the redress is the same as with any mis-exercise of a public power—namely, judicial review, which, as a former Home Office Minister, I have experienced extremely frequently—so there is legal redress.

The hon. Lady then asked about the alternative measures. What if a service provider, rather than meeting its duties via the codes of practice, does one of the alternative measures instead? Is it somehow wriggling out of what it is supposed to do? The thing that is legally binding, which it must do and about which there is no choice because there is a legal duty, is the duties that we have been debating over the past few days. Those are the binding requirements that cannot be circumvented. The codes of practice propose a way of meeting those. If the service provider can meet the duties in a different way and can satisfy Ofcom that it has met those duties as effectively as it would under the codes of practice, it is open to doing that. We do not want to be unduly prescriptive. The test is: have the duties been delivered? That is non-negotiable and legally binding.

I hope I have answered all the questions, while gently resisting amendment 48 and encouraging the Committee to agree that the various other clauses stand part of the Bill.

Question put, That the amendment be made.

The Committee divided.

Clause 39 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Adjourned till Tuesday 14 June at twenty-five minutes past Nine o’clock.

Written evidence reported to the House

OSB61 Badger Trust

OSB62 Lego

OSB63 End Violence Against Women Coalition (EVAW)

OSB64 Hacked Off Campaign (further submission) (re: clause 50)

OSB65 Office of the City Remembrancer, on behalf of the City of London Corporation and City of London Police

OSB66 Juul Labs

OSB67 Big Brother Watch, ARTICLE 19, Open Rights Group, Index on Censorship, and Global Partners Digital

OSB68 News Media Association (supplementary submission)

Public Order Bill (First sitting)

The Committee consisted of the following Members:

Chairs: Peter Dowd, † David Mundell

† Anderson, Lee (Ashfield) (Con)

† Bridgen, Andrew (North West Leicestershire) (Con)

† Chamberlain, Wendy (North East Fife) (LD)

Cunningham, Alex (Stockton North) (Lab)

Doyle-Price, Jackie (Thurrock) (Con)

† Elmore, Chris (Ogmore) (Lab)

† Elphicke, Mrs Natalie (Dover) (Con)

† Hunt, Tom (Ipswich) (Con)

† Huq, Dr Rupa (Ealing Central and Acton) (Lab)

† Jones, Sarah (Croydon Central) (Lab)

Longhi, Marco (Dudley North) (Con)

† McCarthy, Kerry (Bristol East) (Lab)

† McLaughlin, Anne (Glasgow North East) (SNP)

† Malthouse, Kit (Minister for Crime and Policing)

† Mann, Scott (North Cornwall) (Con)

† Mohindra, Mr Gagan (South West Hertfordshire) (Con)

† Vickers, Matt (Stockton South) (Con)

Anne-Marie Griffiths, Sarah Thatcher, Committee Clerks

† attended the Committee

Witnesses

Chief Constable Chris Noble, Lead for Protests, National Police Chiefs’ Council

John Groves, Chief Security and Resilience Officer, High Speed 2 Limited

Nicola Bell, Regional Director, South East, National Highways

Public Bill Committee

Thursday 9 June 2022

(Morning)

[David Mundell in the Chair]

Public Order Bill

I have a few preliminary announcements. Hansard colleagues would be grateful if Members could email their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent. Tea and coffee are not allowed during sittings.

We will consider the programme motion on the amendment paper. We will then consider a motion to enable the reporting of written evidence for publication, and a motion to allow us to deliberate in private about questions between the oral evidence sessions. In view of the time available, I hope that we can take these matters formally, without debate. I call the Minister to move the programme motion standing in his name, which was discussed on Tuesday 7 June by the Programming Sub-Committee for this Bill.

Ordered,

That—

(1) the Committee shall (in addition to its first meeting at 11.30 am on Thursday 9 June) meet—

(a) at 2.00 pm on Thursday 9 June;

(b) at 9.25 am and 2.00 pm on Tuesday 14 June;

(c) at 11.30 am and 2.00 pm on Thursday 16 June;

(d) at 9.25 am and 2.00 pm on Tuesday 21 June;

(2) the Committee shall hear oral evidence in accordance with the following Table:

Date

Time

Witness

Thursday 9 June

Until no later than 12.15 pm

The National Police Chiefs’ Council

Thursday 9 June

Until no later than 1.00 pm

High Speed 2 (HS2) Limited; National Highways

Thursday 9 June

Until no later than 2.45 pm

United Kingdom Petroleum Industry Association;

Thursday 9 June

Until no later than 3.05 pm

Adam Wagner, Doughty Street Chambers

Thursday 9 June

Until no later than 3.25 pm

News UK

Thursday 9 June

Until no later than 4.10 pm

Sir Peter Martin Fahy QPM, retired police officer; Matt Parr CB, HM Inspector of Constabulary and HM Inspector of Fire and Rescue Services; Chief Superintendent Phil Dolby, West Midlands Police

Thursday 9 June

Until no later than 4.55 pm

Amnesty International; Justice; Liberty

(3) the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Tuesday 21 June.—(Kit Malthouse.)

The Committee will proceed to line-by-line consideration of the Bill on Tuesday 14 June at 9.25 am.

Resolved,

That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Kit Malthouse.)

Copies of written evidence that the Committee receives will be made available in the Committee room and will be circulated to Members by email.

Resolved,

That, at this and any subsequent meeting at which oral evidence is to be heard, the Committee shall sit in private until the witnesses are admitted.—(Kit Malthouse.)

The Committee deliberated in private.

Examination of Witness

Chief Constable Chris Noble gave evidence.

We are now sitting in public again and the proceedings are being broadcast. Before we start hearing from the witnesses, do any Members wish to make declarations of interest in connection with the Bill? No, I take it. We will now hear oral evidence from Chief Constable Chris Noble, lead for protest on the National Police Chiefs’ Council, who is joining us via Zoom. I remind Members that questions should be limited to matters within the scope of the Bill, and that we must stick to the timings in the programme motion. The Committee has agreed that, for this session, we have until 12.15 pm. Can the witness please introduce themselves for the record?

Chris Noble: Good morning, Chair. My name is Chris Noble. I am the chief constable of Staffordshire Police.

Thank you, Mr Noble. If, at any time, you have any difficulty in hearing the questions, please indicate and we will make the necessary technical adjustments.

Q Good morning, chief. Thank you very much for joining us. At the outset, can you outline the current protest situation, and changes in protesters’ tactics over the past three or four years, from your experience? The Bill is responding to those changes in tactics, so it would be helpful for the Committee if you could outline what they are. Also, can you talk about your experience of the disruption caused and the challenges faced on safety grounds, and say what the cost to policing has been over the last couple of years?

Chris Noble: Thank you, Minister. There is a lot, in terms of looking back. There have been a number of trends. We have seen global causes land on our shores very quickly and having significant impacts. Black Lives Matter is a good example. We have seen causes overlapping, both in terms of membership and tactics. There have been some very novel—without giving them any credit—and highly disruptive tactics; that is reflected on the contents page of the Bill. If we look across the breadth of protest organisations and groups, we see that they are very aware of some of the legal gaps, inadequacies and shortcomings; that is very clear from their engagement with police, as well as their tactics. There is a focus, albeit not exclusively, around what we would call non-violent direct action, which is slightly different from previous protest phases, where violence was maybe more commonplace. That said, it is not completely exclusively non-violent.

Most protests are still relatively non-contentious. However, in terms of complexity, intensity and tactics, there has been a step up, and the assessment going forward is very clear that we will still see those challenges around complexity and the co-ordination and the adapting of protests, and we have significant gaps around our information and intelligence. Even though we will have our own, home-grown causes that people will wish to protest against, I anticipate that a lot of protest will potentially be generated from outside these shores. That is a little bit of the picture on what has been, and what may well be to come.

On impacts, there are safety challenges across the board, including safety risks to some of the protestors, challenges to members of the community on our roads or, indeed, in their communities, and challenges for police officers and private contractors in dealing safely with tactics that we will perhaps talk about. Also, there may be increasing cost as we try to deal with more complex issues—costs either to communities, the businesses impacted, or indeed the police, be it financial or opportunity cost, in terms of officers not being able to work in neighbourhoods, or in serious and organised crime, or in the other roles on which they clearly want to be focused. Those are real challenges, but still, the backdrop is that the vast majority of protest activity is relatively non-contentious. However, there is a hard core, a small element, that I do not see going away any time soon.

Q One form of protest that we have seen recently is locking on—people glue themselves to motorways or fuel depots and fuel gantries. Could you illustrate some of the dangers that that may present, particularly in a fuel environment? What steps do officers have to take to deal with that kind of protest?

Chris Noble: In Staffordshire, we have a very experienced protest removal team, and on occasion they have dealt with individuals glued to the top of fuel tankers by cutting them loose, using cutting equipment. There are obvious risks in that. Equally, if you go on to a busy motorway and glue yourself to it, there is a raft of risks from traffic, and risk to police officers. Understandably, we have seen members of the public, through sheer frustration, look to take matters into their own hands. You can translate that to power stations and other vulnerable sites. Although this may be attention-grabbing and headline-grabbing, the risks to the protestors, the police and members of the public are becoming ever more significant.

Q Under current legislation, one of the challenges that you obviously face in looking after protest is balancing the right to protest against the right of others to go about their business. Could you explain to us the training that a police officer has to go through in order to appreciate those balances, and how the judgments are made? What training is there around the danger presented to protesters, officers or the general public in protest situations? Does that colour the picture, when it comes to the conditions that may be put on a protest?

Chris Noble: There is quite a disciplined training regime. The training is licensed through the College of Policing. You have command training at what we call gold, silver and bronze levels. The strategists—those who develop a plan—are at the silver level; those who carry it out on the ground are at the bronze level. There is not only initial very intense and comprehensive training for those individuals, but annual continual professional development, which is annotated and logged. There is also re-accreditation to ensure that people are still fit for operation. There are also annual inputs on what has changed—training on new legislation, new powers, learning from court cases, different protest tactics and emerging risks—so there is a continual learning cycle, as well as a very detailed pass-or-fail approach to training.

This week, we had an early morning dial-in with the vast majority of gold commanders across the country to break out some peer learning around Just Stop Oil. It was about what we could do differently, and how we could learn. There are specialist teams in policing that share information and liaise with the Health and Safety Executive and other bodies on how we do our very best to minimise danger to protesters, the wider public and police officers.

The challenge for policing is that training is at one point in time, and tactics and intentions are constantly moving. There is a constant challenge in making police training fit for purpose. The one thing that stays consistent—you alluded to this—is the police commitment to striking the balance between our positive and negative obligations to protest, and our ongoing responsibility to those impacted by protest.

Q Obviously, a significant amount of effort and capacity goes into this work. A final question from me: do you think the police would benefit from more pre-emptive powers to prevent some of these more dangerous protests and get ahead of them? As you know, the Bill allows the police to do that.

Chris Noble: In short, yes, we would. You have already partly qualified that. For us, the more intrusive our tactics, the more they need to be focused on the harm being caused. In our approach, there has to be a constant test of what is proportionate, and that is subject to significant internal and external scrutiny.

We can see greater risk of harm to communities and protesters if things are left to run. An example was the G7 operation. I was speaking to one of the senior commanders recently, and they described a lack of powers around stop and search for people with items that could only have been used for generating a lock-on device. They had to intervene later in the day, with more significant powers, on a wider group of protesters, therefore interfering with more people’s rights. As long as early intervention and prevention are subject to proportionality tests, and are applied precisely, they are preferable to some of the risks that protesters place themselves under, and some of the significant disruption that they cause to other individuals.

Q Thank you for giving evidence to us today. Could you talk us through some of the powers that you already have to disrupt protests? Can you give us recent examples of when you have used them?

Chris Noble: Sadly, I am no longer a practising operational commander, so I will talk vicariously. You also have Phil Dolby coming to speak to you. He will be able to give you a flavour of the west midlands region. There is a range of powers, but the policing operation begins with communication and engagement. As soon as we are aware of a protest, the first thing we will do is link in with the organisers and understand how we can do our very best to minimise any intrusion on their rights and safeguard the right to protest. Our most powerful tactic is engagement and communication.

Very, very rarely will we ever ban a protest. We hear the lazy soundbite at times that police are looking to ban protests. It has not happened in many years. Even when we apply conditions under sections 12 and 14 of the Public Order Act 1986, which were the subject of the Police, Crime, Sentencing and Courts Act 2022, their usage is limited. We will record those. They are tested, and they are very often subject to court testing as well.

Then we have a range of other powers, depending on the level of criminality or risk that we identify in the protest. We are able to seize items and search properties, but that would be under a plethora of legislation and would be very specific to what we know in advance. In current protests, we often know little until something presents, or until very close to the event time. We have a range of powers, but they are not particularly coherent in the light of what is often a very poor line of sight around protest activity.

Q Can you talk us through some of your powers that have been used for arresting and charging protesters—for instance, aggravated trespass, criminal damage and obstructing a highway?

Chris Noble: Yes. I will take the example of obstructing the highway; those powers have recently been adjusted. With Insulate Britain and some of the obstruction of the M25 motorway, we were dealing with legislation that was drafted without those tactics or activities in mind. The powers are relatively low level, in terms of consequences; individuals who were arrested could be back on the scene the next day. The capability of some of those powers to deal with repeat protest or reckless protest is very limited, and I think a significant number of the protesters were very aware of that.

On criminal damage, there are opportunities, through those powers, for us to intervene where people are carrying specified items and going equipped to commit criminal damage. Aggravated trespass, which you alluded to, is particularly relevant. In the private space, there is no right to protest in anything like the way that there is in the public space. That is just a flavour of a number of the offences that most commonly come into play in protest. There are others that are perhaps a little more rare, including conspiracy to commit various offences.

Q Can you talk us through injunctions and how the police work through somebody getting an injunction? How does that operate?

Chris Noble: We have tried to make an assessment about the impact of injunctions, especially around Insulate Britain and Just Stop Oil. The feedback we have had is that when they are appropriately framed and developed at an appropriate pace, they can be very useful in terms of what we are trying to control and how we are trying to shape people’s behaviour. I think, in general though, while they are a key tool, they are not the only one we need.

We have worked hard with private industry to give them information and knowledge about injunctions. I have worked closely with an industry on my own patch that is very up for taking on the responsibility alongside the police service for trying to target-harden and prevent protest. On occasions, they will then look to obtain injunctions in terms of trying to prevent harm from being caused to their business, property and employees. Injunctions have been used increasingly frequently, but the challenge is framing them appropriately and securing them within a reasonable timescale so they can have maximum impact.

Q Is the timescale a frustration? Do they take longer than you would want them to?

Chris Noble: Yes.

Q Obviously, this Bill was first introduced last year as amendments to the Police, Crime, Sentencing and Courts Act 2022 in the Lords. Can you talk us through the consultation the Government have done on policing, both when the amendments were introduced in the Lords and now with this separate Bill?

Chris Noble: Again, this is slightly outside my corporate memory, but there have been very lengthy conversations as far back as 2019 with policing, in terms of the public order and public safety portfolios, about the adequacy of some of the powers. That refined itself down into some further conversations around some bespoke powers, many of which appear in the Act you have just referred to.

There is an ongoing conversation around policy in terms of public order and public safety. For example, in some of the Just Stop Oil protests we have seen a cross-departmental approach. The police were clear in identifying where they see some inadequacies and in the effects that they want to achieve. In many ways, there is a rolling conversation around public policy, some of which will translate into legislation at one point or another.

Q Back in 2019, Matt Parr did a big piece of work with Her Majesty’s inspectorate of constabulary and fire & rescue services. Some of the aspects we are looking at today were debated and he thought about them, but many aspects were not part of that original process whereby he went out to colleagues to ask various questions that the Government ha