
Online Safety Bill (Fifteenth sitting)

Debated on Thursday 23 June 2022

The Committee consisted of the following Members:

Chairs: † Sir Roger Gale, Christina Rees

† Ansell, Caroline (Eastbourne) (Con)

† Bailey, Shaun (West Bromwich West) (Con)

† Blackman, Kirsty (Aberdeen North) (SNP)

† Carden, Dan (Liverpool, Walton) (Lab)

† Davies-Jones, Alex (Pontypridd) (Lab)

† Double, Steve (St Austell and Newquay) (Con)

† Fletcher, Nick (Don Valley) (Con)

† Holden, Mr Richard (North West Durham) (Con)

† Keeley, Barbara (Worsley and Eccles South) (Lab)

† Leadbeater, Kim (Batley and Spen) (Lab)

† Miller, Dame Maria (Basingstoke) (Con)

† Mishra, Navendu (Stockport) (Lab)

† Moore, Damien (Southport) (Con)

Nicolson, John (Ochil and South Perthshire) (SNP)

† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

Russell, Dean (Watford) (Con)

† Stevenson, Jane (Wolverhampton North East) (Con)

Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks

† attended the Committee

Public Bill Committee

Thursday 23 June 2022

[Sir Roger Gale in the Chair]

Online Safety Bill

Good morning, ladies and gentlemen. Please ensure your phones are switched to silent.

Clause 168

Publication by OFCOM

Question proposed, That the clause stand part of the Bill.

It is a pleasure to serve under your chairmanship, Sir Roger. Clause 168 is a very short and straightforward clause. Ofcom will be required to publish a variety of documents under the Online Safety Bill. The clause simply requires that this be done in a way that is appropriate and likely to bring it to the attention of any audience who are going to be affected by it. Ofcom is already familiar with this type of statutory obligation through existing legislation, such as the Digital Economy Act 2017, which places similar obligations on Ofcom. Ofcom is well versed in publishing documents in a way that is publicly accessible. Clause 168 puts the obligation on to a clear statutory footing.

As the Minister said, clause 168 rightly sets out that the material the Bill requires Ofcom to publish must be presented in a way that will bring it to the attention of any audience likely to be affected by it. It will be important that all the guidance is published in a way that is easily available and accessible, including for people who are not neurotypical, or experience digital exclusion. I think we would all agree, after the work we have done on the Bill, that the subjects are complex and the landscape is difficult to understand. I hope Ofcom will make its documents as accessible as possible.

Question put and agreed to.

Clause 168 accordingly ordered to stand part of the Bill.

Clause 169

Service of notices

Question proposed, That the clause stand part of the Bill.

Clause 169 sets out the process for the service of any notice under the Bill, including notices to deal with child sexual exploitation and abuse or terrorism content, information notices, enforcement notices, penalty notices and public statement notices to providers of regulated services both within and outside the United Kingdom. The clause sets out that Ofcom may give a notice to a person by handing it to them, leaving it at the person’s last known address, sending it by post to that address or sending it by email to the person’s email address. It provides clarity regarding who Ofcom must give notice to in respect of different structures. For example, notice may be given to an officer of a body corporate.

As the Minister said, clause 169 sets out the process of issuing notices or decisions by Ofcom. It mostly includes provisions about how Ofcom is to contact the company, which seem reasonable. The Opposition do not oppose clause 169.

Question put and agreed to.

Clause 169 accordingly ordered to stand part of the Bill.

Clause 170

Repeal of Part 4B of the Communications Act

Question proposed, That the clause stand part of the Bill.

Clause 170 repeals the video-sharing platform regime. While the VSP and online safety regimes have similar objectives, the new framework in the Bill will be broader and will apply to a wider range of online platforms. It is for this reason that we will repeal the VSP regime and transition those entities regulated as VSPs across to the online safety regime, which is broader and more effective in its provisions. The clause simply sets out the intention to repeal the VSP regime.

Clause 171 repeals part 3 of the Digital Economy Act 2017. As we have discussed previously, the Online Safety Bill now captures all online sites that display pornography, including commercial pornography sites, social media sites, video sharing platforms, forums and search engines. It will provide much greater protection to children than the Digital Economy Act. The Digital Economy Act was criticised for not covering social media platforms, which this Bill does cover. By removing that part of the Digital Economy Act, we are laying the path to regulate properly and more comprehensively.

Finally, in this group, clause 172 amends section 1B of the Protection of Children Act 1978 and creates a defence to the offence of making an indecent photograph of a child for Ofcom, its staff and those assisting Ofcom in exercising its online safety duties. Clearly, we do not want to criminalise Ofcom staff while they are discharging their duties under the Bill that we are imposing on them, so it is reasonable to set out that such a defence exists. I hope that provides clarity to the Committee on the three clauses.

The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented, which means five years during which children could have been better protected from the harms of pornographic content.

When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his

“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]

in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?

Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.

The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.

Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If one Act is to repeal another, we need to make sure that there is no gap in the middle: if the repeal takes place on one day, the Bill’s provisions that replace it should be in force and working on the same day, rather than leaving a potential set-up time gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, if the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and there is a concern that there could be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how that could be improved? We do not want to see this getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in, when Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

I agree with the hon. Member for Worsley and Eccles South, that we want to see these provisions brought into force as quickly as possible, for the reasons that she set out. We are actively thinking about ways of ensuring that these provisions are brought into force as fast as possible. It is something that we have been actively discussing with Ofcom, and that, I hope, will be reflected in the road map that it intends to publish before the summer. That will of course remain an area of close working between the Department for Digital, Culture, Media and Sport and Ofcom, ensuring that these provisions come into force as quickly as possible. Of course, the illegal duties will be brought into force more quickly. That includes the CSEA offences set out in schedule 6.

The hon. Member for Aberdeen North raised questions in relation to the repeal of part 3 of the Digital Economy Act. Although that is on the statute book, it was never commenced. When it is repealed, we will not be removing from force something that is applied at the moment, because the statutory instrument to commence it was never laid. So the point she raised about whether the Bill would come into force the day after the Digital Economy Act is repealed does not apply; but the point she raised about bringing this legislation into force quickly is reasonable and right, and we will work on that.

The hon. Lady asked about the differences in scope between the video-sharing platform and the online safety regime. As I said, the online safety regime does have an increased scope compared with the VSP regime, but I think it is reasonable to keep an eye on that as she suggested, and keep it under review. There is of course a formal review mechanism in clause 149, but I think that more informally, it is reasonable that as the transition is made we keep an eye on it, as a Government and as parliamentarians, to ensure that nothing gets missed out.

I would add that, separately from the Bill, the online advertising programme is taking a holistic look at online advertising in general, and it will also consider matters that may touch on VSPs and how they are regulated.

Question put and agreed to.

Clause 170 accordingly ordered to stand part of the Bill.

Clauses 171 and 172 ordered to stand part of the Bill.

Clause 173

Powers to amend section 36

Question proposed, That the clause stand part of the Bill.

The clause gives the Secretary of State the power to amend the list of fraudulent offences in section 36 in relation to the duties on fraudulent advertising. These are the new duties that were introduced following feedback from Parliament, the Joint Committee, Martin Lewis and many other people. That is to ensure that we can keep the list of fraudulent offences up to date. The power to make those changes is subject to some constraints, as we would expect. The clause lists the criteria that any new offences must meet before the Secretary of State can include them in the section 36 list, which relate to the prevalence on category 1 services of paid-for advertisements that amount to the new offence, and the risk and severity of harm that such content poses to individuals in the UK.

The clause further limits the Secretary of State’s power to include new fraud offences, listing types of offence that may not be added. Offences from the Consumer Protection from Unfair Trading Regulations would be one instance. As I mentioned, the power to update section 36 is necessary to ensure that the legislation is future-proofed against new legislation and changes in criminal behaviour. Hon. Members have often said that it is important to ensure that the Bill is future-proof, and here is an example of exactly that future-proofing.

Clause 174, also in this group, sets out that the Bill includes a number of particular exemptions to ensure that it remains targeted and proportionate. We recognise that certain kinds of content and services are currently low risk and merit an exemption from the framework, but in the future that may change, and if the risk level does change, parliamentarians and the public will expect us to bring those into the scope of the Bill. Again, this is an example of future-proofing, so that if the world changes in a way that we have not anticipated, the Bill can be updated to ensure that nothing slips through the net. Again, that is consistent with what members of the Committee have been saying about the need to ensure that the Bill is future-proof.

Clause 175 provides for updating the list of categories of education and childcare providers. That is essential to ensure that the exemption in paragraph 10 of schedule 1 continues to apply to the correct providers over time, which I think, again, is a reasonable proposition.

Clause 176 provides powers to amend schedules 5, 6 and 7. Those schedules, as colleagues will recall, cover priority criminal offences, which is schedule 7, child sexual exploitation and abuse offences, which is schedule 6, and terrorism offences, which is schedule 5. Clearly, if new offences are created or if there are existing offences that Parliament believes need to be added to these priority lists of offences, we need the flexibility to do that. An example might be a new offence created by a devolved Administration, a new offence that Parliament here at Westminster legislates for that we think needs to be a priority offence, or an existing offence that is not on the list now but in the future we think needs to be added to ensure that platforms proactively protect the public. We need this flexibility. Again, this speaks to the future-proofing of the Bill that Members have spoken about. It is an extremely important aspect of the Bill’s ability to respond to threats that may emerge in the future and to new legislation.

Good morning, Sir Roger. As the Minister has outlined, clause 173 gives the Secretary of State the power to amend the list of fraud offences in what will be section 36 in relation to the duties about fraudulent advertising. Although we recognise that this power is subject to some constraints, Labour has concerns about what we consider to be an unnecessary power given to the Secretary of State to amend duties about fraudulent advertising on category 1 services.

We welcome the provisions outlined in clause 173(2), which lists the criteria that any new offences must meet before the Secretary of State may include them in the list of fraud offences in section 36. The Minister outlined some of those. Along the same lines, the provision in clause 173(3) to further limit the Secretary of State’s power to include new fraud offences—it lists types of offences that may not be added to section 36—is a positive step.

However, we firmly believe that delegated law making of this nature, even when there are these minor constraints in place, is a worrying course for the Government to pursue when we have already strongly voiced our concerns about Ofcom’s independence. Can the Minister alleviate our concerns by clarifying exactly how this process will work in practice? He must agree with the points that colleagues from across the House have made about the importance of Ofcom being truly independent and free from any political persuasion, influence or control. We all want to see the Bill change things for the better, so I am keen to hear from the Minister the specific reasoning behind giving the Secretary of State the power to amend this important legislation through what will seemingly be a simple process.

As we all know, clause 174 allows the Secretary of State to make regulations to amend or repeal provisions relating to exempt content or services. Regulations made under this clause can be used to exempt certain content or services from the scope of the regulatory regime, or to bring them into scope. It will come as no surprise to the Minister that we have genuine concerns about the clause, given that it gives the Secretary of State of the day the power to amend the substantive scope of the regulatory regime. In layman’s terms, we see this clause as essentially giving the Secretary of State the power to, through regulations, exempt certain content and services from the scope of the Bill, or bring them into scope. Although we agree with the Minister that a degree of flexibility is crucial to the Bill’s success and we have indeed raised concerns throughout the Bill’s proceedings about the need to future-proof the Bill, it is a fine balance, and we feel that these powers in this clause are in excess of what is required. I will therefore be grateful to the Minister if he confirms exactly why this legislation has been drafted in a way that will essentially give the Secretary of State free rein on these important regulations.

Clauses 175 and 176 seek to give the Secretary of State additional powers, and again Labour has concerns. Clause 175 gives the Secretary of State the power to amend the list in part 2 of schedule 1, specifically paragraph 10. That list sets out descriptions of education and childcare relating to England; it is for the relevant devolved Ministers to amend the list in their respective areas. Although we welcome the fact that certain criteria must be met before the amendments can be made, this measure once again gives the Secretary of State of the day the ability substantively to amend the scope of the regime more broadly.

Those concerns are felt even more strongly when we consider clause 176, which gives the Secretary of State the power to amend three key areas in the Bill—schedules 5, 6 and 7, which relate to terrorism offences, to child sexual exploitation and abuse content offences—except those extending to Scotland—and to priority offences in some circumstances. Alongside stakeholders, including Carnegie, we strongly feel that the Secretary of State should not be able to amend the substantive scope of the regime at this level, unless moves have been initiated by Ofcom and followed by effective parliamentary oversight and scrutiny. Parliament should have a say in this. There should be no room for this level of interference in a regulatory regime, and the Minister knows that these powers are at risk of being abused by a bad actor, whoever the Secretary of State of the day may be. I must, once again, press the Minister to specifically address the concerns that Labour colleagues and I have repeatedly raised, both during these debates and on Second Reading.

I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.

My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?

I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.

Let me start by addressing the questions the shadow Minister raised about these powers. She used the phrase “free rein” in her speech, but I would not exactly describe it as free rein. If we turn to clause 179, which we will come to in a moment or two, and subsection (1)(d), (e), (f) and (g), we see that all the regulations made under clauses 173 to 176, which we are debating, require an SI under the affirmative procedure. Parliament will therefore get a chance to have its say, to object and indeed to vote down a provision if it wishes to. It is not that the Secretary of State can act alone; changes are subject to the affirmative SI procedure.

It is reasonable to have a mechanism to change the lists of priority offences and so on by affirmative SI, because the landscape will change and new offences will emerge, and it is important that we keep up to date. The only alternative is primary legislation, and a slot for a new Act of Parliament does not come along all that often—perhaps once every few years for any given topic. I think that would lead to long delays—potentially years—before the various exemptions, lists of priority offences and so on could be updated. I doubt that that is Parliament’s intention, and it would not be good for the public if we had to wait for primary legislation to change the lists. The proposed mechanism is the only sensible and proportionate way to do it, and it is subject to a parliamentary vote.

A comment was made about Ofcom’s independence. The way the offences are defined has no impact on Ofcom’s operational independence. That is about how Ofcom applies the rules; this is about what the rules themselves are. It is right that we are able to update them relatively nimbly by affirmative SI.

The hon. Member for Aberdeen North asked about the differences in the way schedules 6 and 7 can be updated. I will happily drop her a line with further thoughts if she wants me to, but in essence we are happy to get the Scottish child sexual exploitation and abuse offences, set out in part 2 of schedule 6, adopted as soon as Scottish Ministers want. We do not want to delay any measures on child exploitation and abuse, and that is why it is done automatically. Schedule 7, which sets out the other priority offences, could cover any topic at all—any criminal offence could fall under that schedule—whereas schedule 6 is only about child sexual exploitation and abuse. Given that the scope of schedule 7 takes in any criminal offence, it is important to consult Scottish Ministers if it is a Scottish offence but then use the statutory instrument procedure, which applies it to the entire UK internet. Does the hon. Lady want me to write to her, or does that answer her question?

I am grateful to the hon. Lady for saving DCMS officials a little ink, and electricity for an email.

I hope I have addressed the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 173 accordingly ordered to stand part of the Bill.

Clauses 174 and 175 ordered to stand part of the Bill.

Clause 176

Powers to amend Schedules 5, 6 and 7

Amendment made: 126, in clause 176, page 145, line 4, at end insert—

“(5A) The Secretary of State must consult the Scottish Ministers before making regulations under subsection (3) which—

(a) add an offence that extends only to Scotland, or

(b) amend or remove an entry specifying an offence that extends only to Scotland.

(5B) The Secretary of State must consult the Department of Justice in Northern Ireland before making regulations under subsection (3) which—

(a) add an offence that extends only to Northern Ireland, or

(b) amend or remove an entry specifying an offence that extends only to Northern Ireland.”—(Chris Philp.)

This amendment ensures that the Secretary of State must consult the Scottish Ministers or the Department of Justice in Northern Ireland before making regulations which amend Schedule 7 in connection with an offence which extends to Scotland or Northern Ireland only.

Clause 176, as amended, ordered to stand part of the Bill.

Clause 177

Power to make consequential provision

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to discuss the following:

Clause 178 stand part.

Government amendment 160.

Clause 179 stand part.

As new services and functions emerge and evolve, and platforms and users develop new ways to interact online, the regime will need to adapt. Harms online will also continue to change, and the framework will not function effectively if it cannot respond to these changes. These clauses provide the basis for the exercise of the Secretary of State’s powers under the Bill to make secondary legislation. The Committee has already debated the clauses that confer the relevant powers.

Clause 177 gives the Secretary of State the power to make consequential changes to this legislation or regulations made under it. It further provides that the regulations may amend or repeal relevant provisions made under the Communications Act 2003 or by secondary legislation made under that Act. The power is necessary to give effect to the various regulation-making powers in the Bill, which we have mostly already debated, and to ensure that the provisions of the 2003 Act and regulations that relate to online safety can continue to be updated as appropriate. That is consistent with the principle that the Bill must be flexible and future-proof. The circumstances in which these regulation-making powers may be exercised are specified and constrained by the clauses we have previously debated. Clause 178 ensures that the regulation-making powers in the Bill may make different provisions for different purposes, in particular ensuring that regulations make appropriate provisions for different types of service.

Amendment 160 forms part of a group of amendments that will allow Ofcom to recover costs from the regulated services for work that Ofcom carries out before part 6 of the Bill is commenced. As I said previously, the costs may be recouped over a period of three to five years. Currently, the costs of preparations for the exercise of safety functions include only costs incurred after commencement. The amendment makes sure that initial costs incurred before commencement can be recouped as well.

Clause 179 sets out the procedure that should be used for the regulation-making powers in the Bill to ensure that the procedures used are proportionate to the power conferred. We have ensured that all so-called Henry VIII powers, which enable secondary legislation to be used to amend primary legislation, are subject to the affirmative procedure. That way, Parliament will have proper oversight of any changes to this legislation or other Acts of Parliament. I accept the shadow Minister’s point that this cannot be a matter for the Secretary of State acting alone; proper parliamentary approval is needed. That is why clause 179 is constructed as it is.

Again, Labour has concerns about clause 177, which gives the Secretary of State a power to make consequential provisions relating to the Bill or regulations under the Bill. As we know, the power is exercised by regulation and includes the ability to amend the Communications Act 2003. I will spare the Committee a repetition of my sentiments, but we do feel that the clause is part of an extremely worrying package of clauses related to the Secretary of State’s powers, which we feel are broadly unnecessary.

We have the same concerns about clause 178, which sets out how the powers to make regulations conferred on the Secretary of State may be used. Although we recognise that it is important in terms of flexibility and future-proofing that regulations made under the Bill can make different provisions for different purposes, in particular relating to different types of service, we are concerned about the precedent that this sets for future legislation that relies on an independent regulatory system.

Labour supports amendment 160, which will ensure that the regulations made under new schedule 2, which we will debate shortly, are subject to the affirmative procedure. That is vital if the Bill is to succeed. We have already expressed our concerns about the lack of scrutiny of other provisions in the Bill, so we see no issue with amendment 160.

The Minister has outlined clause 179, and he knows that we welcome parliamentary oversight and scrutiny of the Bill more widely. We regard this as a procedural clause and have therefore not sought to amend it.

Question put and agreed to.

Clause 177 accordingly ordered to stand part of the Bill.

Clause 178 ordered to stand part of the Bill.

Clause 179

Parliamentary procedure for regulations

Amendment made: 160, in clause 179, page 146, line 13, at end insert “, or

(k) regulations under paragraph 7 of Schedule (Recovery of OFCOM’s initial costs),”.—(Chris Philp.)

This amendment provides that regulations under NS2 are subject to the affirmative procedure.

Clause 179, as amended, ordered to stand part of the Bill.

Clause 180

“Provider” of internet service

Question proposed, That the clause stand part of the Bill.

With this it will be convenient to consider the following:

Clauses 181 to 188 stand part.

Amendment 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.

This amendment clarifies the definition of “content” in the bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.

I will address clauses 180 to 182 together, before moving on to discuss our concerns about the remaining clauses in this group.

As we know, clause 180 determines who is the provider of an internet service and therefore who is subject to the duties imposed on providers. Labour has already raised concerns about the Bill’s lack of future-proofing and its inability to incorporate internet services that may include user-to-user models. The most obvious of those are user-to-user chat functions in gaming, which the hon. Member for Aberdeen North has raised on a number of occasions; we share her concerns.

Broadly, we think the Bill as it stands fails to capture the rapidity of technological advances, and the gaming industry is a key example of this. The Bill targets the providers that have control over who may use the user-to-user functions of a game, but in our view the clarity just is not there for emerging tech in the AI space in particular, so we would welcome the Minister’s comments on where he believes this is defined or specified in the Bill.

Clause 181 defines “user”, “United Kingdom user” and “interested person” in relation to regulated services. We welcome the clarification outlined in subsections (3) and (4) of the role of an employee at a service provider and their position when uploading content. We support the clarity on the term “internet service” in clause 182, and we welcome the provisions to capture services that are accessed via an app specifically, rather than just via an internet browser.

We welcome clause 183, which sets out the meaning of “search engine”. It is important to highlight the difference between search engines and user-to-user services, which has been attempted throughout the Bill. We heard from Google about its definition of “search”, and Labour agrees that, at their root, search services exist as an index of the web, and are therefore different from user-to-user services. We also fully appreciate the rapid nature of the internet—hundreds of web pages are created every single second—meaning that search services have a fundamental role to play in assisting users to find authoritative information that is most relevant to what they are seeking. Although search engines do not directly host content, they have an important role to play in ensuring that a delicate balance is maintained between online safety and access to lawful information. We are therefore pleased to support clause 183, which we feel broadly outlines the responsibilities placed on search services more widely.

On clause 184, Labour supports the need for a proactive technology to be used by regulated service providers to comply with their duties on illegal content, content that is harmful to children, and fraudulent advertising. In our consideration of proactive technology elsewhere in the Bill, Labour has made it clear that we support measures to keep us all safe. When speaking to new clause 20, which we debated with clause 37, I made it clear that we disagree with the Bill’s stance on proactive technology. As it is, the Bill will leave Ofcom unable to proactively require companies to use technology that can detect child abuse. Sadly, I was not particularly reassured by the Minister’s response, but it is important to place on the record again our feeling that proactive technology has an important role to play in improving online safety more widely.

Clause 185 provides information to assist Ofcom in its decision making on whether, in exercising its powers under the Bill, content is communicated publicly or privately. We see no issues with the process that the clause outlines. It is fundamentally right that, in the event of making an assessment of public or private content, Ofcom has a list of factors to consider and a subsequent process to follow. We will therefore support clause 185, which we have not sought to amend.

Clause 186 sets out the meaning of the term “functionality”. Labour supports the clause, particularly the provisions in subsection (2), which include the detailed ways in which platforms’ functionality can affect subsequent online behaviours. Despite our support, I put on the record our concern that the definitions in the clause do little to imagine or capture the broad nature of platforms or, indeed, the potential for them to expand into the AI space in future.

The Minister knows that Labour has advocated a systems-based approach to tackling online safety that would put functionality at the heart of the regulatory system. It is a frustrating reality that those matters are not outlined until clause 186. That said, we welcome the content of the clause, which we have not sought to amend.

Clause 187 aims to define “harm” as “physical or psychological harm”. Again, we feel that that definition could go further. My hon. Friend the Member for Batley and Spen spoke movingly about her constituent Zach in an earlier debate, and made a compelling case for clarity on the interplay between the physical and psychological harm that can occur online. The Minister said that the Government consider the Bill to cover a range of physical and psychological harms, but many charities disagree. What does he say to them?

We will shortly be considering new clause 23, and I will outline exactly how Labour feels that the Bill fails to capture the specific harms that women and girls face online. It is another frustrating reality that the Government have not taken the advice of so many stakeholders, and of so many women and girls, to ensure that those harms are on the face of the Bill.

Labour agrees with the provisions in clause 188, which sets out the meaning of “online safety functions” and “online safety matters”, so we have not sought to amend the clause.

Would it be appropriate for me to speak to the SNP amendment as well, Sir Roger?

Not really. If the hon. Lady has finished with her own amendments, we should, as a courtesy, allow the SNP spokesperson to speak to her amendment first.

Thank you, Sir Roger. I thank the shadow Minister for running through some of our shared concerns about the clauses. Similarly, I will talk first about some of the issues and questions that I have about the clauses, and then I will speak to amendment 76. Confusingly, amendment 76 was tabled to clause 189, which we are not discussing right now. I should have raised that when I saw the provisional selection of amendments. I will do my best not to stray too far into clause 189 while discussing the amendment.

I have raised before with the Minister some of the questions and issues that I have. Looking specifically at clause 181, I very much appreciate the clarification that he has given us about users, what the clause actually means, and how the definition of “user” works. To be fair, I agree with the way the definition of “user” is written. My slight concern is that, in measuring the number of users, platforms might find it difficult to measure the number of unregistered users and the number of users who are accessing the content through another means.

Let us say, for example, that someone is sent a WhatsApp message with a TikTok link and they click on that. I do not know whether TikTok has the ability to work out who is watching the content, or how many people are watching it. Therefore, I think that TikTok might have a difficulty when it comes to the child safety duties and working out the percentage or number of children who are accessing the service, because it will not know who is accessing it through a secondary means.

I am not trying to give anyone a get-out clause. I am trying to ensure that Ofcom can properly ensure that platforms that have a significant number of children accessing them through secondary means are still subject to the child safety duties even though there may not be a high number of children accessing the platform or the provider directly. My major concern is assessing whether they are subject to the child safety duties laid out in the Bill.

I will move straight on to our amendment 76, which would amend the definition of “content” in clause 189. I have raised this issue with the Minister already. The clause, as amended, would state that

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including but not limited to”—

and then a list. The reason I suggest that we should add those words “but not limited to” is that if we are to have a list, we should either make an exhaustive list or have clarity that there are other things that may not be on the list.

I understand that it could be argued that the word “including” suggests that the provision actually goes much wider than what is in the list. I understand that that is the argument that the Minister may make, but can we have some more clarity from him? If he is not willing to accept the amendment but he is willing to be very clear that, actually, the provision does include things that we have not thought of and that do not currently exist and that it genuinely includes anything communicated by means of an internet service, that will be very helpful.

I think that the amendment would add something positive to the Bill. It is potentially the most important amendment that I have tabled in relation to future-proofing the Bill, because it does feel as though the definition of “content”, even though it says “including”, is unnecessarily restrictive and could be open to challenge should someone invent something that is not on the list and say, “Well, it’s not mentioned, so I am not going to have to regulate this in the way we have to regulate other types of content.”

I have other questions about the same provision in clause 189, but I will hold on to those until we come to the next grouping.

I rise briefly to support amendment 76, in the name of the hon. Member for Aberdeen North. Labour supports broadening the definition of “content” in this way. I refer the Minister to our earlier contributions about the importance of including newspaper comments, for example, in the scope of the Bill. This is a clear example of a key loophole in the Bill. We believe that a broadened definition of “content” would be a positive step forward to ensure that there is future-proofing, to prevent any unnecessary harm from any future content.

The shadow Minister, in her first contribution to the debate, introduced the broad purpose of the various clauses in this group, so I do not propose to repeat those points.

I would like to touch on one or two issues that came up. One is that clause 187 defines the meaning of “harm” throughout the Bill, although clause 150, as we have discussed, has its own internal definition of harm that is different. The more general definition of harm is made very clear in clause 187(2), which states:

“‘Harm’ means physical or psychological harm.”

That means that harm has a very broad construction in the Bill, as it should, to make sure that people are being protected as they ought to be.

The shadow Minister made a point about women and girls. I suspect that we will debate this in some detail when we reach one of the new clauses that she has tabled, which I guess we will come to on Tuesday. I do not want to speak to this issue at length, given that we will probably discuss it more then, but I will make a couple of brief points.

On the risk assessment duties from which the safety duties flow, as we have debated previously, clause 10(6)(d) states that the risk assessments have to cover

“individuals with a certain characteristic or members of a certain group”,

which obviously includes women and girls, meaning that matters that are particular to women and girls—or, indeed, to other groups, such as ethnic minorities, people with a particular sexual orientation and so on—will have to be addressed in those risk assessment duties, and that will then flow through into the other safety duties.

The same applies to the safety duties relating to adults in clause 12(5)(d). Again, an individual’s characteristics, which include gender, have to be properly taken into account. I will also mention in passing that many of the priority offences in schedule 7 are offences where women are overwhelmingly likely to be the victims, such as harassment, stalking and so on, but I suspect that we will debate this issue in much more detail on Tuesday, so we can go through these points then.

I understand the point of amendment 76 to clause 189. We agree that the Bill should cover matters that are not on the list, but the word “including” does not limit what can be included to the things that follow it. It can include other things as well—it is not restrictive—so we do not think the words “but not limited to” need to be added, particularly when the very first sentence of the definition says that

“‘content’ means anything communicated by means of an internet service”.

If a judge or Ofcom comes to interpret the clause in due course, the use of the word “anything” in the previous line is very important, because “anything” is universal and, as the word suggests, means absolutely everything. The word “including” follows “anything”, so it is clear that the list of items that follows—I am happy to put this on the record—is not an exclusive or exhaustive list. Indeed, it could not possibly be; if it was, the “anything” in the previous sentence would not work.

On the definition of “user” and the numbers, the duties to protect children apply to children who access content “by means of” a site, which includes those who access it through one site and on to another, as we have discussed previously. The first point of access, from which someone may then go to a second site, would, of course, be tracking the numbers, and that would then get caught. Beyond that, even where a site has people looking at its content who are not registered, very often it will be tracking them by way of cookies. The main point is that the duty is on the primary access point to protect children who are accessing content through its site, and to keep track of the numbers for the purpose of working out whether the “significant” test is met.

I hope I have responded to the points raised. Obviously, I believe that clauses 181 to 189 should stand part of the Bill. Although I agree with the intent behind amendment 76, I do not think it is necessary, because the existing drafting already achieves what the hon. Member for Aberdeen North seeks to achieve.

Question put and agreed to.

Clause 180 accordingly ordered to stand part of the Bill.

Clauses 181 to 188 ordered to stand part of the Bill.

Clause 189

Interpretation: general

Amendment 111 is not claimed; it has been tabled by the hon. Member for Stroud (Siobhan Baillie), who is not a member of the Committee. I am assuming that nobody wishes to take ownership of it and we will not debate it.

If the hon. Member for Aberdeen North wishes to move amendment 76, she will be able to do so at the end of the stand part debate.

Question proposed, That the clause stand part of the Bill.

As we know, the clause sets out the meanings of various terms used in the Bill. Throughout our Committee debates, Labour has raised fundamental concerns on a number of points where we feel the interpretation of the Bill requires clarification. We raised concerns as early as clause 8, when we considered the Bill’s ability to capture harm in relation to newly produced CSEA content and livestreaming. The Minister may feel he has sufficiently reassured us, but I am afraid that simply is not the case. Labour has no specific issues with the interpretations listed in clause 189, but we will likely seek to table further amendments on Report in the areas that we feel require clarification.

In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to

“one-to-one live aural communications”

in defining things that are excluded.

I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.

I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.

More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.

As the hon. Member for Pontypridd said, clause 189 sets out various points of definition and interpretation necessary for the Bill to be understood and applied.

I turn to the question raised by the hon. Member for Aberdeen North. First, I strongly commend and congratulate her on having noticed the use of the two words. Anyone who thinks that legislation does not get properly scrutinised by Parliament has only to look to the fact that she spotted this difference, 110 pages apart, in two different clauses—clauses 49 and 189. That shows that these things do get properly looked at. I strongly congratulate her on that.

I think the best way of addressing her question is probably to follow up with her after the sitting. Clause 49 relates to regulated user-to-user content. We are in clause 49(2)—is that right?

It is cross-referenced in subsection (5). The use of the term “aural” in that subsection refers to sound only—what might typically be considered telephony services. “Oral” is taken to cover livestreaming, which includes pictures and voice. That is the intention behind the use of the two different words. If that is not sufficient to explain the point—it may not be—I would be happy to expand in writing.

That would be helpful, in the light of the concerns I raised and what the hon. Member for Pontypridd mentioned about gaming, and how those communications work on a one-to-one basis. Having clarity in writing on whether clause 49 relates specifically to telephony-type services would be helpful, because that is not exactly how I read it.

Given that the hon. Lady has raised the point, it is reasonable that she requires more detail. I will follow up in writing on that point.

Amendment proposed: 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.(Kirsty Blackman.)

This amendment clarifies the definition of “content” in the Bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.

Question put, That the amendment be made.

Question accordingly negatived.

Clause 189 ordered to stand part of the Bill.

Clause 190

Index of defined terms

Question proposed, That the clause stand part of the Bill.

Labour has not tabled any amendments to clause 190, which lists the provisions that define or explain terms used in the Bill. However, it will come as no surprise that we dispute the Bill’s definition of harm, and I am grateful to my hon. Friend the Member for Batley and Spen for raising those important points in our lively debate about amendment 112 to clause 150. We maintain that the Minister has missed the point, in that the Bill’s definition of harm fails to truly capture physical harm caused as a consequence of being online. I know that the Minister has promised to closely consider that as we head to Report stage, but I urge him to bear in mind the points raised by Labour, as well as his own Back Benchers.

The Minister knows, because we have repeatedly raised them, that we have concerns about the scope of the Bill’s provisions relating to priority content. I will not repeat myself, but he will be unsurprised to learn that this is an area in which we will continue to prod as the Bill progresses through Parliament.

I have made points on those issues previously. I do not propose to repeat now what I have said before.

Question put and agreed to.

Clause 190 accordingly ordered to stand part of the Bill.

Clause 191 ordered to stand part of the Bill.

Clause 192

Extent
I beg to move amendment 141, in clause 192, page 160, line 9, at end insert—

“(aa) section (Offence under the Obscene Publications Act 1959: OFCOM defence);”.

This amendment provides for NC35 to extend only to England and Wales.

With this it will be convenient to discuss Government new clause 35—Offence under the Obscene Publications Act 1959: OFCOM defence

“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).

(2) After subsection (5) insert—

“(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—

(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and

(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.”

(3) In subsection (7)—

(a) the words after “In this section” become paragraph (a), and

(b) at the end of that paragraph, insert “;

(b) “OFCOM” means the Office of Communications.””

This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.

New clause 35 amends section 2 of the Obscene Publications Act 1959 to create a defence for Ofcom to the offence of publishing an obscene article where Ofcom is exercising its online safety functions. Ofcom has a range of functions that may result in its staff handling such content, so we want to ensure that that is covered properly. We have debated that already.

Clause 192 covers territorial extent. The regulation of the internet, as a reserved matter, covers the whole of the United Kingdom, but particular parts of the Bill extend to particular parts of the UK. In amending the Obscene Publications Act, we are ensuring that the Bill applies to the relevant parts of the United Kingdom, because that Act has a different area of applicability. The clause and our amendment are important in ensuring that that is done in the right way.

The clause provides that the Bill extends to England, Wales, Scotland and Northern Ireland, subject to the exceptions set out in subsections (2) to (7). We welcome clarification of how the devolved nations may be affected by the provisions of the Bill—that is of particular importance to me as a Welsh MP. It is important to clarify how amendments or appeals, as outlined in subsection (7), may work in the context of devolution more widely.

Labour also supports new clause 35 and Government amendment 141. Clearly, those working for Ofcom should have a defence to the offence of publishing obscene articles as, sadly, we see that as a core part of establishing the online safety regime in full. We know that having such a defence available is likely to be an important part of the regulator’s role and that of its employees. Labour is therefore happy to support this sensible new clause and amendment.

The Opposition spokesperson has said it all.

Amendment 141 agreed to.

Clause 192, as amended, ordered to stand part of the Bill.

Clause 193

Commencement and transitional provision

Amendment 139 was tabled by a Member who is not a member of the Committee, and nobody has claimed it, so we come to amendment 49.

I beg to move amendment 49, in clause 193, page 161, line 1, leave out subsection (2) and insert—

“(2) Subject to subsection (2A) below, the other provisions of this Act come into force on such day as the Secretary of State may by regulations appoint.

(2A) The provisions of Part 5 shall come into force at the end of the period of three months beginning with the day on which this Act is passed.”

This amendment would bring Part 5 into force three months after the Act is passed.

We all understand the need for the Bill, which is why we have been generally supportive in Committee. I hope we can also agree that the measures that the Bill introduces must come into force as soon as is reasonably possible. That is particularly important for the clauses introducing protections for children, who have been subject to the harms of the online world for far too long already. I was glad to hear the Minister say in our discussions of clauses 31 to 33 that the Government share the desire to get such protections in place quickly.

My hon. Friend the Member for Worsley and Eccles South also spoke about our concerns about the commencement and transitional provisions when speaking to clauses 170 to 172. We fundamentally believe that the provisions on pornography in part 5 cannot, and should not, be susceptible to further delay, because they require no secondary legislation. I will come to that point in my comments on the amendment. More broadly, I will touch briefly on the reasons why we cannot wait for the legislation and make reference to a specific case that I know colleagues across the House are aware of.

My hon. Friend the Member for Reading East (Matt Rodda) has been a powerful voice on behalf of his constituents Amanda and Stuart Stephens, whose beloved son Olly was tragically murdered in a field outside his home. A BBC “Panorama” investigation, shown only a few days ago, investigated the role that social media played in Olly’s death. It specifically highlighted disturbing evidence that some social media algorithms may still promote violent content to vulnerable young people. That is another example highlighting the urgent need for the Bill, along with a regulatory process to keep people safe online.

We also recognise, however, the important balance between the need for effective development of guidance by Ofcom, informed by consultation, and the need to get the duties up and going. In some cases, that will mean having to stipulate deadlines in the Bill, which we feel is a serious omission and oversight at present.

The amendment would bring part 5 of the Bill into force three months after it is enacted. The Minister knows how important part 5 is, so I do not need to repeat myself. The provisions of the amendment, including subsequent amendments that Labour and others will likely table down the line, are central to keeping people safe online. We have heard compelling evidence from experts and speeches from colleagues across the House that have highlighted how vital it is that the Bill goes further on pornographic content. The amendment is simple. It seeks to make real, meaningful change as soon as is practically possible. The Bill is long delayed, and providers and users are desperate for clarity and positive change, which is what led us to tabling the amendment.

In the interests of not having to make a speech in this debate, I want to let the hon. Member know that I absolutely support the amendment. It is well balanced, brings the most important provisions into force as soon as possible, and allows the Secretary of State to appoint dates for the others.

I welcome the hon. Member’s intervention, and I am grateful for her and her party’s support for this important amendment.

It is also worth drawing colleagues’ attention to the history of issues, which have been brought forward in this place before. We know there was reluctance on the part of Ministers when the Digital Economy Act 2017 was on the parliamentary agenda to commence the all-important part 3, which covered many of the provisions now in part 5. Ultimately, the empty promises made by the Minister’s former colleagues have led to huge, record failures, even though the industry is ready, having had years to prepare to implement the policy. I want to place on record my thanks to campaigning groups such as the Age Verification Providers Association and others, which have shown fierce commitment in getting us this far.

It might help if I cast colleagues’ minds back to the Digital Economy Act 2017, which received Royal Assent in April of that year. Following that, in November 2018, the then Minister of State for Digital and Creative Industries told the Science and Technology Committee that part 3 of the DEA would be in force “by Easter next year”. Then, in December 2018, both Houses of Parliament approved the necessary secondary legislation, the Online Pornography (Commercial Basis) Regulations 2018, and the required statutory guidance.

Shortly after, in April 2019, the first delay arose when the Government published an online press release stating that part 3 of the DEA would not come into force until 15 July 2019. However, June 2019 came around and still there was nothing. On 20 June, before part 3 was even due to come into force, the then Under-Secretary of State told the House of Lords that the Government had failed to notify the European Commission of the statutory guidance, which would need to be done, and that that would result in a delay to the commencement of part 3

“in the region of six months”.—[Official Report, House of Lords, 20 June 2019; Vol. 798, c. 883.]

However, on 16 October 2019, the then Secretary of State announced via a written statement to Parliament that the Government

“will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography.”—[Official Report, 16 October 2019; Vol. 666, c. 17WS.]

A mere 13 days later, the Government called a snap general election. I am sure those are pretty staggering realities for the Minister to hear—and defend—but I am willing to listen to his defence. It really is not good enough. The industry is ready, the technology has been there for quite some time, and, given this Government’s fondness for a U-turn, there are concerns that part 5 of the Bill, which we have spent weeks deliberating, could be abandoned just as part 3 of the DEA was.

The Minister has failed to concede on any of the issues we have raised in Committee. It seems we are dealing with a Government who are ignoring the wide-ranging gaps and issues in the Bill. He has a relatively last-ditch opportunity to at least bring about some positive change, and to signify that he is willing to admit that the legislation as it stands is far from perfect. The provisions in part 5 are critical—they are probably the most important in the entire Bill—so I urge him to work with Labour to make sure they are put to good use in a more than reasonable timeframe.

On the implementation of part 3 of the Digital Economy Act 2017, all the events that the shadow Minister outlined predated my time in the Department. In fact, apart from the last few weeks of the period she talked about, the events predated my time as a Minister in different Departments, and I cannot speak for the actions and words of Ministers prior to my arrival in DCMS. What I can say, and I have said in Committee, is that we are determined to get the Bill through Parliament and implemented as quickly as we can, particularly the bits to do with child safety and the priority illegal content duties.

The shadow Minister commented at the end of her speech that she thought the Government had been ignoring parliamentary opinion. I take slight issue with that, given that we published a draft Bill in May 2021 and went through a huge process of scrutiny, including by the Joint Committee of the Commons and the Lords. We accepted 66 of the Joint Committee’s recommendations, and made other very important changes to the Bill. We have made changes such as addressing fraudulent advertising, which was previously omitted, and including commercial pornography—meaning protecting children—which is critical in this area.

The Government have made a huge number of changes to the Bill since it was first drafted. Indeed, we have made further changes while the Bill has been before the Committee, including amending clause 35 to strengthen the fraudulent advertising duties on large search companies. Members of Parliament, such as the right hon. Member for East Ham (Sir Stephen Timms), raised that issue on Second Reading. We listened to what was said at that stage and we made the changes.

There have also been quite a few occasions during these Committee proceedings when I have signalled—sometimes subtly, sometimes less so—that there are areas where further changes might be forthcoming as the Bill proceeds through both Houses of Parliament. I do not think the hon. Member for Pontypridd, or any member of the Committee, should be in any doubt that the Government are very open to making changes to the Bill where we are able to and where they are right. We have done so already and we might do so again in the future.

On the specifics of the amendment, we share the intention to protect children from accessing pornography online as quickly as possible. The amendment seeks to set a three-month timeframe within which part 5 must come into force. However, an important consideration for the commencement of part 5 will be the need to ensure that all kinds of providers of online pornography are treated the same, including those hosting user-generated content, which are subject to the duties of part 3. If we take a piecemeal approach, bringing into force part 5, on commercial pornography, before part 3, on user-to-user pornography, that may enable some of the services, which are quite devious, to simply reconfigure their services to circumvent regulation or cease to be categorised as part 5 services and try to be categorised as part 3 services. We want to do this in a comprehensive way to ensure that no one will be able to wriggle out of the provisions in the Bill.

Parliament has also placed a requirement on Ofcom to produce, consult on and publish guidance for in-scope providers on meeting the duties in part 5. The three-month timescale set out in the amendment would be too quick to enable Ofcom to properly consult on that guidance. It is important that the guidance is right; if it is not, it may be legally challenged or turn out to be ineffective.

I understand the need to get this legislation implemented quickly. I understand the scepticism that flows from the long delays and eventual cancellation of part 3 of the Digital Economy Act 2017. I acknowledge that, and I understand where the sentiment comes from. However, I think we are in a different place today. The provisions in the Bill have been crafted to address some of the concerns that Members had about the previous DEA measures—not least the fact that they are more comprehensive, as they cover user-to-user content, which the DEA did not. There is therefore a clear commitment to getting this done, and getting it done fast. However, we also have to get it done right, and I think the process we have set out does that.

The Ofcom road map is expected before the summer. I hope that will give further reassurance to the Committee and to Parliament about the speed with which these things can get implemented. I share Members’ sentiments about needing to get this done quickly, but I do not think it is practical or right to do it in the way set out in amendment 49.

I am grateful for the Minister’s comments. However, I respectfully disagree, given the delays already since 2017. The industry is ready for this. The providers of the age verification services are ready for this. We believe that three months is an adequate timeframe, and it is vital that we get this done as quickly as possible. With that in mind, I will be pushing amendment 49 to a vote.

Question put, That the amendment be made.

Clause 193 ordered to stand part of the Bill.

Clause 194

Short title

Question proposed, That the clause stand part of the Bill.

This very important and concise clause sets out that the Bill, when passed, will be cited as the Online Safety Act 2022, which I hope is prophetic when it comes to the lightning speed of its passage through the House of Lords.

Question put and agreed to.

Clause 194 accordingly ordered to stand part of the Bill.

New Clause 35

Offence under the Obscene Publications Act 1959: OFCOM defence

“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).

(2) After subsection (5) insert—

‘(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—

(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and

(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.’

(3) In subsection (7)—

(a) the words after ‘In this section’ become paragraph (a), and

(b) at the end of that paragraph, insert ‘;

(b) “OFCOM” means the Office of Communications.’”—(Chris Philp.)

This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.

Brought up, read the First and Second time, and added to the Bill.

New Clause 42

Recovery of OFCOM’s initial costs

“Schedule (Recovery of OFCOM’s initial costs) makes provision about fees chargeable to providers of regulated services in connection with OFCOM’s recovery of costs incurred on preparations for the exercise of their online safety functions.”—(Chris Philp.)

This new clause introduces NS2.

Brought up, and read the First time.

With this it will be convenient to discuss Government new clause 43 and Government new schedule 2.

New clause 42 introduces new schedule 2. New clause 43 provides that the additional fees charged to providers under new schedule 2 must be paid into the Consolidated Fund. We discussed that a few days ago. That is where the fees are currently destined, and I owe my right hon. Friend the Member for Basingstoke some commentary on this topic in due course. The Bill already provided that monetary penalties must be paid into the Consolidated Fund; the provisions are now placed into that clause.

New schedule 2, which is quite detailed, makes provision in connection with Ofcom’s ability to recover its initial costs, which we have previously debated. As discussed, it is important not only that the taxpayer is protected from the ongoing costs but also that the set-up costs are recovered. The taxpayer should not have to pay for the regulatory framework; the people who are being regulated should pay, whether the costs are incurred before or after commencement, in line with the “polluter pays” principle. Deep in new schedule 2 is the answer to the question that the hon. Member for Aberdeen North asked a day or two ago about the period over which set-up costs can be recovered, with that period specified as between three and five years. I hope that provides an introduction to the new clauses and the new schedule.

We welcome this grouping, which includes two new clauses and a new schedule. Labour has raised concerns about the future funding of Ofcom more widely, specifically when we discussed groupings on clause 42. The Minister’s response did little to alleviate our concerns about the future of Ofcom’s ability to raise funds to maintain its position as the regulator. Despite that, we welcome the grouping, particularly the provisions in the new schedule, which will require Ofcom to seek to recover the costs it has incurred when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services. This is an important step, which we see as being broadly in line with the kind of mechanisms already in place for other, similar regulatory regimes.

Ultimately, it is right that fees charged to providers under new schedule 2 must be paid into the Consolidated Fund and important that Ofcom can recover its costs before a full fee structure and governance process is established. However, I have some questions for the Minister. How many people has Ofcom hired into roles, and can any of those costs count towards the calculation of fees? We want to ensure that other areas of regulation do not lose out as a consequence. Broadly speaking, though, we are happy to support the grouping and have not sought to table an amendment at this stage.

So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.

Question put and agreed to.

New clause 42 accordingly read a Second time, and added to the Bill.

New Clause 43

Payment of sums into the Consolidated Fund

“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.

(2) In subsection (1), after paragraph (i) insert—

‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;

(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’

(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.

(4) After subsection (3) insert—

‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’

(5) In the heading, omit ‘licence’.”—(Chris Philp.)

This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Establishment of Advocacy Body

“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.

(2) A ‘child user’—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) ‘enforceable requirements’ relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)

This new clause creates a new advocacy body for child users of regulated internet services.

Brought up, and read the First time.

I beg to move, That the clause be read a Second time.

New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.

Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:

“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

A children’s advocacy body would be able to support children in navigating redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.

My hon. Friend is making a really valid point. As I look around the room—I mean this with no disrespect to anybody—I see that we are all of an age at which we do not understand the internet in the same way that children and young people do. Surely, one of the key purposes of the Bill is to make sure that children and young people are protected from harms online, and as the Children’s Commissioner said in her evidence, their voices have to be heard. I am sure that, like me, many Members present attend schools as part of their weekly constituency visits, and the conversations we have with young people are some of the most empowering and important parts of this job. We have to make sure that the voices of the young people who we all represent are heard in this important piece of legislation, and it is really important that we have an advocacy body to ensure that.

I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.

The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:

“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.

The hon. Lady will recall the issue that I raised earlier in the Committee’s deliberations, regarding the importance of victim support that gives people somewhere to go other than the platforms. I think that is what she is now alluding to. Does she not believe that the organisations that are already in place, with the right funding—perhaps from the fines coming from the platforms themselves—would be in a position to do this almost immediately, and that we should not have to set up yet another body, or have I misunderstood what she has said?

I do not think that the right hon. Lady has misunderstood what I said. I said that the new clause would allow the Secretary of State to appoint a new or existing body as the statutory user advocate, so it could very much be either.

New clause 3 would also rebalance the interests of children against the vocal and well-resourced regulated companies. I think that is a key argument for having an advocacy body. Without such a counterbalance, large tech companies could attempt to capture independent expert voices, fund highly selective research with the intent to skew the evidence base, and then challenge regulatory decisions with the evidence base they have created.

Those tactics are not new; similar tactics are used in other regulated sectors, such as the tobacco industry. In line with other sectors, the user advocacy body should be funded by a levy on regulated companies. That would be in line with the “polluter pays” principle in part 6 and would be neutral to the Exchequer—another reason to accept it. Compared with the significant benefits and improved outcomes it would create, the levy would represent only a minimal additional burden on companies.

There is strong support for the creation of a user advocate. Research by the NSPCC shows that 88% of UK adults who responded to a YouGov survey think that it is necessary for the Bill to introduce a requirement for an independent body that can protect the interests of children at risk of online harms, including grooming and child sexual abuse.

It is also a popular option among children. YoungMinds has said that young people do not feel they are being included enough in the drafting of the Bill. It evidenced that with research it undertook, which found that almost 80% of young people aged 11 to 25 surveyed had never even heard of the Bill.

A young woman told the NSPCC why she felt a children’s advocacy body is needed. She is a survivor of online grooming, and it is worth sharing what she said in full, because it is powerful and we have not shared the voices of young people enough. She said:

“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time…He started asking for photos, so I sent some. Then he asked for some explicit photos, so I did that too, and he reciprocated…In my eyes, telling anyone in my life about this man was not an option. We need to stop putting the responsibility on a vulnerable child to prevent crime and start living in a world which puts keeping children safe first. That means putting child safety at the heart of policy. I want a statutory child user advocacy body funded by the industry levy. This would play a vital role in advocating for children’s rights in regulatory debates. Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side. Having a body stand up for the rights of children in such a vulnerable position is invaluable…it is so rare that voices like mine have a chance to be heard by policy makers. Watching pre legislative debates I’ve been struck by how detached from my lived experience they can be”—

that is very much the point that my hon. Friend the Member for Batley and Spen made—

“and indeed the lived experiences of thousands of others. If we want to protect children, we need to understand and represent what they need.”

I hope that the Committee will recognise the bravery of that young woman in speaking about her experiences as a survivor of online grooming. I hope that the Minister will respect the insights she offers and consider the merits of having a user advocacy body to support children and young people experiencing harms online.

I read new clause 3 in conjunction with the starred new clause 44, because it makes sense to consider the funding of the advocacy body, and the benefits of that funding, when discussing the merits of such a body. Part of that is because the funding of the advocacy body, and the fact that it needs to be funded, is key to its operation, and a key reason why we need it.

I have talked at length about how stretched charitable and third sector organisations are just now, about how tight their budgets are and about how they are having to make decisions on what they do or do not pursue. They do not have enough money to pursue everything, so they can do only the most important things. I accept that we have a super-complaints procedure, and I am glad about that, but it is not funded—it does not have the funding that we would hope an advocacy body would have—so it is lacking, and charitable organisations will not necessarily be able to raise all their concerns because they may not have the money, time or resources to go through that procedure.

The advocacy body would be key for doing two things. One, as was mentioned by the right hon. Member for Basingstoke, is providing voices and evidence from victims about what has happened and what needs to change for it not to happen again. The other is looking at emerging threats, thereby protecting not just victims but potential victims. As we begin to see threats emerge on the internet that we have not yet considered, the advocacy body would be the best placed organisation to highlight them to Ofcom.

I will gently push back on the point made by the hon. Member for Batley and Spen. Having been online for 28 years—since I was eight—I feel like I have a pretty good grasp of the internet and one that is not dissimilar to my children’s. More than 20 years ago, I was on forums and MSN Messenger having conversations with guys 20 years older than me, so I have lived experience of this, but I agree with the hon. Member for Worsley and Eccles South that the voices and experiences of children and young people are not central enough, given how important the Bill is for their protection.

I am well aware that I am of the first generation with that experience. In fact, not many people of my age have been on the internet for quite as long—my dad was a very early adopter and bought a modem—and I was given quite a free rein online, which I do not recommend people give their children. I imagine that the majority of people scrutinising and making decisions in Ofcom will not have my experience of seeing and accessing the internet as children. Empathy is all well and good, but that is different from the amplification of voices that a user advocacy body could bring.

We have heard from a number of organisations, including those that have provided oral and written evidence. Apart from the NSPCC and others that we have mentioned, one good organisation that does regular work in this area is Girlguiding, which does an annual survey about girls’ experiences of the internet, and it makes for bleak reading. It shows that young women and girls have overwhelmingly had negative experiences online. However, that must be balanced against how, during the covid lockdowns, when we were all isolated from our friends, our peer groups and the people we normally spend time with, a huge number of those women and girls—particularly those in the most marginalised groups such as LGBTQI—found solace and important community on the internet.

That hammers home how important it is for us to make the internet a safe place, because people need to be able to find that community online. People, particularly young people, need to be able to access friends, groups, advice and assistance, and all the good things that we have online, even games. Games are a whole load of fun, and I have no problem with children and young people playing them. In fact, I think they should be encouraged to play some of the excellent games available online, but we need to be able to keep them safe.

No matter how many staff Ofcom employs to deal with the provisions in the Online Safety Bill, I do not believe it can possibly have the expertise that an advocacy body, specifically one advocating on behalf of child users, could bring to the table in performing scrutiny, working with other organisations, and undertaking risk assessments and child safety duties. If Ofcom is to rely on charities and third sector organisations to do this, it needs to provide funding to them. If it is just going to say, “It’s fine, because we’ll just listen to the NSPCC or Girlguiding,”—or to any of the other organisations bringing concerns forward—there may be a gap without funding for those organisations. Ofcom cannot rely on third sector organisations or place that responsibility on them, because this issue is too important.

The hon. Lady is making some excellent points. I wholeheartedly agree with her about funding for bodies that might be able to support the advocacy body or act as part of it. She makes a really important point, which we have not focused on enough during the debate, about the positive aspects of the internet. It is very easy to get bogged down in all the negative stuff, which a lot of the Bill focuses on, but she is right that the internet provides a safe space, particularly for young people, to seek out their own identity. Does she agree that the new clause is important because it specifically refers to protected characteristics and to the Equality Act 2010? I am not sure where else that appears in the Bill, but it is important that it should be there. We are thinking not just about age, but about gender, disability and sexual orientation, which is why this new clause could be really important.

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

I want briefly to interject to underline the point I made in my intervention on the hon. Member for Worsley and Eccles South. I welcome the discussion about victims’ support, which picks up on what we discussed on clause 110. At that point I mentioned the NSPCC evidence that talked about the importance of third party advocacy services, due to the lack of trust in the platforms, as well as for some of the other reasons that the hon. Members for Worsley and Eccles South, for Batley and Spen, and for Aberdeen North have raised.

When we discussed clause 110, the Minister undertook to think about the issue seriously and to talk to the Treasury about whether funding could be taken directly from fines rather than those all going into the Treasury coffers. I hope the debate on new clause 3 will serve to strengthen his resolve, given the strength of support for such a measure, whether that is through a formal user advocacy service or by using existing organisations. I hope he uses the debate to strengthen his arguments about such a measure with the Treasury.

I will not support the new clause tabled by the hon. Member for Worsley and Eccles South, because I think the Minister has already undertaken to look at this issue. As I say, I hope this discussion strengthens his resolve to do so.

Let me start by stating the fact that this Bill, as drafted, rightly has incredibly strong protections for children. The children’s safety duties that we have already debated are extremely strong. They apply to any platform with significant numbers of children using it and they impose a duty on such companies to protect children from harm. The priority illegal safety duties are listed in schedule 6, on child sexual exploitation and abuse offences—they have their very own schedule because we attach such importance to them. Committee members should be in no doubt that protecting children is at the very heart of the Bill. I hope that has been obvious from the debates we have had.

On children’s ability to raise complaints and seek redress under the Bill, it is worth reminding ourselves of a couple of clauses that we have debated previously, through which we are trying to make sure it is as easy as possible for children to report problematic content or to raise complaints. Members will recall that we debated clause 17. Clause 17(6)(c) allows for

“a parent of, or other adult with responsibility for, a child”

to raise content-reporting claims with users, so that children are not left on their own. We have also been clear under the complaints procedures set out in clause 18(2)(c) that those procedures must be

“easy to access, easy to use (including by children)”.

That is an explicit reference to accessibility for children.

The hon. Member for Aberdeen North has also already referred to the fact that in both the children’s risk assessment duties and the adults’ risk assessment duties people’s characteristics, including whether they are a member of a particular group, have to be taken into account. The children’s risk assessment duties are set out in clause 10(6)(d). Children with particular characteristics—orientation, race and so on—have to be particularly considered. The fact that a clause on the children’s risk assessment duties even exists in the first place shows that specific and special consideration has to be given to children and the risks they face. That is hardwired right into the architecture of the Bill.

All the provisions that I have just mentioned—starting with clause 10 on children’s risk assessment duties, right through to the end of the Bill and the priority offences in schedule 6, on child sexual exploitation and abuse offences—show that, right throughout the whole Bill, the protection of children is integral to what we are trying to do with the Bill.

On the consultation that happened in forming and framing the Bill, really extensive engagement and consultation took place throughout the preparation of this piece of legislation, including direct consultation with children themselves, their parents and the many advocacy groups for children. There should be no doubt at all that children have been thoroughly consulted as the Bill has been prepared.

On the specifics of new clause 3, which relate to advocacy for children, as the hon. Member for Aberdeen North referred to in passing a moment ago, there is a mechanism in clause 140 for organisations that represent particular groups, such as children, to raise super-complaints with Ofcom when there is a problem. In fact, when we debated that clause, I used children as an example when I spoke about the “eligible entities” that can raise super-complaints—I used the NSPCC speaking for children as a specific example of the organisations I would expect the term “eligible entity” to include. Clause 140 explicitly empowers organisations such as the NSPCC and others to speak for children.

Having a statutory organisation to speak for children is such a good idea that it has been done already. In fairness, I should say that it was done by the Labour Government. The Children Act 2004 established the Children’s Commissioner for England, with equivalents for Wales, Scotland and Northern Ireland. The 2004 Act states, right at the start, that the function of the Children’s Commissioner is:

“promoting awareness of the views and interests of children in England.”

So the very first function of the Children’s Commissioner, as stated in the Act, is to act as an advocate for children.

The Act goes on to state that the Children’s Commissioner may encourage persons

“exercising functions or engaged in activities affecting children…to take account of their views and interests”.

Ofcom, in exercising its regulatory functions, is clearly doing precisely what I have just read out. We therefore have a statutory advocacy organisation for children: it is the Children’s Commissioner and it is doing exactly what the hon. Member for Worsley and Eccles South is calling for.

This is a good moment to pay tribute to the current Children’s Commissioner, Dame Rachel de Souza, who gave evidence to the Committee before the Whitsun recess. Dame Rachel is extremely active, energetic and effective at advocating for children in general, but she is particularly active and effective at advocating for children in the digital sphere. I am sure the whole Committee will want to put on record its thanks to our existing statutory advocate, Dame Rachel, who is doing such a good job in that area.

I hope those comments make it clear that we already have a statutory advocate: the Children’s Commissioner. Clause 140 contains facilities for other organisations besides our existing statutory advocate to formally and legally raise with Ofcom issues that may arise. Ofcom is bound to reply—it is not optional. Ofcom has to listen to complaints and it has to respond.

I agree wholeheartedly about the importance of the role of the Children’s Commissioner, and she does a fantastic job, but is it not testament to the fact that there is a need for this advocacy body that she is advocating for it and thinks it is a really good idea? The Children Act 2004 is a fantastic Act, but that was nearly 20 years ago and the world has changed significantly since then. The Bill shows that. The fact that she is advocating for it may suggest that she sees the need for a separate entity.

There is a danger if we over-create statutory bodies with overlapping responsibilities. I just read out the current statutory functions of the Children’s Commissioner under the 2004 Act. If we were to agree to the new clause, we would basically be creating a second statutory advocate or body with duties that are the same as some of those that the Children’s Commissioner already exercises. I read from section 2 of the Act, where those duties are set out. I do not think that having two people with conflicting or competing duties would be particularly helpful.

I am grateful to the Minister for his support for Labour legislation. Does he acknowledge that we have different Children’s Commissioners across the nations of the UK? Each would have the same rights to advocate for children, so we would have four, rather than one focusing on one specific issue, which is what the Children’s Commissioners across the UK are advocating for.

I do not have in front of me the relevant devolved legislation—I have only the Children Act 2004 directly in front of me—but I assume it is broadly similar. The hon. Member for Aberdeen North can correct me if I am wrong, but I assume it is probably broadly similar in the way—[Interruption.] She is not sure, so I do not feel too bad about not being sure either. I imagine it is similar. I am not sure that having similar statutory bodies with the same function—we would create another with the new clause—is necessarily helpful.

The Bill sets out formal processes that allow other organisations, such as the NSPCC, to raise complaints that have to be dealt with. That ensures that the voices of groups—including children, but not just children—will be heard. I suspect that if we have a children’s advocacy body, other groups will want one too and might feel that they have been overlooked by omission.

The good thing about the way the super-complaint structure in clause 140 works is that it does not prescribe what the groups are. Although I am sure that children will be top of the list, there will be other groups that want to advocate and to be able to bring super-complaints. I imagine that women’s groups will be on that list, along with groups advocating for minorities and people with various sexual orientations. Clause 140 is not exclusive; it allows all these groups to have a voice that must be heard. That is why it is so effective.

My right hon. Friend the Member for Basingstoke and the hon. Member for Batley and Spen asked whether the groups have enough resources to advocate on issues under the super-complaint process. That is a fair question. The allocation of funding to different groups tends to be done via the spending review process. Colleagues in other Departments—the Department for Education or, in the case of victims, the Ministry of Justice—allocate quite a lot of money to third-sector groups. A year or two ago, the victims budget was approximately £200 million, and I am told it has risen to £300 million for the current financial year. That is the sort of funding that can find its way into the hands of the organisations that advocate for particular groups of victims. My right hon. Friend asked whether the proceeds of fines could be applied to fund such work, and I have undertaken to raise that with the Treasury.

We already have a statutory advocate for children: the four Children’s Commissioners for the four parts of the United Kingdom. We have the super-complaints process, which covers more than children’s groups, crucial though they are. We have given Ofcom statutory duties to consult when developing its codes of practice, and we have money flowing via the Ministry of Justice, the DfE and others, into advocate groups. Although we agree with the intention behind new clause 3, we believe its objectives are very well covered via the mechanisms that I have just set out at some length.

There have not been all that many times during the debate on the Bill when the Minister has so spectacularly missed the point as he has on this section. I understand everything he said about provisions already being in place to protect children and the provisions regarding the super-complaints, but the new clause is not intended to be a replacement for the super-complaints procedure, which we all support—in fact, we have tried to strengthen that procedure. The new clause is intended to be an addition—another, very important layer.

Unfortunately, I do not have at the front of my mind the legislation that set up the Children’s Commissioner for Scotland, or the one for England. The Minister talked through some of the provisions and phrasing in the Children Act 2004. He said that the role of the Children’s Commissioner for England is to encourage bodies to act positively on behalf of children—to encourage. There is no requirement for the body to act in the way the Children’s Commissioner says it should act. Changes have been made in Wales establishing the Future Generations Commissioner, who has far more power.

As far as I can tell, the user advocacy body proposed in new clause 3 would not have the ability to compel Ofcom either.

But it would be a statutory consultee that is specifically mentioned in this provision. I cannot find in the Bill a provision giving Ofcom a statutory duty to consult the four Children’s Commissioners. The new clause would make the children’s advocacy body a statutory consultee in decisions that affect children.

The Bill will require Ofcom to consult people who represent the interests of children. Although not named, it would be astonishing if the first people on that list were not the four Children’s Commissioners when developing the relevant codes of practice. The statutory obligation to consult those groups when developing codes of practice and, indeed, guidance is set out in clauses 37(6)(d) and 69(3)(d).

That is very helpful, but there are still shortcomings in what the Minister says. The Bill, as drafted, requires Ofcom to require things of other organisations. Some of the detail is in the Bill, some of the detail will come in secondary legislation and some of the detail will come in the codes of practice published by Ofcom. We broadly agree that the Bill will ensure people are safer on the internet than they currently are, but we do not have all the detail on the Government’s intent. We would like more detail on some things, but we are not saying, “We need every little bit of detail.” If we did, the Bill would not be future-proof. We would not be able to change and update the Bill if we required everything to be in the Bill.

The Bill is not a one-off; it will continually change and grow. Having a user advocacy body would mean that emerging threats can quickly be brought to Ofcom’s attention. Unlike the Children’s Commissioners, who have a hundred other things to do, the entire purpose of this body would be to advocate on behalf of children online. The Children’s Commissioners do an amazing job, but this is not their No. 1 priority. If the Minister wants this to be a world-leading Bill, its No. 1 priority should be to protect the human rights of children.

I think the hon. Lady is being a little unfair to the Children’s Commissioners. Dame Rachel de Souza is doing a fantastic job of advocating specifically in the digital sphere. She really is doing a fantastic job, and I say that as a Minister. I would not say she is leaving any gaps.

These digital children’s safety issues link to wider children’s safety issues that exist offline, such as sexual exploitation, grooming and so on, so it is useful that the same person advocates for children in both the offline and online worlds.

The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would specifically give the benefit of its experience and specifically use its resources, time and energy to advocate between Ofcom, children and children’s organisations and groups.

The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.

I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money only goes to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. It is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.

I wholeheartedly agree with what the hon. Member for Aberdeen North just said, but I wish to emphasise some elements because it seems to me that the Minister was not listening, although he has listened to much that has been said. I made some specific points, used quotes and brought forward some evidence. He feels that children have been consulted in the drafting of the Bill; I cited a YoungMinds survey that showed that that was very much not what young people feel. YoungMinds surveyed a large group of young people and a very large proportion of them had not even heard of the Bill.

The evidence of the young survivor of online grooming was very powerful. She very much wanted a user-advocacy body and spoke strongly about that. The Minister is getting it wrong if he thinks that somebody in that situation, who has been groomed, would go to a parent. The quote that I cited earlier was:

“Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side.”

There were clearly adults in her life she could have gone to, but she did not because she was in that vulnerable position—a position of weakness. That is why some kind of independent advocacy body for children is so important.

I do not think children and young people do feel consulted about the Bill because the organisations and charities are telling us that. I join all Opposition Members in supporting and paying tribute to the remarkable job that the Children’s Commissioner does. I quoted her setting out her worries about the Bill. I quoted her saying that

“the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

That is what she said. She did not say, “I’m the person charged with doing this. I’m the person who has the resource and my office has the resource.”

I hope that I did not in any way confuse the debate earlier, because these two things are very separate. The idea of a user-advocacy service and individual victim support are two separate issues. The Minister has already taken up the issue of victim support, which is what the Children’s Commissioner was talking about, but that is separate from advocacy, which is much broader and not necessarily related to an individual problem.

Indeed, but the Children’s Commissioner was very clear about certain elements being missing in the Bill, as is the NSPCC and other organisations. It is just not right for the Minister to land it back with the Children’s Commissioner as part of her role, because she has to do so many other things. The provisions in the Bill in respect of a parent or adult assisting a young person in a grooming situation are a very big concern. The Children’s Commissioner cited her own survey of 2,000 children, a large proportion of whom had not succeeded in getting content about themselves removed. From that, we see that she understands that the problem exists. We will push the new clause to a Division.

Question put, That the clause be read a Second time.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Adjourned till Tuesday 28 June at twenty-five minutes past Nine o’clock.

Written evidence reported to the House

OSB87 Glassdoor, Inc

OSB88 Open Rights Group