Report (5th Day)
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee, 15th Report from the Constitution Committee. Scottish and Welsh Legislative Consent granted.
Amendment 236C
Moved by
236C: After Clause 194, insert the following new Clause—
“Power to impose duty about alternative dispute resolution procedure
(1) The Secretary of State may by regulations amend this Act for or in connection with the imposition on providers of Category 1 services of an ADR duty.
(2) An “ADR duty”—
(a) is a duty requiring providers of Category 1 services to arrange for and engage in an alternative dispute resolution procedure in specified circumstances for the resolution of disputes about their handling of relevant complaints, and
(b) may include a duty requiring such providers to meet the costs incurred by any other person in using a dispute resolution procedure which is so arranged.
(3) Complaints are “relevant” for the purposes of subsection (2)(a) if they—
(a) relate to a Category 1 service,
(b) are of a specified kind, and
(c) are made by persons of a specified kind.
(4) Regulations under this section may not be made before the publication of a statement by the Secretary of State responding to OFCOM’s report under section (OFCOM’s report about reporting and complaints procedures) (report about reporting and complaints procedures in use by providers of Part 3 services: see subsection (10) of that section).
(5) Before making regulations under this section the Secretary of State must consult—
(a) OFCOM,
(b) the Information Commissioner, and
(c) such other persons as the Secretary of State considers appropriate.
(6) If the power conferred by subsection (1) is exercised, the first regulations made under the power must—
(a) require the use of a dispute resolution procedure which is impartial, and
(b) prohibit the use of a dispute resolution procedure which restricts or excludes the availability of civil proceedings.
(7) Provision made by regulations under this section may have the effect that the duties set out in any or all of sections 17, 18 and 19 which apply in relation to duties imposed by other provisions of Chapter 2 of Part 3 are also to apply in relation to the ADR duty, and accordingly the regulations may amend—
(a) section 17(6),
(b) the definition of “safety measures and policies” in section 18(8), or
(c) the definition of “relevant duties” in section 19(10).
(8) The provisions of this Act that may be amended by the regulations in connection with the imposition of the ADR duty include, but are not limited to, the following provisions (in addition to those mentioned in subsection (7))—
(a) section 6(5),
(b) section 94(12)(a), and
(c) section 120(2).
(9) If the power conferred by subsection (1) is exercised, the first regulations made under the power must require OFCOM to—
(a) produce and publish guidance for providers of Category 1 services to assist them in complying with the ADR duty, and
(b) consult the Secretary of State, the Information Commissioner and such other persons as OFCOM consider appropriate before producing the guidance.
(10) Section 184(1) applies for the purposes of the references to Category 1 services in this section.
(11) In this section “specified” means specified in regulations under this section.
(12) For the meaning of “Category 1 service”, see section 86 (register of categories of services).”
Member’s explanatory statement
This amendment provides that the Secretary of State may make regulations amending this Bill so as to impose a new duty on providers of Category 1 services to arrange for and engage in an out-of-court, impartial dispute resolution procedure. The regulations may not be made until the Secretary of State has responded to OFCOM’s report about content reporting and complaints procedures under the new clause proposed to be inserted after Clause 147 in my name.
My Lords, the government amendments in this group relate to content reporting and complaints procedures. The Bill’s existing duties on each of these topics are a major step forward and will provide users with effective methods of redress. There will now be an enforceable duty on Part 3 services to offer accessible, transparent and easy-to-use complaints procedures. This is an important and significant change from which users and others will benefit directly.
Furthermore, Part 3 services’ complaints procedures will be required to provide for appropriate action to be taken in response to complaints. The duties here will fundamentally alter how complaints systems are operated by services, and providers will have to make sure that their systems are up to scratch. If services do not comply with their duties, they will face strong enforcement measures.
However, we have listened to concerns raised by your Lordships and others, and share the desire to ensure that complaints are handled effectively. That is why we have tabled Amendments 272AA and 274AA, to ensure that the Bill’s provisions in this area are the subject of a report to be published by Ofcom within two years of commencement.
Amendment 272AA places a requirement on Ofcom to undertake a report about Part 3 services’ reporting and complaints procedures. The report will assess the measures taken or in use by providers of Part 3 services to enable users and others to report content and make complaints. In assessing the content reporting and complaints measures in place, the report must take into account users’ and others’ experiences of those procedures—including how easy to use and clear they are for reporting content and making complaints, and whether providers are taking appropriate and timely action in response.
In this report, Ofcom must provide advice to the Secretary of State about whether she should use her power set out in Amendment 236C to make regulations imposing an alternative dispute resolution duty on category 1 services. Ofcom may also make wider recommendations about how the complaints and user redress provisions can be strengthened, and how users’ experiences with regard to complaints can be improved more broadly. Amendment 274AA is a consequential amendment ensuring that the usual confidentiality provisions apply to matters contained in that report.
These changes will ensure that the effectiveness of the Bill’s content reporting and complaints provisions can be thoroughly assessed by Ofcom two years after the commencement of the provision, providing time for the relevant reporting and complaints procedures to bed in.
Amendment 236C then provides that the Secretary of State will have a power to make regulations to amend the Act in order to impose an alternative dispute resolution duty on providers of category 1 services. This power can be used after the Secretary of State has published a statement in response to Ofcom’s report. This enables the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure in respect of complaints. This means that, if the Bill’s existing user redress provisions are found to be insufficient, this requirement can quickly be imposed to strengthen the Bill.
This responds directly to concerns which noble Lords raised about cases where users or parents may feel that they have nowhere to turn if they are dissatisfied with a service’s response to their complaint. We believe that the existing provisions will remedy this, but, if they do not, these new requirements will ensure that there is an impartial, alternative dispute resolution procedure which will work towards the effective resolution of the complaint between the service and the complainant.
At the same time, it will avoid creating a single ombudsman, person or body which may be overwhelmed either through the volume of complaints from multiple services or by the complexity of applying such disparate services’ varying terms of service. Instead, if required, this power will put the onus on the provider to arrange for and engage in an impartial dispute resolution procedure.
Amendment 237DA requires that, if regulations are made requiring category 1 services to offer an alternative dispute resolution procedure, such regulations must be subject to the affirmative parliamentary procedure. This ensures that Parliament will continue to have oversight of this process.
I hope that noble Lords are reassured that the Bill not only requires services to provide users and others with effective forms of redress but that these further amendments will ensure that the Bill’s provisions in this area will be thoroughly reviewed and that action can be taken quickly if it is needed. I beg to move.
My Lords, I am grateful to hear what the Minister has just announced. The scheme that was originally prefigured in the pre-legislative scrutiny report has now got some chance of being delivered. I think the process and procedures are quite appropriate; it does need review and thought. There needs to be account taken of practice on the ground, how people have found the new system is working, and whether or not there are gaps that can be filled this way. I give my full support to the proposal, and I am very glad to see it.
Having got to the Dispatch Box early, I will just appeal to our small but very important group. We are on the last day on Report. We are reaching a number of issues where lots of debate has taken place in Committee. I think it would be quite a nice surprise for us all if we were to get through this quickly. The only way to do that is by restricting our contributions.
My Lords, I will speak briefly to Amendments 272AA and 274AA, only because at the previous stage of the Bill I tabled amendments related to the reporting of illegal content and fraudulent advertisements, covering reporting, complaints and transparency. I have not re-tabled them here, but I have had conversations with my noble friend the Minister. It is still unclear to those in the House and outside why the provisions relating to that type of reporting would not apply to fraudulent advertisements, particularly given that the more information that can be filed about those types of scams and fraudulent advertisements, the easier it would be for the platforms to gather information and help users and others to start to crack down on them. I wonder if, when he sums up, my noble friend could say something about the reporting provisions relating to fraudulent advertisements generally, and in particular about general reporting and reporting relating to complaints by users.
My Lords, I am mindful of the comments of the noble Lord, Lord Stevenson, to be brief. I add a note of welcome to the mechanism that has been set out.
In this legislation, we are initiating a fundamental change to the way in which category 1 providers will run their reporting systems, in that prior to this they have not had any external oversight. Ofcom’s intervention will be material, given that online service providers will have to explain to Ofcom what they are doing and why.
We should note that we are also asking providers to do some novel prioritisation. The critical thing with all these reporting systems is that they operate at such huge volumes. I will not labour the points, but if noble Lords are interested they can look at the Meta and YouTube transparency reports, where it is explained that they are actioning tens of millions of pieces of content each month, on the basis of hundreds of millions of reports. If you get even 1% of 10 million reports wrong, that is 100,000 errors. We should have in mind the scale we are operating at. Ofcom will not be able to look at each one of those, but I think it will be able to produce a valuable system and make sure that quality control is improved across those systems, working with the providers. Having additional powers to create an alternative dispute resolution mechanism where one does not exist and would prove to be useful is helpful. However, the slow and steady approach of seeing what will happen with those systems under Ofcom supervision before jumping into the next stage is right.
I also note that we are asking platforms to do some prioritisation in the rest of the Online Safety Bill. For example, we are saying that we wish journalistic and politician content to be treated differently from ordinary user content. All of those systems need to be bedded in, so it makes sense to do it at a reasonable pace.
I know that the noble Baroness, Lady Newlove, who cannot be here today, was also very interested in this area and wanted to make sure we made the point that the fact there is a reasonable timescale for the review does not mean that we should take our foot off the pedal now for our expectations of category 1 service providers. I think I heard that from the Minister, but it would be helpful for him to repeat it. We will be asking Ofcom to keep the pressure on to get these systems right now, and not just wait until it has done the report and then seek improvements at that stage. With that—having been about as brief as I can be—I will sit down.
My Lords, I promise I will be brief. I, too, welcome what the Minister has said and the amendments that the Government have proposed. This is the full package which we have been seeking in a number of areas, so I am very pleased to see it. My noble friend Lady Newlove and the noble Baroness, Lady Kidron, are not in their places, but I know I speak for both of them in wanting to register that, although the thoughtful, slow-and-steady approach has some benefits, there are also some real costs to it. The UK Safer Internet Centre estimates that some 340,000 individuals in the UK will have no recourse for action if a platform’s complaints mechanism does not work for them in the next two years. That is quite a large number of people, so I have one very simple question for the Minister: if I have exhausted the complaints procedure with an existing platform in the next two years, where do I go? I cannot go to Ofcom. My noble friend Lord Grade was very clear in front of the committee I sit on that it is not Ofcom’s job. Where do I go if I have a complaint that I cannot get resolved in the next two years?
My Lords, I declare an interest as chair of Trust Alliance Group, which operates the energy and communications ombudsman schemes, so I have a particular interest in the operation of these ADR schemes. I thank the Minister for the flexibility that he has shown in the provision about the report by Ofcom and in having backstop powers for the Secretary of State to introduce such a scheme.
Of course, I understand that the noble Baroness, Lady Newlove, and the UK Safer Internet Centre are very disappointed that this is not going to come into effect immediately, but there are advantages in not setting out the scheme at this very early point, before we know what some of the arising issues are. I believe that Ofcom will definitely want to institute such a scheme, but it may be that, in the initial stages, working out the exact architecture is going to be necessary. Of course, I would have preferred a mandated scheme, in the sense that the report will look not at the “whether” but the “how”, but I believe that at the end of the day it will be absolutely obvious that there needs to be such an ADR scheme in order to provide the kind of redress the noble Baroness, Lady Harding, was talking about.
I also agree with the noble Baroness, Lady Morgan, that the kinds of complaints this would cover should include fraudulent adverts. I very much hope that the Minister will be able to answer the questions that both noble Baronesses asked. As my noble friend said, will he reassure us that the department and Ofcom will not take their foot off the pedal, whatever the Bill may say?
I am grateful to noble Lords for their warm support and for heeding the advice of the noble Lord, Lord Stevenson, on brevity. We must finish our Report today. The noble Lord, Lord Allan, is right to mention my noble friend Lady Newlove, who I have spoken to about this issue, as well as the noble Lord, Lord Russell of Liverpool, who has raised some questions here.
Alongside the strong duties on services to offer content reporting and complaints procedures, our amendments will ensure that the effectiveness of these provisions can be reviewed after they have had sufficient time to bed in. The noble Lord, Lord Allan, asked about timing in more detail. Ofcom must publish the report within the two-year period beginning on the day on which the provision comes into force. That will allow time for the regime to bed in before the report takes place, ensuring that its conclusions are informed by how the procedures work in practice. If necessary, our amendments will allow the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure, providing the further strengthening which I outlined in opening.
I can reassure my noble friend Lady Morgan of Cotes that reporting mechanisms to facilitate providers’ removal of fraudulent advertisements are exactly the kinds of issues that Ofcom’s codes of practice will cover, subject to consultation and due process. As companies have duties to remove fraudulent advertising once they are alerted to it, we expect platforms will need the necessary systems and processes in place to enable users to report fraudulent adverts so that providers can remove them.
My noble friend Lady Harding asked the question which was posed a lot in Committee about where one goes if all avenues are exhausted. We have added further avenues for people to seek redress if they do not get it but, as I said in Committee, the changes that we are bringing in through this Bill will mark a significant change for people. Rather than focusing on the even-further-diminished possibility of their not having their complaints adequately addressed through the additional amendments we are bringing today, I hope she will see the provisions in the Bill and in these amendments as bringing in the change we all want to see to improve users’ safety online.
Amendment 236C agreed.
Amendment 237
Moved by
237: After Clause 195, insert the following new Clause—
“Powers to amend sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”)
(1) The Secretary of State may by regulations amend—
(a) section (“Primary priority content that is harmful to children”) (primary priority content that is harmful to children);
(b) section (“Priority content that is harmful to children”) (priority content that is harmful to children).
But the power to add a kind of content is limited by subsections (2) to (4).
(2) A kind of content may be added to section (“Primary priority content that is harmful to children”) only if the Secretary of State considers that, in relation to Part 3 services—
(a) there is a material risk of significant harm to an appreciable number of children presented by content of that kind that is regulated user-generated content or search content, and
(b) it is appropriate for the duties set out in sections 11(3)(a) and 25(3)(a) (duty in relation to children of all ages) to apply in relation to content of that kind.
(3) A kind of content may be added to section (“Priority content that is harmful to children”) only if the Secretary of State considers that, in relation to Part 3 services, there is a material risk of significant harm to an appreciable number of children presented by content of that kind that is regulated user-generated content or search content.
(4) A kind of content may not be added to section (“Primary priority content that is harmful to children”) or (“Priority content that is harmful to children”) if the risk of harm presented by content of that kind flows from—
(a) the content’s potential financial impact,
(b) the safety or quality of goods featured in the content, or
(c) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).
(5) The Secretary of State must consult OFCOM before making regulations under this section.
(6) In this section references to children are to children in the United Kingdom.
(7) In this section—
“regulated user-generated content” has the same meaning as in Part 3 (see section 49);
“search content” has the same meaning as in Part 3 (see section 51).”
Member’s explanatory statement
This amendment gives power for the Secretary of State to make regulations changing the kinds of content that count as primary priority content and priority content harmful to children, subject to certain constraints set out in the Clause.
Amendment 237 agreed.
Amendment 237ZA not moved.
Clause 200: Regulations: general
Amendment 237A
Moved by
237A: Clause 200, page 168, line 5, after “State” insert “or OFCOM”
Member’s explanatory statement
This amendment has the effect that regulations made by OFCOM under the Bill must be made by statutory instrument.
My Lords, Amendments 238A and 238D seek to change the parliamentary process for laying—oh, I am skipping ahead with final day of Report enthusiasm.
As noble Lords know, companies will fund the costs of Ofcom’s online safety functions through annual fees. This means that the regime which the Bill ushers in will be cost neutral to the taxpayer. Once the fee regime is operational, regulated providers with revenue at or above a set threshold will be required to notify Ofcom and to pay a proportionate fee. Ofcom will calculate fees with reference to the provider’s qualifying worldwide revenue.
The Delegated Powers and Regulatory Reform Committee of your Lordships’ House has made two recommendations relating to the fee regime, which we have accepted; the amendments we are discussing in this group reflect this. We are also making an additional change to definitions to ensure that Ofcom can collect proportionate fees.
A number of the amendments in my name relate to qualifying worldwide revenue. Presently, the Bill outlines that this should be defined in a published statement laid before Parliament. Your Lordships’ committee advised that it should be defined through regulations subject to the affirmative procedure. We have agreed with this and are proposing changes to Clause 76 so that Ofcom can make provisions about qualifying worldwide revenue by regulations which, as per the committee’s recommendations, will be subject to the affirmative procedure.
Secondly, the committee recommended that we change the method by which the revenue threshold is defined. Presently, as set out in the Bill, it is set by the Secretary of State in a published statement laid before Parliament. The committee recommended that the threshold be set through regulations subject to the negative procedure and we are amending Clause 77 to make the recommended change.
Other amendments seek to make a further change to enable Ofcom to collect proportionate fees from providers. A provider of a regulated service whose qualifying worldwide revenue is equal to, or greater than, the financial threshold will be required to notify Ofcom and pay an annual fee, calculated by reference to its qualifying worldwide revenue. Currently, that fee calculation can be based only on the revenue of the regulated provider. The structure of some technology companies, however, means that how they accrue revenue is not always straightforward. The entity which meets the definition of a provider may therefore not be the entity which generates revenue referable to the regulated service.
Regulations to be made by Ofcom about the qualifying worldwide revenue will therefore be able to provide that the revenue accruing to certain entities in the same group as a provider of a regulated service can be taken into account for the purposes of determining qualifying worldwide revenue. This will enable Ofcom, when making such regulations, to make provisions, if necessary, to account for instances where a provider has a complex group structure; for example, where the regulated provider might accrue only a portion of the revenue referable to the regulated service, the rest of which might be accrued by other entities in the group’s structure. These amendments to Clause 76 address these issues by allowing Ofcom to make regulations which provide that the revenue from certain other entities within the provider’s group structure can be taken into account. I beg to move.
My Lords, we have not talked much about fees in our consideration of the Bill, and I will not talk much about them today, but there are some important questions. We should not skip too lightly over the fact that we will be levying revenues from online providers, which might have a significant impact on the markets. I have some specific questions about this proposed worldwide revenue method, but I welcome these amendments and the fact that we will now be getting a better procedure. This will also allow the Minister to say, “All these detailed points can be addressed when these instruments come before Parliament”. That is a good development. However, there are three questions that are worth putting on the record now so that we have time to think about them.
First, what consideration will be given to the impact on services that do not follow a classic revenue model but instead rely on donations and other sorts of support? I know that we will come back to this question in a later group, but there are some very large internet service providers that do not follow the classic advertising-funded model, relying instead on foundations and other sources. They will have significant questions about what we would judge their qualifying worldwide revenue to be, given that they operate on these very different models.
The second question concerns the impact on services that may have a very large footprint outside the UK, and significant worldwide revenues, but which do very little business within the UK. The amendment that the Minister has tabled about group revenues is also relevant here. You can imagine an entity which may be part of a very large worldwide group making very significant revenues around the world. It has a relatively small subsidiary that is offering a service in the UK, with relatively low revenues. There are some important questions there around the potential impact of the fees on decision-making within that group. We have discussed how we do not want to end up with less choice for consumers of services in the UK. There is an interesting question there as to whether getting the fee level wrong might lead to worldwide entities saying, “If you’re going to ask me to pay a fee based on my qualifying worldwide revenue, the UK market is just not worth it”. That may be particularly true if, for example, the European Union and other markets are also levying a fee. You can see a rational business choice of, “We’re happy to pay the fee to the EU but not to Ofcom if it is levied at a rate that is disproportionate to the business that we do here”.
The third and very topical question is about the Government’s thinking about services with declining revenues but whose safety needs are not reducing and may even be increasing. I hope as I say this that people have Twitter in mind, which has very publicly told us that its revenue is going down significantly. It has also very publicly fired most of its trust and safety staff. You can imagine a model within which, because its revenue is declining, it is paying less to Ofcom precisely when Ofcom needs to do more supervision of it.
I hope that we can get some clarity around the Government’s intentions in these circumstances. I have referenced three areas where the qualifying worldwide revenue calculation may go a little awry. The first is where the revenue is not classic commercial income but comes from other sources. The second is where the footprint in the UK is very small but the company is otherwise a large global one which we might worry will withdraw from the market. The third, and perhaps most important, is what the Government intend where a company’s revenue is declining while it is managing its platform less well and its need for Ofcom supervision increases, and what we would expect to happen to the fee level in those circumstances.
My Lords, there is very little to add to that. These are important questions. I simply was struck by the thought that the amount of work, effort and thought that has gone into this should not be kept within this Bill. I wonder whether the noble Lord has thought of offering his services to His Majesty’s Treasury, which has difficulty in raising tax from these companies. It would be nice to see that problem resolved.
I am looking forward to returning to arts and heritage; I will leave that to my noble friend Lady Penn.
The noble Lord, Lord Allan, asked some good questions. He is right: the provisions and the parliamentary scrutiny allow for the flexibility for all these things to be looked at and scrutinised in the way that he set out. I stress that the fee regime is designed to be fair to industry; that is central to the approach we have taken. The Bill stipulates that Ofcom must charge only proportionate and justifiable fees to industry. The provisions that Ofcom can make via regulation about the qualifying worldwide revenue aim to ensure that fees are truly representative of the revenue relating to the regulated service and that they will encourage financial transparency. They also aim to aid companies with complex structures which would otherwise struggle to segregate revenues attributable to the provider and its connected entities.
The revenue of the group undertaking can be considered in scope of a provider’s qualifying worldwide revenue if the entity was a member of the provider’s group during any part of the qualifying period and the entity receives during that period any amount referable to a regulated service. The regulations provide Ofcom with a degree of flexibility as to whether or not to make such provisions, because Ofcom will aim to keep the qualifying worldwide revenue simple.
I am grateful for noble Lords’ support for the amendments and believe that they will help Ofcom and the Government to structure a fair and transparent fee regime which charges proportionate fees to fund the cost of the regulatory regime that the Bill brings in.
Amendment 237A agreed.
Amendment 237B
Moved by
237B: Clause 200, page 168, line 6, at end insert—
“(3A) The Statutory Instruments Act 1946 applies in relation to OFCOM’s powers to make regulations under this Act as if OFCOM were a Minister of the Crown.
(3B) The Documentary Evidence Act 1868 (proof of orders and regulations etc) has effect as if—
(a) OFCOM were included in the first column of the Schedule to that Act;
(b) OFCOM and persons authorised to act on their behalf were mentioned in the second column of that Schedule.”
Member’s explanatory statement
This amendment makes technical provision in relation to regulations made by OFCOM under the Bill.
Amendment 237B agreed.
Clause 201: Parliamentary procedure for regulations
Amendments 237C to 237DA
Moved by
237C: Clause 201, page 168, line 11, at end insert—
“(aa) regulations under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(1),”Member’s explanatory statement
This amendment provides that regulations made by OFCOM under subsection (1) of the new Clause 76 proposed in my name regarding “qualifying worldwide revenue” etc are subject to the affirmative Parliamentary procedure.
237D: Clause 201, page 168, line 14, at end insert—
“(da) regulations under section (Power to regulate app stores)(1),”Member’s explanatory statement
This amendment provides that regulations made under the new Clause proposed in my name after Clause 194 are subject to the affirmative Parliamentary procedure.
237DA: Clause 201, page 168, line 14, at end insert—
“(da) regulations under section (Power to impose duty about alternative dispute resolution procedure)(1),”Member’s explanatory statement
This amendment provides that regulations made under the new Clause proposed to be inserted in my name after Clause 194, concerning regulations to impose a duty on providers of Category 1 services about using an alternative dispute resolution procedure, are subject to the affirmative Parliamentary procedure.
Amendments 237C to 237DA agreed.
Amendment 237DB not moved.
Amendments 237E and 238
Moved by
237E: Clause 201, page 168, line 23, at end insert—
“(m) regulations under paragraph 5(9) of Schedule 13,”Member’s explanatory statement
This amendment provides that regulations made by OFCOM under paragraph 5(9) of Schedule 13 regarding “qualifying worldwide revenue” etc for the purposes of that paragraph are subject to the affirmative Parliamentary procedure.
238: Clause 201, page 168, line 26, leave out “54(2) or (3)” and insert “(Powers to amend sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”))(1)”
Member’s explanatory statement
This amendment ensures that regulations made under the new Clause proposed to be inserted after Clause 195 in my name are subject to the affirmative procedure, except in cases of urgency.
Amendments 237E and 238 agreed.
Amendment 238A
Moved by
238A: Clause 201, page 169, line 3, at end insert—
“(7A) A statutory instrument containing the first regulations under paragraph 1(1) of Schedule 11 (whether alone or with regulations under paragraph 1(2) or (3) of that Schedule) may not be made unless a draft of the instrument has been laid before, and approved by a resolution of, each House of Parliament.
(7B) Any other statutory instrument containing regulations under paragraph 1(1) of Schedule 11 is subject to annulment in pursuance of a resolution of either House of Parliament.”
Member’s explanatory statement
This amendment provides that the first regulations made under paragraph 1(1) of Schedule 11 (regulations specifying Category 1 threshold conditions) are subject to the affirmative Parliamentary procedure.
My Lords, as I was eagerly anticipating, government Amendments 238A and 238D seek to change the parliamentary process for laying the first regulations specifying the category 1 threshold conditions from the negative to the affirmative procedure. I am pleased to bring forward this change in response to the recommendation of your Lordships’ Delegated Powers and Regulatory Reform Committee.
The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.
Category 2A services will have only additional transparency and fraudulent advertising duties, and category 2B services will be subject only to additional transparency reporting duties. The burden of these duties is significantly less than the additional category 1 duties, and we have therefore retained the use of the negative resolution procedure for these regulations, as they require less parliamentary scrutiny.
Future changes to the category 1 threshold conditions will also use the negative procedure. This will ensure that the regime remains agile in responding to change, which I know was of particular concern to noble Lords when we debated the categorisation group in Committee. Keeping the negative procedure for such subsequent uses will avoid the risk of future changes being subject to delays because of parliamentary scheduling. I beg to move.
My Lords, I shall speak to Amendment 245. I would like to thank my noble friend the Minister, and also the Minister on leave, for the conversations that I have had with them about this amendment and related issues. As we have already heard, the platform categorisation is extremely important. So far, much of it is unknown, including which sites are actually going to be in which categories. For example, we have not yet seen any proposed secondary regulations. As my noble friend has just outlined, special duties apply, especially for those sites within category 1—user empowerment in particular, but also other duties relating to content and fraudulent advertisements.
Clause 85 and Schedule 11 set out the thresholds for determining which sites will be in category 1, category 2A or category 2B. I am very mindful of the exhortation of the noble Lord, Lord Stevenson, about being brief, but it is amazing how much you have to say about one word to explain this amendment. This amendment proposes to change an “and” to an “or” in relation to determining which sites would fall within category 1. It would move from a test of size “and” functionality to a test of size “or” functionality. This would give Ofcom more flexibility to decide which platforms really need category 1 designation. Category 1 should not be decided just on size; it should also be possible to determine it on the basis of functionality.
Functionality is defined in the Bill in Clause 208. We will get to those amendments shortly, but there is no doubt from what the Government have already conceded, or agreed with those of us who have been campaigning passionately on the Bill for a number of years, that functionality can make a platform harmful. It is perfectly possible to have small platforms that both carry highly harmful content and themselves become harmful in the way that they are designed. We have heard many examples and I will not detain the House with them, but I draw attention to two particular sites which capture how broad this is. The perpetrators of offline hate crimes are often linked to these small platforms. For example, the perpetrator of the 2018 Tree of Life synagogue mass shooting had an online presence on the right-wing extremist social network Gab. In the UK, Jake Davison, the self-proclaimed incel who killed five people in Plymouth in 2021, frequented smaller incel forums after he was banned from Reddit in the days leading up to the mass shooting.
I also want to share with noble Lords an email that I received just this week from a family who had been to see their Member of Parliament, Matt Rodda MP, and also the noble Baroness, Lady Kidron, who I know is very regretful that she cannot be here today. I thank Victoria and Jean Eustace for sharing the story of their sister and daughter. Victoria wrote: “I am writing to you regarding the Online Safety Bill, as my family and I are concerned it will not sufficiently protect vulnerable adults from harm. My sister, Zoe Lyalle, killed herself on 26 May 2020, having been pointed towards a method using an online forum called Sanctioned Suicide. Zoe was 18 years old at the time of her death and as such technically an adult, but she was autistic, so she was emotionally less mature than many 18-year-olds. She found it difficult to critically analyse written content”. She says that “The forum in question is not large and states on its face that it does not encourage suicide, although its content does just that”. The next part I was even more shocked about: “Since Zoe’s death, we have accessed her email account. The forum continues to email Zoe, providing her with updates on content she may have missed while away from the site, as well as requesting donations. One recent email included a link to a thread on the forum containing tips on how best to use the precise method that Zoe had employed”.
In her note to me, the Minister on leave said that she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled.
It is also perfectly possible for us to have large but safe platforms. I know that my noble friend Lord Moylan may want to discuss this in relation to sites that he has talked about already on this Bill. The risk of the current drafting is a flight of users from these large platforms, newly categorised as category 1, to the small, non-category 1 platforms. What if a platform becomes extremely harmful very quickly? How will it be recategorised speedily but fairly, and with parliamentary oversight?
The Government have run a variety of arguments as to why the “and” in the Bill should not become an “or”. They say that it creates legal uncertainty. Every Bill creates legal uncertainty; that is why we have an army of extremely highly paid lawyers, not just in this country but around the world. They say that what we are talking about is broader than illegal content or content related to children’s safety, but they have already accepted an earlier amendment on safety by design and, in subsections (10) to (12) of Clause 12, that specific extra protections should be available for content related to
“suicide or an act of deliberate self-injury, or … an eating disorder or behaviours associated with an eating disorder”
or abusive content relating to race, religion, sex, sexual orientation, disability or gender reassignment and that:
“Content is within this subsection if it incites hatred against people”.
The Government have already breached some of their own limits on content that is not just illegal or relates to child safety duties. In fact, they have agreed that that content should have enhanced triple-shield protection.
The Government have also said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about. The Minister may also offer me a review of categorisation; however, it is a review of the tiers of categorisation and not the sites within the categories, which I think many of us will have views on over the years.
I come to what we should do on this final day of Report. I am very thankful to those who have had many conversations on this, but there is a fundamental difference of opinion in this House on these matters. We will talk about functionality shortly and I am mindful of the pre-legislative scrutiny committee’s recommendation that this legislation should adopt
“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.
There should be other factors. Ofcom should have the ability to decide whether it takes one factor or another, rather than having a series of thresholds that must all be passed, to give it the maximum flexibility. I will listen very carefully to what my noble friend the Minister and other noble Lords say, but at this moment I intend to test the opinion of the House on this amendment.
My Lords, I strongly support Amendment 245. The noble Baroness, Lady Morgan of Cotes, has explained the nub of the problem we are facing—that size and functionality are quite separate. You can have large sites that perform a major social function and are extremely useful across society. Counter to that, you can have a small site focused on being very harmful to a small group of people. The problem is that, without providing the flexibility to Ofcom to determine how the risk assessment should be conducted, the Bill would lock it into leaving these small, very harmful platforms able to pursue their potentially ever-increasingly harmful activities almost out of sight. It does nothing to make sure that their risk assessments are appropriate.
We have already discussed the need to future-proof the Bill and I have tried to lay some amendments to that effect which the Government have not accepted. I hope that they will accept this amendment because this one change of wording would allow the flexibility that could provide a degree of future-proofing that is not provided otherwise within the Bill.
The amendment does not remove the sites completely. Those sites promoting suicide, serious self-harm and other harms across society will still continue, but because they could be captured and required to look at their risk assessments, their activities will perhaps at least be curtailed and, to a certain extent, regulated. It seems that the amendment simply provides a level playing field on the core issue of safety, which has been a theme we have addressed right through the Bill. I hope the Minister will accept the amendment as it is; one change of wording could allow Ofcom to do its job so much better. If he does not, I hope the amendment will be strongly supported by all sides of the House.
My Lords, I am pleased to follow the noble Baroness, Lady Morgan of Cotes, and her amendment, which tries to help parliamentary counsel draft better regulations later on. I am really struggling to see why the Government want to resist something that will make their life easier if they are going to do what we want them to do, which is to catch those high-risk services—as the noble Baroness, Lady Finlay, set out—but also, as we have discussed in Committee and on Report, exclude the low-risk services that have been named, such as Wikipedia and OpenStreetMap.
I asked the Minister on Report how that might happen, and he confirmed that such services are not automatically exempt from the user-to-user services regulations, but he also confirmed that they might be under the subsequent regulations drafted under Schedule 11. That is precisely why we are coming back to this today; we want to make sure that they can be exempt under the regulations drafted under Schedule 11. The test should be: would that be easier under the amended version proposed by the noble Baroness, Lady Morgan, or under the original version? I think it would be easier under the amended version. The political intent should be there to exclude the low-risk services that I have talked about, because Ofcom should not be wasting time, in effect, supervising services that do not present a risk, nor creating a supervisory model that may end up driving those services out of the UK market because they cannot legally make the kind of commitments Ofcom would expect them to make. Having two different thresholds, size and functionality, gives the draftspeople the widest possible choice. By saying “or”, we are not saying that they cannot set a condition that uses “and”, but “and” does exclude “or”, if I can put it that way. They can come back with a schedule that says, “You must be of this size and have this kind of functionality”, or they could say “this functionality on its own”, to the point made by the two noble Baronesses about some sites. They might say, “Look, there is functionality which is always so high-risk that we do not care what size you are; if you’ve got this functionality, you’re always going to be in”. Again, the rules as drafted at the moment would not allow them to do that; they would have to say, “You need to have this functionality and be of this size. Oh, whoops, by saying that you have to be of this size, we’ve now accidentally caught somebody else who we did not intend to catch”.
I look forward to the Minister’s response, but it seems entirely sensible that we have the widest possible choice. When we come to consider this categorisation under Schedule 11 later on, the draftspeople should be able to say either “You must be this size and have this functionality”, or “If you’ve got this functionality, you’re always in”, or “If you’re of this size, you’re always in”, and have the widest possible menu of choices. That will achieve the twin objectives which I think everyone who has taken part in the debate wants: the inclusion of high-risk services, no matter their size, and the exclusion of low-risk services, no matter their size—if they are genuinely low risk. That applies particularly to the services we have discussed, for which the noble Lord, Lord Moylan, has been a very strong advocate. In trying to do good, we should not end up inadvertently shutting down important information services that people in this country rely on. Frankly, people would not understand it if we said, “In the name of online safety, we’ve now made it so that you cannot access an online encyclopaedia or a map”.
It is going to be much harder for the draftspeople to draft categorisation under Schedule 11, as it is currently worded, that has the effect of being able to exclude low-risk services. The risk of their inadvertently including them and causing that problem is that much higher. The noble Baroness is giving us a way out and I hope the Minister will stand up and grab the lifeline. I suspect he will not.
My Lords, I welcome the Minister’s Amendment 238A, which I think was in response to the DPRRC report. The sentiment around the House is absolutely clear about the noble Baroness’s Amendment 245. Indeed, she made the case conclusively for categorisation on the basis of risk. She highlighted Zoe’s experience, and I struggle to understand why the Secretary of State is resisting the argument. She knocked down the ninepins of legal uncertainty, and of the objection that this goes broader than children’s safety and illegality, by reference to Clause 12. The noble Baroness, Lady Finlay, added to the knocking down of those ninepins.
Smaller social media platforms will, on the current basis of the Bill, fall outside category 1. The Royal College of Psychiatrists made it pretty clear that the smaller platforms might be less well moderated and more permissive of dangerous content. It is particularly concerned about the sharing of information about methods of suicide or dangerous eating disorder content. Those are very good examples that it has put forward.
I return to the scrutiny committee again. It said that
“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”
should be adopted. It seems that many small, high-harm services will be excluded unless we go forward on the basis set out by the noble Baroness, Lady Morgan. The kind of breadcrumbing sites we have talked about during the passage of the Bill will escape the net, while, on the other hand, sites such as Wikipedia, mentioned by my noble friend, will be swept into it despite being low risk.
I have read the letter from the Secretary of State which the noble Baroness, Lady Morgan, kindly circulated. I cannot see any argument in it why Amendment 245 should not proceed. If the noble Baroness decides to test the opinion of the House, on these Benches we will support her.
My Lords, I have good news and bad news for the Minister. The good news is that we have no problem with his amendments. The bad news, for him, is that we strongly support Amendment 245 from the noble Baroness, Lady Morgan of Cotes, which, as others have said, we think is a no-brainer.
The beauty of the simple amendment has been demonstrated; it just changes the single word “and” to “or”. It is of course right to give Ofcom leeway—or flexibility, as the noble Baroness, Lady Finlay, described it—in the categorisation and to bring providers into the safety regime. What the noble Baroness, Lady Morgan, said about the smaller platforms, the breadcrumbing relating to the Jake Davison case and the functionality around bombarding Zoe Lyalle with those emails told the story that we needed to hear.
As it stands, the Bill requires Ofcom to always be mindful of size. We need to be more nuanced. From listening to the noble Lord, Lord Allan of Hallam—with his, as ever, more detailed analysis of how things work in practice—my concern is that in the end, if it is all about size, Ofcom will end up having to have a much larger number in scope on the categorisation of size in order to cover all the platforms that it is worried about. If we could give flexibility around size or functionality, that would make the job considerably easier.
We on this side think categorisation should happen with a proportionate, risk-based approach. We think the flexibility should be there, the Minister is reasonable—come on, what’s not to like?
My Lords, I shall explain why the simple change of one word is not as simple as it may at first seem. My noble friend’s Amendment 245 seeks to amend the rule that a service must meet both a number-of-users threshold and a functionality threshold to be designated as category 1 or 2B. It would instead allow the Secretary of State by regulation to require a service to meet only one or other of the two requirements. That would mean that smaller user-to-user services could be so categorised by meeting only a functionality threshold.
In practical terms, that would open up the possibility of a future Secretary of State setting only a threshold condition about the number of users, or alternatively about functionality, in isolation. That would create the risk that services with a high number of users but limited functionality would be caught in scope of category 1. That could be of particular concern to large websites that operate with limited functionality for public interest reasons, and I am sure my noble friend Lord Moylan can think of one that fits that bill. On the other hand, it could capture a vast array of low-risk smaller services merely because they have a specific functionality—for instance, local community fora that have livestreaming capabilities. So we share the concerns of the noble Lord, Lord Allan, but come at it from a different perspective from him.
My noble friend Lady Morgan mentioned the speed of designation. The Bill’s approach to the pace of designation for the category 1 watchlist and register is flexible—deliberately so, to allow Ofcom to act as quickly as is proportionate to each emerging service. Ofcom will have a duty proactively to identify, monitor and evaluate emerging services, which will afford it early visibility when a service is approaching the category 1 threshold. It will therefore be ready to act accordingly to add services to the register should the need arise.
The approach set out in my noble friend’s Amendment 245 would not allow the Secretary of State to designate individual services as category 1 if they met one of the threshold conditions. Services can be designated as category 1 only if they meet all the relevant threshold conditions set out in the regulations made by the Secretary of State. That is the case regardless of whether the regulations set out one condition or a combination of several conditions.
The noble Baroness, Lady Finlay, suggested that the amendment would assist Ofcom in its work. Ofcom itself has raised concerns that amendments such as this—to introduce greater flexibility—could increase the risk of legal challenges to categorisation. My noble friend Lady Morgan was part of the army of lawyers before she came to Parliament, and I am conscious that the noble Lord, Lord Clement-Jones, is one as well. I hope they will heed the words of the regulator; this is not a risk that noble Lords should take lightly.
I will say more clearly that small companies can pose significant harm to users—I have said it before and I am happy to say it again—which is why there is no exemption for small companies. The very sad examples that my noble friend Lady Morgan gave in her speech related to illegal activity. All services, regardless of size, will be required to take action against illegal content, and to protect children if they are likely to be accessed by children. This is a proportionate regime that seeks to protect small but excellent platforms from overbearing regulation. However, I want to be clear that a small platform that is a font of illegal content cannot use its size as an excuse for not dealing with it.
Category 1 services are those services that have a major influence over our public discourse online. Again, I want to be clear that designation as a category 1 service is not based only on size. The thresholds for category 1 services will be based on the functionalities of a service as well as the size of the user base. The thresholds can also incorporate other characteristics that the Secretary of State deems relevant, which could include factors such as a service’s business model or its governance. Crucially, Ofcom has been clear that it will prioritise engagement with high-risk or high-impact services, irrespective of their categorisation, to understand their existing safety systems and how they plan to improve them.
Requiring all companies to comply with the full range of category 1 duties would divert these companies’ resources away from the vital task of tackling illegal content and protecting children, but we are clear that the popularity and characteristics of services can change. To that end, the Government have placed a new duty on Ofcom to identify and publish a list of companies that are close to the category 1 thresholds. That will ensure that Ofcom proactively identifies emerging category 1 companies and is ready to assess and add them to the category 1 register without delay. This tiered approach will be kept under review by both Ofcom and the Government, as part of the review of the thresholds and as part of the post-legislative review conducted by the Secretary of State.
I am very grateful to noble Lords and Members of another place, as well as groups including the Antisemitism Policy Trust, the Center for Countering Digital Hate, Samaritans and Kick It Out, for their tireless work on this issue. I hope that explains to my noble friend why we cannot support her amendment. I hope that she will not press it but, if she does, these Benches will oppose it and the Government cannot accept adding it to the Bill.
Amendment 238A agreed.
Amendments 238B to 238E
Moved by
238B: Clause 201, page 169, line 6, leave out “74(3)(b)” and insert “(“Regulations by OFCOM about qualifying worldwide revenue etc”)(2)”
Member’s explanatory statement
This amendment provides that regulations made by OFCOM about supporting evidence to be supplied by providers for the purposes of Part 6 of the Bill (fees) are subject to the negative Parliamentary procedure.
238C: Clause 201, page 169, line 6, at end insert—
“(ba) regulations under section 77,”
Member’s explanatory statement
This amendment provides that regulations made by the Secretary of State specifying the threshold figure for the purposes of Part 6 of the Bill are subject to the negative Parliamentary procedure.
238D: Clause 201, page 169, line 11, leave out “(1),”
Member’s explanatory statement
This amendment is consequential on the amendment in my name inserting new subsections (7A) and (7B) into this Clause.
238E: Clause 201, page 169, line 13, at end insert—
“(8A) As soon as a draft of a statutory instrument containing regulations under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(1) or paragraph 5(9) of Schedule 13 (whether alone or with provision under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(2)) is ready for laying before Parliament, OFCOM must send the draft to the Secretary of State, and the Secretary of State must lay the draft before Parliament.(8B) Immediately after making a statutory instrument containing only regulations under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(2), OFCOM must send the instrument to the Secretary of State, and the Secretary of State must lay it before Parliament.”
Member’s explanatory statement
This amendment provides for the Secretary of State’s involvement in the Parliamentary procedure to which regulations made by OFCOM under this Bill are subject.
Amendments 238B to 238E agreed.
Amendment 239
Moved by
239: After Clause 201, insert the following new Clause—
“Regulations: consultation and impact assessments
(1) This section applies if the Secretary of State seeks to exercise powers under—(a) section 55 (regulations under section 54),(b) section 195 (powers to amend section 35),(c) section 196 (powers to amend or repeal provisions relating to exempt content or services),(d) section 197 (powers to amend Part 2 of Schedule 1),(e) section 198 (powers to amend Schedules 5, 6 and 7), or(f) paragraph 1 of Schedule 11 (regulations specifying threshold conditions for categories of Part 3 services),or where the Secretary of State intends to direct OFCOM under section 39.(2) The Secretary of State may not exercise the powers under the provisions in subsection (1) unless any select committee charged by the relevant House of Parliament with scrutinising such regulations has—(a) completed its consideration of the draft regulations and accompanying impact assessment provided by the Secretary of State; and(b) reported on their deliberation to the relevant House; and the report of the committee has been debated in that House, or the period of six weeks beginning on the day on which the committee reported has elapsed.”
My Lords, this amendment would require the Secretary of State, when seeking to exercise certain powers in the Bill, to provide the relevant Select Committees of both Houses with draft regulations and impact assessments, among other things. I should admit up front that this is a blatant attempt to secure an Online Safety Bill version of what I have called the “Grimstone rule”, established in the international trade Bill a few years ago. Saving his blushes, if the ideas enshrined in the amendment are acceptable to the Government, I hope that the earlier precedent of the “Grimstone rule” would ensure that any arrangements agreed under this amendment would be known in future as the “Parkinson rule”. Flattery will get you many things.
The Bill places a specific consultation requirement on the Government for the fee regime, which we were just talking about, categorisation thresholds, regulations about reports to the NCA, statements of strategic priorities, regulations for super-complaints, and a review of the Act after three years—so a wide range of issues need to be put out for consultation. My context here, which is all-important, is a growing feeling that Parliament’s resources are not being deployed to the full in scrutinising and reviewing the work of the Executive on the one hand and feeding knowledge and experience into future policy on the other. There is continuing concern about the effectiveness of the secondary legislation approval procedures, which this amendment would bear on.
Noble Lords have only to read the reports of the Select Committees of both Houses to realise what a fantastic resource they represent. One has only to have served on a Select Committee to realise what potential also exists there. In an area of rapid technical and policy development, such as the digital world, the need to be more aware of future trends and potential problems is absolutely crucial.
The pre-legislative scrutiny committee report is often quoted here, and it drew attention to this issue as well, recommending
“a Joint Committee of both Houses to oversee digital regulation with five primary functions: scrutinising digital regulators and overseeing the regulatory landscape … scrutinising the Secretary of State’s work into digital regulation; reviewing the codes of practice laid by Ofcom under any legislation relevant to digital regulation … considering any relevant new developments such as the creation of new technologies and the publication of independent research … and helping to generate solutions to ongoing issues in digital regulation”—
a pretty full quiver of issues to be looked at.
I hope that when he responds to this debate, the Minister will agree that ongoing parliamentary scrutiny would be helpful in providing reassurances that the implementation of the regime under the Bill is going as intended, and that the Government would also welcome a system under which Parliament, perhaps through the Select Committees, can contribute to the ways suggested by the Joint Committee. I say “perhaps”, because I accept that it is not appropriate for primary legislation to dictate how, or in what form, Parliament might offer advice in the manner that I have suggested; hence the suggestion embedded in the amendment—which I will not be pressing to a Division—which I call the “Parkinson rule”. Under this, the Minister would agree at the Dispatch Box a series of commitments which will provide an opportunity for enhanced cross-party scrutiny of the online safety regime and an opportunity to survey and report on future developments of interest.
The establishment of the new Department for Science, Innovation and Technology and its Select Committee means that there is a new dedicated Select Committee in the Commons. The Lords Communications and Digital Committee will continue, I hope, to play a vital role in the scrutiny of the digital world, as it has with the online safety regime to date. While it would be for the respective committees to decide their priorities, I hope the Government would encourage the committees in both Houses to respond to their required consultation processes and to look closely at the draft codes of practice, the uses of regulation-making powers and the powers of direction contained in the Bill ahead of the formal processes in both Houses. Of course, it could be a specialist committee if that is what the Houses decide, but there is an existing arrangement under which this “Parkinson rule” could be embedded. I have discussed the amendment with the Minister and with the Bill team. I look forward to hearing their response to the ideas behind the amendment. I beg to move the “Parkinson rule”.
I support the amendment of the noble Lord, Lord Stevenson. Here is an opportunity for the Minister to build a legislative monument. I hope he will take it. The reason I associate myself with it is that the noble Lord, Lord Stevenson—who has been sparing in his quoting of the Joint Committee’s report, compared with mine—referred to it and it all made very good sense.
The amendment stumbles only in the opinion of the Government, it seems, on the basis that parliamentary committees need to be decided on by Parliament, rather than the Executive. But this is a very fine distinction, in my view, given that the Government, in a sense, control the legislature and therefore could will the means to do this, even if it was not by legislation. A nod from the Minister would ensure that this would indeed take place. It is very much needed. It was the Communications and Digital Committee, I think, that introduced the idea that we picked up in the Joint Committee, so it has a very good provenance.
My Lords, I offer my support to the amendment. I spent some time arguing in the retained EU law Bill for increased parliamentary scrutiny. My various amendments did not succeed but at the end of the day—on the final day of ping-pong—the Minister, the noble Lord, Lord Callanan, gave certain assurances based on what is in Schedule 5 to that Act, as it now is, involving scrutiny through committees. So the basic scheme which my noble kinsman has proposed is one which has a certain amount of precedent—although it is not an exact precedent; what might have been the “Callanan rule” is still open to reconstruction as the “Parkinson rule”. I support the amendment in principle.
My Lords, as the noble Lords, Lord Stevenson and Lord Clement-Jones, have already said, the Communications and Digital Select Committee did indeed recommend a new Joint Committee of both Houses to look specifically at the various different aspects of Ofcom’s implementation of what will be the Online Safety Act and ongoing regulation of digital matters. It is something I still have a lot of sympathy for. However, there has not been much appetite for such a Joint Committee at the other end of the Corridor. I do not necessarily think we should give up on that, and I will come back to it in a moment, but in place of that, I am not keen on what is proposed in Amendment 239, because my fear about how it is laid out is that it appears a bit too burdensome and would probably introduce too much delay in implementation.
To return to the bigger question, I think that we as parliamentarians need to reflect on our oversight of regulators, to which we are delegating significant new powers and which we are requiring to adopt a much more principles-based approach to regulation to cope with the fast pace of change in the technological world. We have to reflect on whether our current set-up is adequate for the way in which that is changing. What I have in mind is very much a strategic level of oversight, rather than scrutinising operational decisions, although, notwithstanding what the noble Lord has said, something specific on the implementation of the Bill and other new legislation is an area I would certainly wish to explore further.
The other aspect of this is making sure that our regulators keep pace too, not just with technology, and apply the new powers we give them in a way which meets our original intentions, but with the new political dynamics. Earlier today in your Lordships’ Chamber, there was a Question about how banks are dealing with political issues, and that raises questions about how the FCA is regulating the banking community. We must not forget that the Bill is about regulating content, and that makes it ever more sensitive. We need to keep reminding ourselves about this; it is very new and very different.
As has been acknowledged, there will continue to be a role for the Communications and Digital Select Committee, which I have the great privilege of chairing, in overseeing Ofcom. My noble friend Lord Grade and Dame Melanie Dawes appeared before us only a week ago. There is a role for the SIT Committee in the Commons, and probably some kind of ongoing role for the DCMS Select Committee there too, although I am not sure. In a way, the fractured nature of that oversight makes it all the more critical that we join up a bit more. So I will take it upon myself to give this more thought and to speak to the respective chairs of those committees in the other place, but I think that at some point we will need to consider, in some other fora, the way in which we are overseeing the work of regulators.
At some point, I think we will need to address the specific recommendations in the pre-legislative committee’s report, which were very much in line with what my own committee thought was right for the future of digital regulatory oversight, but on this occasion, I will not be supporting the specifics of Amendment 239.
My Lords, very briefly, I was pleased to see this, in whatever form it takes, because as we finish off the Bill, one thing that has come up consistently is that some of us have raised problems of potential unintended consequences, such as whether age gating will lead to a huge invasion of the privacy of adults rather than just narrowly protecting children, or whether the powers given to Ofcom will turn it into the most important and powerful regulator in the country, if not in Europe. In a highly complex Bill, is it possible for us to keep our eye on it a bit more than just by whingeing on the sidelines?
The noble Baroness, Lady Stowell, makes a very important point about the issue in relation to the FCA and banking. Nobody intended that to be the outcome of PEPs, for example, and nobody intended when they suggested encouraging banks to have values such as ESG or EDI—equality, diversity and inclusion—that that would lead to ordinary citizens of this country being threatened with having their banking turned off. It is too late to then retrospectively say, “That wasn’t what we ever intended”.
Straightforwardly, from the point of view of scrutiny, I hope we do not say that it will be left up to arm’s-length regulators and do not look at it again. On my consistent concerns about free speech being threatened by this Bill, you can come back and say to me, “Oh, you were wrong, Lady Fox”, but you can say that only if we have a very clear view that Ofcom is not behaving in a way that is going to damage the freedom of expression rights of people in this country.
I associate myself with the comments of my noble friend Lady Stowell on this whole issue, and I refer to my register of interests. One question we should be asking, which goes wider than this Bill, is: who regulates the regulators? It is a standard problem in political science, often known as principal-agent theory: principals delegate powers to agents for many reasons, and you then see agency slack, with the agents developing their own powers beyond what was perhaps originally intended. For that reason, I completely associate myself with my noble friend Lady Stowell’s comments—and not because she chairs a committee on which I sit and I hope to be favoured with more speaking time on it. It is simply because, on its merits, we should all be asking who regulates the regulators and making sure that they are accountable. We are asking the same question of the Secretary of State and, quite rightly, the Secretary of State should be accountable for any measures they propose; but we should also be asking it of regulators.
My Lords, I have always felt rather sorry for the first Viscount Addison, because what we refer to as the Salisbury convention is really the Salisbury-Addison convention. So while I am grateful to the noble Lord, Lord Stevenson, for his flattering speech, I shall insist on calling it the “Parkinson-Stevenson rule”, not least in the hope that that mouthful will encourage people to forget its name more swiftly.
I am grateful to the noble Lord for his attention to this matter and the useful discussions that we have had. His Amendment 239 would go beyond the existing legislative process for the delegated powers in the Bill by providing for parliamentary committees to be, in effect, inserted into the secondary legislative process. The delegated powers in the Bill are crucial for implementing the regime effectively and for ensuring that it keeps pace with changes in technology. Regulation-making powers are an established part of our legislative practice, and it would not be appropriate to deviate from existing processes.
However, I agree that ongoing parliamentary scrutiny of the regime will be crucial in helping to provide noble Lords and Members in another place with the reassurance that the implementation of the regime is as we intended. As the noble Lord noted, the establishment of the Science, Innovation and Technology Select Committee in another place means that there is a new dedicated committee looking at this important area of public policy. That provides an opportunity for cross-party scrutiny of the online safety regime and broader issues. While it will be, as he said, for respective committees to decide their priorities, we welcome any focus on online safety, and certainly welcome committees in both Houses co-operating effectively on this matter. I am certain that the Communications and Digital Committee of your Lordships’ House will continue to play a vital role in the scrutiny of the online safety regime.
We would fully expect these committees to look closely at the codes of practice, the uses of regulation-making powers and the powers of direction in a way that allows them to focus on key issues of interest. To support that, I can commit that the Government will do two things. First, where the Bill places a consultation requirement on the Government, we will ensure that the relevant committees have every chance to play a part in that consultation by informing them that the process is open. Secondly, while we do not wish to see the implementation process delayed, we will, where possible, share draft statutory instruments directly with the relevant committees ahead of the formal laying process. The timelines for this will be decided on a case-by-case basis, considering what is appropriate and reasonably practical. It will be for the committees to decide how they wish to engage with the information that we provide, but, to avoid delaying implementation, it will not create an additional approval process. I am grateful to my noble friend Lady Stowell of Beeston for her words of caution and wisdom on that point, as both chairman of your Lordships’ committee and a former Leader of your Lordships’ House.
I hope that the noble Lord will be satisfied by what I have set out and will be willing to withdraw his amendment so that our rule might enter into constitutional history more swiftly.
I am very grateful to everyone who has contributed to the debate, despite my injunction that no one was to speak other than those key persons—but it was nice to hear views around the House in support of this proposal, with caution. The noble Baroness, Lady Stowell, was right to be clear that we have to be focused on where we are going with this; there is quite a lot at stake here, and it is a much bigger issue than simply this Bill and these particular issues. Her willingness to take this on in a wider context is most welcome, and I look forward to hearing how that goes. I am also very grateful for the unexpected but very welcome support from the noble Baroness, Lady Fox. It was nice that she finally agreed to meet on one piece of territory, if we cannot agree on some of the others. The noble Lord, Lord Kamall, is right to say that we need to pick up the much broader question about who regulates those who regulate us. This is not the answer, but it certainly takes us a step in the right direction.
I was grateful to the Minister for suggesting that the “Parkinson rule” could take flight, but I shall continue to call it by a single name—double-barrelled names are not appropriate here. We will see the results of that in the consultation; the things that already have to be consulted about will be offered to the committees, and it is up to them to respond on that, but it is a very good start. The idea that drafts and issues that are being prepared for future regulation will be shown ahead of the formal process is exactly where I wanted to be on this, so I am very grateful for that. I withdraw the amendment.
Amendment 239 withdrawn.
Amendment 239A not moved.
Clause 74: Duty to notify OFCOM
Amendments 239B to 239E
Moved by
239B: Clause 74, page 70, line 3, leave out from “information” to end of line 5 and insert “as required by regulations made by OFCOM under section (“Regulations by OFCOM about qualifying worldwide revenue etc”).”
Member’s explanatory statement
This amendment omits a reference to regulations made by the Secretary of State. Details about supporting evidence etc to accompany providers’ notifications for the purposes of the fees regime are now to be contained in regulations made by OFCOM (see the new Clause 76 proposed in my name).
239C: Clause 74, page 70, line 6, leave out subsection (4) and insert—
“(4) Section (“Regulations by OFCOM about qualifying worldwide revenue etc”) confers power on OFCOM to make regulations about the determination of a provider’s qualifying worldwide revenue, and the meaning of “qualifying period”, for the purposes of this Part.”Member’s explanatory statement
This amendment is a signpost to the new Clause 76 proposed in my name, conferring power on OFCOM to make regulations about the meaning of qualifying worldwide revenue and qualifying period for the purposes of the fees regime.
239D: Clause 74, page 70, line 11, leave out “threshold figure under section 77 is published” and insert “regulations under section 77 come into force (first threshold figure)”
Member’s explanatory statement
This amendment is consequential on the first amendment of Clause 77 in my name (threshold figure now to be specified in regulations made by the Secretary of State).
239E: Clause 74, page 70, line 29, leave out subsection (11)
Member’s explanatory statement
This amendment omits a provision about procedure for regulations made by the Secretary of State under subsection (3)(b). That is no longer required because details about supporting evidence etc to accompany providers’ notifications for the purposes of the fees regime are now to be contained in regulations made by OFCOM (see the new Clause 76 proposed in my name).
Amendments 239B to 239E agreed.
Clause 76: OFCOM’s statement about “qualifying worldwide revenue” etc
Amendment 239F
Moved by
239F: Clause 76, leave out Clause 76 and insert the following new Clause—
“Regulations by OFCOM about qualifying worldwide revenue etc
(1) For the purposes of this Part, OFCOM may by regulations make provision—(a) about how the qualifying worldwide revenue of a provider of a regulated service is to be determined, and(b) defining the “qualifying period” in relation to a charging year.(2) OFCOM may by regulations also make provision specifying or describing evidence, documents or other information that providers must supply to OFCOM for the purposes of section 74 (see subsection (3)(b) of that section), including provision about the way in which providers must supply the evidence, documents or information.(3) Regulations under subsection (1)(a) may provide that the qualifying worldwide revenue of a provider of a regulated service (P) who is a member of a group during any part of a qualifying period is to include the qualifying worldwide revenue of any entity that—(a) is a group undertaking in relation to P for all or part of that period, and(b) receives or is due to receive, during that period, any amount referable (to any degree) to a regulated service provided by P.(4) Regulations under subsection (1)(a) may, in particular—(a) make provision about circumstances in which amounts do, or do not, count as being referable (to any degree) to a regulated service for the purposes of the determination of the qualifying worldwide revenue of the provider of the service or of an entity that is a group undertaking in relation to the provider;(b) provide for cases or circumstances in which amounts that—(i) are of a kind specified or described in the regulations, and(ii) are not referable to a regulated service,are to be brought into account in determining the qualifying worldwide revenue of the provider of the service or of an entity that is a group undertaking in relation to the provider.(5) Regulations which make provision of a kind mentioned in subsection (3) may include provision that, in the case of an entity that is a group undertaking in relation to a provider for part (not all) of a qualifying period, only amounts relating to the part of the qualifying period for which the entity was a group undertaking may be brought into account in determining the entity’s qualifying worldwide revenue.(6) Regulations under subsection (1)(a) may make provision corresponding to paragraph 5(8) of Schedule 13.(7) Before making regulations under subsection (1) OFCOM must consult—(a) the Secretary of State,(b) the Treasury, and(c) such other persons as OFCOM consider appropriate.(8) Before making regulations under subsection (2) OFCOM must consult the Secretary of State.(9) Regulations under this section may make provision subject to such exemptions and exceptions as OFCOM consider appropriate.(10) In this section—“group” means a parent undertaking and its subsidiary undertakings, reading those terms in accordance with section 1162 of the Companies Act 2006;“group undertaking” has the meaning given by section 1161(5) of that Act.”
Member’s explanatory statement
This amendment substitutes Clause 76, which is about what is meant by “qualifying worldwide revenue”. The new Clause provides for OFCOM to make regulations about this and related matters for the purposes of the fees regime, and allows the regulations (among other things) to provide that revenue arising to certain entities in the same group as a provider of a regulated service is to be brought into account.
Amendment 239F agreed.
Clause 77: Threshold figure
Amendments 239G to 239M
Moved by
239G: Clause 77, page 72, line 2, leave out from “must” to “the” in line 3 and insert “make regulations specifying”
Member’s explanatory statement
This amendment provides that the Secretary of State must specify the threshold figure in regulations (rather than in a published statement).
239H: Clause 77, page 72, line 4, leave out subsection (3)
Member’s explanatory statement
This amendment is consequential on the first amendment of this Clause in my name.
239J: Clause 77, page 72, line 11, leave out “to (3)” and insert “and (2)”
Member’s explanatory statement
This amendment is consequential on the preceding amendment of this Clause in my name.
239K: Clause 77, page 72, line 12, leave out “A” and insert “Regulations must provide that a”
Member’s explanatory statement
This amendment is consequential on the first amendment of this Clause in my name.
239L: Clause 77, page 72, line 14, leave out from beginning to “at” and insert “Regulations specifying a threshold figure must be in force”
Member’s explanatory statement
This amendment provides that regulations specifying a threshold figure must be in force at least 9 months before the first charging year for which that figure applies.
239M: Clause 77, page 72, line 17, leave out “threshold figure published” and insert “regulations made”
Member’s explanatory statement
This amendment is consequential on the first amendment of this Clause in my name.
Amendments 239G to 239M agreed.
Clause 79: OFCOM’s fees statements
Amendments 239N and 239P
Moved by
239N: Clause 79, page 73, line 18, leave out from “period”” to end of line 19 and insert “for the purposes of this Part, and”
Member’s explanatory statement
This amendment is consequential on the new Clause 76 proposed in my name.
239P: Clause 79, page 73, line 20, leave out “published in accordance with” and insert “contained in regulations under”
Member’s explanatory statement
This amendment is consequential on the first amendment of Clause 77 in my name (threshold figure now to be specified in regulations made by the Secretary of State).
Amendments 239N and 239P agreed.
Clause 82: General duties of OFCOM under section 3 of the Communications Act
Amendment 240
Moved by
240: Clause 82, page 74, line 25, leave out “presented by content”
Member’s explanatory statement
This amendment ensures that Ofcom is empowered to consider harms presented by features, functionalities, behaviours and the design and operation of services not just by content.
My Lords, if I may, I shall speak very briefly, in the absence of my noble friend Lady Kidron, and because I am one of the signatories of this amendment, alongside the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. Amendment 240, together with a number of amendments that we will be debating today, turns on a fundamental issue that we have not yet resolved.
I came in this morning being told that we would be voting on this amendment and that other amendments later today would be consequential—I am a novice at this level of parliamentary procedure, so forgive me if I have got myself confused during the day—but I now understand that my noble friend considers this amendment to be consequential but, strangely, the amendments right at the end of the day are not. I just wanted to flag to the House that they all cover the same fundamental issue of whether harms can be unrelated to content, whether the harms of the online world can be to do with functionality—the systems and processes that drive the addiction that causes so much harm to our children.
It is a fundamental disagreement. I pay tribute to the amount of time the department, the Secretary of State and my noble friend have spent on it, but it is not yet resolved and, although I understand that I should now say that I beg leave to move the amendment formally, I just wanted to mark, with apologies, the necessity, most likely, of having to bring the same issue back to vote on later today.
My Lords, His Majesty’s Government indeed agree that this is consequential on the other amendments, including Amendment 35, which the noble Baroness, Lady Kidron, previously moved at Report. We disagreed with them, but we lost that vote; this is consequential, and we will not force a Division on it.
We will have further opportunity to debate the fundamental issues that lie behind it, to which my noble friend Lady Harding just referred. Some of the amendments on which we may divide later were tabled by the noble Baroness, Lady Kidron, after she defeated the Government the other day, so we cannot treat them as consequential. We look forward to debating them; I will urge noble Lords not to vote for them, but we will have the opportunity to discuss them later.
Amendment 240 agreed.
Amendment 241
Moved by
241: Clause 82, page 74, line 31, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
Clause 82 is about OFCOM’s general duties. This amendment and the next amendment in my name insert a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendment 241 agreed.
Amendment 242 not moved.
Amendment 243
Moved by
243: Clause 82, page 75, line 2, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
See the explanatory statement for the preceding amendment in my name.
Amendment 243 agreed.
Amendment 244 not moved.
Schedule 11: Categories of regulated user-to-user services and regulated search services: regulations
Amendment 245
Moved by
245: Schedule 11, page 223, line 32, leave out “and” and insert “or”
My Lords, I wish to test the opinion of the House and I beg to move.
Clause 91: Power to require information
Amendments 246 and 247
Moved by
246: Clause 91, page 83, line 14, leave out “(an “information notice”)”
Member’s explanatory statement
This technical amendment is needed because the new notice requiring information in connection with an investigation into the death of a child (see the new Clause proposed after Clause 91 in my name) is also a form of information notice.
247: Clause 91, page 83, line 19, at end insert—
“(b) provide information about the use of a service by a named individual.”Member’s explanatory statement
This amendment makes it clear that OFCOM have power by notice to require providers to provide information about a particular person’s use of a service.
Amendments 246 and 247 agreed.
Amendment 247A
Moved by
247A: Clause 91, page 83, line 19, at end insert—
“(2A) The power conferred by subsection (1) also includes power to require a person within any of paragraphs (a) to (d) of subsection (4) to take steps so that OFCOM are able to remotely access the service provided by the person, or remotely access equipment used by the service provided by the person, in order to view, in particular—(a) information demonstrating in real time the operation of systems, processes or features, including functionalities and algorithms, used by the service;(b) information generated in real time by the performance of a test or demonstration of a kind required by a notice under subsection (1).”Member’s explanatory statement
This amendment makes it clear that OFCOM have the power by notice to require a provider of a regulated service (among others) to take steps to allow OFCOM to remotely access the service so that they can view the operation in real time of systems, processes, functionalities and algorithms, and tests and demonstrations.
I beg to move Amendment 247A.
Amendment 247B (to Amendment 247A) not moved.
Amendment 247A agreed.
Amendments 248 to 248C
Moved by
248: Clause 91, page 84, line 2, at end insert—
“(iva) any duty set out in section (Disclosure of information about use of service by deceased child users) (deceased child users),”Member’s explanatory statement
This amendment mentions the new duties imposed by the Clause proposed after Clause 67 in my name in the Clause that sets out the purposes for which OFCOM may require people to provide information.
248A: Clause 91, page 84, line 12, leave out “section 75 (duty to pay fees)” and insert “Part 6 (fees)”
Member’s explanatory statement
This amendment makes it clear that OFCOM’s powers to gather information in relation to a provider’s qualifying worldwide revenue apply for the purposes of Part 6.
248B: Clause 91, page 84, line 37, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
248C: Clause 91, page 84, line 38, leave out “duty to promote”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
Amendments 248 to 248C agreed.
Amendment 249
Moved by
249: After Clause 91, insert the following new Clause—
“Information in connection with an investigation into the death of a child
(1) OFCOM may by notice under this subsection require a relevant person to provide them with information for the purpose of—(a) responding to a notice given by a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into the death of a child, or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an investigation;(b) responding to a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, the death of a child, or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an inquiry;(c) responding to a notice given by a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—(i) an investigation to determine whether an inquest into the death of a child is necessary, or(ii) an inquest in relation to the death of a child,or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an investigation or inquest. (2) The power conferred by subsection (1) includes power to require a relevant person to provide OFCOM with information about the use of a regulated service by the child whose death is under investigation, including, in particular—(a) content encountered by the child by means of the service,(b) how the content came to be encountered by the child (including the role of algorithms or particular functionalities),(c) how the child interacted with the content (for example, by viewing, sharing or storing it or enlarging or pausing on it), and(d) content generated, uploaded or shared by the child.(3) The power conferred by subsection (1) includes power to require a relevant person to obtain or generate information.(4) The power conferred by subsection (1) must be exercised in a way that is proportionate to the purpose mentioned in that subsection.(5) The power conferred by subsection (1) does not include power to require the provision of information in respect of which a claim to legal professional privilege, or (in Scotland) to confidentiality of communications, could be maintained in legal proceedings. (6) Nothing in this section limits the power conferred on OFCOM by section 91.(7) In this section—“inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2);“information” includes documents, and any reference to providing information includes a reference to producing a document (and see also section 92(9));“relevant person” means a person within any of paragraphs (a) to (e) of section 91(4).”
Member’s explanatory statement
This amendment makes it clear that OFCOM have the power to obtain information for the purposes of responding to a notice given to them by a coroner or, in Scotland, a request from a procurator fiscal, in connection with the death of a child, including a power to obtain information from providers about the use of a service by the deceased child.
Amendment 249 agreed.
Clause 92: Information notices
Amendments 250 and 250A
Moved by
250: Clause 92, page 85, line 3, at end insert—
“(A1) A notice given under section 91(1) or (Information in connection with an investigation into the death of a child)(1) is referred to in this Act as an information notice.”Member’s explanatory statement
This amendment provides that a notice under the new Clause proposed in my name concerning OFCOM’s power to obtain information in connection with an investigation into the death of a child is called an “information notice” (as well as a notice under Clause 91). This ensures that provisions of the Bill that relate to information notices also apply to a notice given under that Clause.
250A: Clause 92, page 85, line 24, leave out “provide the information” and insert “act”
Member’s explanatory statement
This amendment ensures that the duty to comply with an information notice covers the case where a provider is required to take steps to allow OFCOM to remotely access the service.
Amendments 250 and 250A agreed.
Clause 94: Reports by skilled persons
Amendment 250B
Moved by
250B: Clause 94, page 86, line 26, leave out “any” and insert “either”
Member’s explanatory statement
This amendment is consequential on the next amendment of Clause 94 in my name.
My Lords, these amendments are concerned with Ofcom’s powers under Clause 111 to issue notices to deal with terrorism content and child sexual exploitation and abuse content.
I acknowledge the concerns which have been aired about how these powers work with encrypted services. I want to make it clear that the Bill does not require companies to break or weaken encryption, and we have built in strong safeguards to ensure that users’ privacy is protected. Encryption plays an important role online, and the UK supports its responsible use. I also want to make it clear that we are not introducing a blanket requirement for companies to monitor all content for all harms, at all times. That would not be proportionate.
However, given the serious risk of harm to children from sexual abuse and exploitation online, the regulator must have appropriate, tightly targeted powers to compel companies to take the most effective action to tackle such reprehensible illegal activity which is taking place on their services. We must ask companies to do all that is technically feasible to keep children safe, subject to stringent legal safeguards.
The powers in the Bill are predicated on risk assessments. If companies are managing the risks on their platform appropriately, Ofcom will not need to use its powers. As a last resort, however, where there is clear evidence of child sexual abuse taking place on a platform, Ofcom will be able to direct companies either to use, or to make best efforts to develop or source, accredited and accurate technology to identify and remove this illegal content. To be clear, these powers will not enable Ofcom or our law enforcement agencies to obtain any automatic access to the content detected. It is simply a matter of making private companies take effective action to prevent child sexual abuse on their services.
Ofcom must consider a wide range of matters when deciding whether a notice is necessary and proportionate, including the impacts on privacy and freedom of expression of using a particular technology on a particular service. Ofcom will only be able to require the use of technology accredited as highly accurate in detecting illegal child sexual abuse or terrorism content, greatly reducing the risk that content is wrongly identified.
In addition to these safeguards, as a public body, Ofcom is bound through the Human Rights Act 1998 by the European Convention on Human Rights, including Articles 8 and 10. Ofcom has an obligation not to act in a way which unduly interferes with the right to privacy and freedom of expression when carrying out its duties, for which it is held to account.
If appropriate technology which meets these requirements does not exist, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to use their best endeavours to develop or source a solution. It is right that we can require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments.
Despite the breadth of the existing safeguards, we recognise that concerns remain about these powers, and we have listened to the points that noble Lords raised in Committee about privacy and technical feasibility. That is why we are introducing additional safeguards. I am grateful for the constructive engagement I have had with noble Lords across your Lordships’ House on this issue, and I hope that the government amendments alleviate their concerns.
I turn first to our Amendments 250B, 250C, 250D, 255A, 256A, 257A, 257B, 257C and 258A, which require that Ofcom obtain a skilled person’s report before issuing a warning notice and exercising its powers under Clause 111. This independent expert scrutiny will supplement Ofcom’s own expertise to ensure that it has a full understanding of relevant technical issues to inform its decision-making. That will include issues specific to the service in question, such as its design and relevant factors relating to privacy.
We are confident that, in addition to Ofcom’s existing routes of evidence-gathering, Amendment 256A will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place. That will further help Ofcom to issue a notice which is targeted and proportionate.
Ofcom will need to appoint a skilled person and notify the provider about the appointment and the relevant matters to be explored in the report before issuing its final notice. Ofcom will have discretion over what should be included in the report, as this will depend on the specific circumstances. Under Amendments 257A and 257B, Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. That will enable the provider to make representations based on Ofcom’s own analysis and that of the skilled person.
I turn now to Amendments 257D, 257E and 257F. We have heard concerns about the impact that scanning technologies could have on journalistic content and sources. Any technology required by Ofcom must be highly accurate in detecting only terrorism content on public channels or only child sexual exploitation and abuse content on public or private channels. So, the likelihood of journalistic content or sources being compromised will be low—but to reassure your Lordships further, we have expanded the matters that Ofcom must consider in its decision-making.
Amendment 257D requires Ofcom to consider the impact that the use of a particular technology on a particular service would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. It builds on the existing safeguards in Clause 113 regarding freedom of expression and privacy. I am grateful to the noble Lord, Lord Stevenson of Balmacara, for his constructive engagement on this issue. I beg to move.
My Lords, I am conscious of the injunction earlier from the noble Lord, Lord Stevenson of Balmacara, that we keep our contributions short, but I intend to take no notice of it. That is for the very good reason that I do not think the public would understand why we disposed of such a momentous matter as bringing end-to-end encryption on private messaging services to an end as a mere technicality, in a brief debate at the end of Report.
It is my view that end-to-end encryption is assumed nowadays by the vast majority of people using private messaging services such as WhatsApp, iMessage and Signal. They are unaware, I think, of the fact that it is about to be taken from them by Clause 111 of the Bill. My amendment would prevent that. It is fairly plain; it says that
“A notice under subsection (1)”
of Clause 111
“may not impose a requirement relating to a service if the effect of that requirement would be to require the provider of the service to weaken or remove end-to-end encryption applied in relation to the service”.
My noble friend says that there is no threat of ending end-to-end encryption in his proposal, but he achieves that by conflating two things—which I admit my own amendment conflates, but I will come back to that towards the end. They are the encryption of platforms and the encryption of private messaging services. I am much less concerned about the former. I am concerned about private messaging services. If my noble friend was serious in meaning that there was no threat to end-to-end encryption, then I cannot see why he would not embrace my amendment, but the fact that he does not is eloquent proof that it is in fact under threat, as is the fact that the NSPCC and the Internet Watch Foundation are so heavily lobbying against my amendment. They would not be doing that if they did not think it had a serious effect.
I shall not repeat at any length the technical arguments we had in Committee, but the simple fact is that if you open a hole into end-to-end encryption, as would be required by this provision, then other people can get through that hole, and the security of the system is compromised. Those other people may not be very nice; they could be hostile state actors—we know hostile state actors who are well enough resourced to do this—but they could also be our own security services and others, from whom we expect protection. Normally, we do get a degree of protection from those services, because they are required to have some form of warrant or prior approval but, as I have explained previously in debate on this, these powers being given to Ofcom require no warrant or prior approval in order to be exercised. So there is a vulnerability, but there is also a major assault on privacy. That is the point on which I intend to start my conclusion.
If we reflect for a moment, the evolution of this Bill in your Lordships’ House has been characterised and shaped, to a large extent, by the offer made by the noble Lord, Lord Stevenson of Balmacara, when he spoke at Second Reading, to take a collaborative approach. But that collaborative approach has barely extended to those noble Lords concerned about privacy and freedom of expression. As a result, in my view, those noble Lords rightly promoting child protection have been reckless to the point of overreaching themselves.
If we stood back and had to explain to outsiders that we were taking steps today that took end-to-end encryption and the privacy they expect on their private messaging services away from them, together with the security and protection it gives, of course, in relation to scams and frauds and all the other things where it has a public benefit, then I think they would be truly outraged. I do not entirely understand how the Government think they could withstand that outrage, were it expressed publicly. I actually believe that the battle for this Bill—this part of this Bill, certainly—is only just starting. We may be coming to the end here, but I do not think that this Bill is settled, because this issue is such a sensitive one.
Given the manifest and widespread lack of support for my views on this question in your Lordships’ House in Committee, I will not be testing the opinion of the House today. I think I know what the opinion of the House is, but it is wrong, and it will have to be revised. My noble friend simply cannot stand there and claim that what he is proposing is proportionate and necessary, because it blatantly and manifestly is not.
My Lords, the powers in Clause 111 are perhaps the most controversial outstanding issue in the Bill. I certainly agree with the noble Lord, Lord Moylan, that they deserve some continued scrutiny. I suspect that Members of another place are being lobbied on this extensively right now. Again, it is one of the few issues that cuts through: people may not have heard of the Online Safety Bill, but they will have in the context of this particular measure.
We debated the rights and wrongs of encryption at some length in Committee, and I will not repeat those points today, not least because the noble Lord, Lord Moylan, has made some of the arguments as to why encryption is important. I will instead today focus on the future process, assuming that the Clause 111 powers will be available to Ofcom as drafted and that we are not going to accept the amendment from the noble Lord, Lord Moylan.
Amendments 258 and 258ZA, in my name and that of my noble friend Lord Clement-Jones, both aim to improve the process of issuing a Clause 111 order by adding in some necessary checks and balances.
As we debate this group, we should remember that the Clause 111 powers are not specific to encrypted services—I think the Minister made this point—and we should have the broader context in mind. I often try to bring some concrete scenarios to our discussions, and it may be helpful to consider three different scenarios in which Ofcom might reach for a Clause 111 notice.
The first is where a provider has no particular objections to using technology to identify and remove child sexual exploitation and abuse material or terrorist material but is just being slow to do this. There are mature systems out there. PhotoDNA is very well known in the industry; it works from a database of digital signatures of known child sexual exploitation material. All the services we use on a daily basis, such as Facebook, Instagram and others, will check uploaded photos against that database and, where a match is found, they will make sure that the material does not get shown and that the people responsible are reported to the authorities.
I can imagine scenarios where Ofcom is dealing with a service which has not yet implemented the technology—but does not have a problem doing it—and the material is unencrypted so there is no technical barrier; it is just being a bit slow. In those scenarios, Ofcom will tell the service to get on with it or it will get a Clause 111 notice. In those circumstances, in most cases the service will just get on with it, so Ofcom will be using the threat of the notice as a way to encourage the slow coaches. That is pretty unexceptional; it will work in a pretty straightforward way. I think the most common use of these notices may be to bring outliers into the pack of those who are following best practice. In many cases Ofcom may not even need to issue a formal notice and will not get past the warning notice stage: waving a warning notice in front of a provider may be sufficient to get it to move.
The second scenario is one where the provider equally does not object to the use of the technology but would prefer to have a notice before it implements it. Outside the world of tech companies, it may seem a little strange that a provider would want to be ordered to do something rather than doing the right thing voluntarily, but we have to remember that the use of this kind of technology is legally fraught in many jurisdictions. There have been court cases in a number of places, not least the European Union, where people will challenge whether you should use this technology on unencrypted services, never mind encrypted ones. In those cases, you can imagine that there will be providers, particularly those established outside the United Kingdom, which may say, “Look, we are fine implementing this technology, but Ofcom, please can you give us a notice? Then, when someone challenges it in court, we can say that the UK regulator made us do it”. That would be helpful to them. This second group will want a notice, and here we will get to the point of the notice being issued. They are not going to contest it; they want the notice because it gives them some kind of legal protection.
I think those two groups are relatively straightforward: we are dealing with companies which are being slow or are looking for legal cover but do not fundamentally object. The third scenario, though, is the most challenging and it is where I think the Government could get into real trouble. My amendments seek to help the Government in situations where a provider fundamentally objects to being ordered to deploy a particular technology because it believes that that technology will create real privacy threats and risks to the service that it offers. I do not think the provider is being awkward in these circumstances; it has genuine concerns about the implications of the technology being developed or which it is being instructed to deploy.
In these circumstances, Ofcom may have all the reasons in the world to argue why it thinks that what it is asking for is reasonable. However, the affected provider may not accept those reasons; it may take quite a strong counterview and have all sorts of other arguments as to why what it is being asked to do is unacceptable and too high-risk. This debate is swirling around at the moment as we think about current models of end-to-end encryption and client-side scanning technology, but we need to recognise that this Bill is going to be around for a while and there may be all sorts of other technologies ordered to be deployed that we do not even know about and that have not even been developed yet. At any point, we may hit this impasse where Ofcom is saying it thinks it is perfectly reasonable to order a company to do it and the service provider is saying, “No, as we look at this, our experts and our lawyers are telling us that this is fundamentally problematic from a privacy point of view”.
The amendments I have tabled do not stop Ofcom from issuing any kind of order for any kind of technology. In Amendment 258, we are saying that, where there is a disagreement, there should be a point at which the public can join the debate. However, I really want to focus on Amendment 258ZA, where we are saying that, in those circumstances, the provider should have a statutory right to refer the notice to the Information Commissioner’s Office. We will try to press this to a vote, so I hope people are listening to the argument, because I think it is well intended. Giving providers the right to refer a notice to the Information Commissioner’s Office is not only the sensible thing to do but will help the Government if they are trying to get a company to deploy a technology: their case is strengthened if they have made it clear that this important check and balance is in place.
As we have discussed a lot through this debate, these are safety/privacy trade-offs. In some cases, it is a real trade-off—more privacy can sometimes compromise safety because people are able to do things in private that are problematic. At the same time, more privacy can sometimes be a benefit for safety if it protects you from fraudsters. But there certainly are occasions where there are genuine safety/privacy trade-offs. In this Bill, we are charging Ofcom with creating the highest level of safety for the people in the UK when they are online. Ofcom will be our safety regulator and that is its overriding duty. I am sure the Minister will argue it also has to think about privacy and other things, but if you look at the Bill in total, Ofcom’s primary responsibility is clear: it is safety, particularly child safety.
We have created the Information Commissioner’s Office as the guardian of our privacy rights and tasked it with enforcing a body of data protection law, so Ofcom is our primary safety regulator and the ICO is our primary privacy regulator. It seems to me entirely sensible and rational to say that, if our safety regulator is ordering a provider to do something on the grounds of safety and the provider thinks it is problematic, the provider should be able to go to our privacy regulator and say, “You two regulators both have a look at this. Both come to a view and, on the basis of that, we can decide what to do”.
That is really what Amendment 258ZA is intended to do. It does not intend to handcuff Ofcom in any way or stop it doing anything it thinks is important. It does not intend to frustrate child safety; it simply intends to make sure that there is a proper balance in place so that, where a provider has genuine concerns, it can go to the regulator that we have charged with being responsible for privacy regulation.
The reason that this is important and why we are spending a little more time on it today is that there is a genuine risk that services being used by millions of people in the United Kingdom could leave. We often focus on some of the more familiar brands such as WhatsApp and others, but we need to remember things such as iMessage. If you use an Apple phone and use iMessage, that is end-to-end encrypted. That is the kind of service that could find it really problematic. Apple has said, “Privacy is all”, and is going to be thinking about the global market. If it was ordered to deploy a technology which it thought was unsafe, Apple would have to think very carefully about being in the UK market.
To say that Apple has a right to go to the ICO and ask for a review is perfectly reasonable and sensible. I suspect that the Minister may try to argue that the skilled person’s concession that the Government have made—which is helpful and material—could involve a review by data protection officials, but that is not the same as getting an authoritative decision from the privacy regulator, the Information Commissioner’s Office. It is helpful, and those amendments are welcome; I would say that the skilled person’s report is necessary but far from sufficient in these circumstances, where there is that fundamental difference of view.
If I can try to sell it to the Government: if they accept this amendment, and if Ofcom says it needs this technology to be deployed for safety reasons and the ICO, having looked at it as the privacy regulator, has no objections, then the onus is on the company. It then looks as though the provider, if it chooses to leave the United Kingdom market, is doing so voluntarily, because there are no fundamental objections.
If, on the other hand, the ICO says it is problematic, we know that we need to carry on discussing and debating whether that technology is appropriate and whether the safety/privacy balance has been got right. So, whether you support more scanning of content or are concerned about it, allowing the providers of the services that we all use every day, in circumstances where they think there is a fundamental threat, to go to our privacy regulator, which we have set up precisely to guard our privacy rights, and ask it for an opinion is not, I think, an excessive ask. I hope that the Government will either accept the amendment or make a commitment to bring in something comparable at Third Reading. Absent that, we feel that this is important enough that we should test the opinion of the House in due course.
My Lords, I have put my name to and support Amendment 255, laid by the noble Lord, Lord Moylan, which straightforwardly means that a notice may not impose a requirement relating to a service that would require the provider to weaken or remove end-to-end encryption. It is very clear. I understand that the other amendments introduce safeguards, which is better than nothing. It is not what I would like, but I will support them if they are pushed to a vote. I think that the Government should seriously consider not going anywhere near getting rid of encryption in this Bill, and should reconsider it by the time we get to Third Reading.
As the noble Lord, Lord Moylan, explained, this is becoming widely known about now, and it is causing some concern. If passed, this Bill as it stands gives Ofcom far-reaching powers to force services, such as WhatsApp, to install software that would scan all our private messages to see whether there is evidence of terrorism or child sexual exploitation and abuse content and would automatically send a report to third parties, such as law enforcement, if it suspects wrongdoing—all without the consent or control of the sender or the intended recipient.
I would just like to state that encryption is a wonderful innovation. That is why more than 40 million people in the UK use it every day. It ensures that our private messages cannot be viewed, compromised or altered by anyone else, not even the providers of chat services. Reading them would really require somebody to hand them over to a journalist and say, “You can have my WhatsApp messages for anyone to read”; beyond that, you cannot read them.
One of the interesting things that we have discussed throughout the passage of the Bill is technologies, their design and functionality, and making sure they are not harmful. Ironically, it is the design and function of encryption that actually helps to keep us safe online. That is why so many people talk about civil libertarians, journalists and brave dissenters using it. For the rest of us, it is a tool to protect our data and private communications in the digital realm. I just want to point out the irony that the client-side scanning technologies being proposed are the technologies that are potentially harmful, because they are, as people have noted, the equivalent of putting video cameras in our homes, listening in to every conversation and sending reports to the police if we discuss illicit topics. As I have said before, while child sexual abuse is horrendous and vile, we know that it happens largely in the home and, as yet, the Government have not advocated that we film in everybody’s home in order to stop child sexual abuse. We should do almost anything and everything that we can, but I think this is the wrong answer.
Focusing on encryption just makes no sense. The Government have made exemptions, recognising the importance, in a democracy, of private correspondence: so exempted in the Bill are text messages, MSN, Zoom, oral commentaries and email. It seems perverse to demonise encryption in this way. I also note that there are exemptions for anything sent on message apps by law enforcement or public sector or emergency responders. I appreciate that some government communications are said to be done over apps such as WhatsApp. It seems then that the target of this part of the Bill is UK private citizens and residents and that the public are seen as the people who must be spied on.
In consequence, I do not think it surprising that more than 80 national or international civil society organisations have said that this would make the UK the first liberal democracy to require the routine scanning of people’s private chat messages. What does the Minister say to the legal opinion from the technology barrister Matthew Ryder KC, commissioned by Index on Censorship precisely on this part of the Bill? He compares this to law enforcement wiretapping without a warrant and says that the Bill will grant Ofcom a wider remit of surveillance powers over the public than GCHQ has.
Even if the Minister is not interested in lawyers or civil libertarians, surely we should be listening to the advice of science and technology experts in relation to complex technological solutions. Which experts have been consulted? I noted that Matthew Hodgson, the boss of the encrypted messaging app Element, has said wryly that
“the Government has not consulted with UK tech firms, only with huge multinational corporations and companies that want to sell software that scans messages, who are unsurprisingly telling lawmakers that it is possible to scan messages without breaking encryption”.
The problem is that it is not possible to scan those messages without breaking encryption. It is actually misinformation to say otherwise. That is why whole swathes of leading scientists and technologists from across the globe have recently put out an open letter explaining why and how it is not true. They explained that it creates really dangerous side-effects which, as the noble Lord, Lord Moylan, explained, harm security and make the online world less safe for many of us. Existing scanning technologies are flawed and ineffective, and scanning will nullify the purpose of encryption. I refer noble Lords to the work of the Internet Society and the academic paper Bugs in Our Pockets: The Risks of Client-Side Scanning for more details on all the peer-reviewed work.
I understand that, given the horrific nature of child sexual abuse—and, of course, terrorism, but I shall concentrate on child sexual abuse because the Bill is so concerned with it—it can be tempting for the Government to hope that there is a technological silver bullet to eradicate it. But the evidence suggests otherwise. One warning from scientists is that scanning billions of pieces of content could lead to millions of false positives, which could not only falsely implicate innocent users but mean that the police become overwhelmed, diverting valuable resources away from real investigations into child sexual abuse.
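To see why that warning is plausible, consider a rough, purely illustrative calculation; the message volume and error rate below are hypothetical assumptions for the sake of the example, not figures from the Bill, from Ofcom or from any provider:

\[
\underbrace{10^{10}}_{\text{messages scanned per day}} \times \underbrace{0.001}_{\text{assumed false-positive rate}} = 10^{7} \ \text{false flags per day}
\]

On those assumed numbers, even a classifier that looks highly accurate—99.9% in this sketch—would generate ten million false flags a day; and because genuine abuse material is a tiny fraction of all messages, the flags would be dominated by false positives.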
A study by the Max Planck Institute for the Study of Crime of a similar German law, in force from 2008 to 2010, found that giving the German police access to huge amounts of data had no deterrent effect and did not assist in solving crimes or increasing convictions, but did waste a lot of police time. So it is important that this draconian invasion of privacy is not presented as necessary for protecting children. I share the exasperation of Signal’s president Meredith Whittaker, who challenged the Secretary of State and pointed out that there are some double standards here: for example, slashing early intervention programmes over the past decade did not help protect children, and chronically underfunding and underresourcing child social care does not help.
My own bugbears are these: when I, having talked to social workers and colleagues, raised the dangers to child protection of closing down schools in lockdown, those concerns were brushed to one side. When I and others raised the horrors of the young girls who had been systematically raped by grooming gangs and ignored by the authorities for many, many years, I was told to stop talking about it. There are real threats to children that we ignore. I do not want us in this instance to use that very emotive discussion to attack privacy.
I also want to stress that there is no complacency here. Law enforcement agencies in the UK already possess a wide range of powers to seize devices and compel passwords, and even covertly to monitor and hack accounts to identify criminal activity. That is good. Crucially, private messaging services can and do—I am sure they could do more—work in a wide range of ways to tackle abuse and keep people safe without the need to scan or read people’s messages.
I did listen to the noble Lord, Lord Stevenson, saying, “Don’t speak too long”. Noble Lords will be delighted to know that I did not speak on any other group so that I could make these points—I spoke very briefly to agree with the noble Lord, but that was for one minute. I cannot stress enough that while I have talked a lot about freedom of speech, it is hugely important that we do not jeopardise the public’s privacy online by falsely claiming that it will protect children. It will also see the end of Rishi Sunak’s dream of the UK becoming a technology superpower, in the way that the noble Lord, Lord Allan of Hallam, explained. It is not good for the growth agenda because those organisations—WhatsApp and so on—will leave the UK, but, largely, it is in defence of privacy that I urge noble Lords to support the amendment from the noble Lord, Lord Moylan, or vote for whichever is moved.
My Lords, I rise to speak in favour of my noble friend Lord Moylan’s amendment. Given that I understand he is not going to press it, and while I see Amendment 255 as the ideal amendment, I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, for their Amendments 256, 257 and 259, and the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, for Amendments 258 and 258ZA.
I will try to be as brief as I can. I think about two principles—unintended consequences and the history of technology transfer. The point about technology transfer is that once a technology is used it becomes available to other people quickly, even bad guys, whether that was intended or not. There is obviously formal technology transfer, where you have agreement or knowledge transfer via foreign investment, but let us think about the Cold War and some of the great technological developments—atomic secrets, Concorde and the space shuttle. In no time at all, the other side had that access, and that was before the advent of the internet.
If we are to open a door for access to encrypted messages, that technology will be available to the bad guys in no time at all, and they will use it against dissidents, many of whom will be in contact with journalists and human rights organisations in this country and elsewhere. Therefore, the unintended consequence may well be that in seeking to protect children in this country by accessing encrypted messages or unencrypted messages, we may well be damaging the childhoods of children in other countries when their parents, who are dissidents, are suddenly taken away and maybe the whole family is wiped out. Let us be careful about those unintended consequences.
I also welcome my noble friend Lord Parkinson’s amendments about ensuring journalistic integrity, such as Amendment 257D and others. They are important. However, we must remember that once these technologies are available, everyone has a price and that technology will be transferred to the bad guys.
Given that my noble friend Lord Moylan will not press Amendment 255, let us talk about some of the other amendments—I will make some general points rather than go into specifics, as many noble Lords have raised these points. These amendments are sub-optimal, but at least they provide some accountability for Ofcom’s use of this power and some assurance that it will be used sensibly and proportionately. One of the questions that has run throughout this Bill and other Bills is “who regulates the regulators?”, and how we ensure that regulators are accountable. The amendments proposed by the noble Lords, Lord Stevenson and Lord Clement-Jones, and by the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, go some way towards ensuring that safeguards are in place. If the Government are not prepared to make an explicit statement that they will not allow access to encrypted messages, I hope that there will be some support for the noble Lords’ amendments.
My Lords, I promise to speak very briefly. I welcome the Government’s amendments. I particularly welcome that they appear partly to mirror some of the safeguards that are embedded in the Investigatory Powers Act 2016.
I have one question for my noble friend the Minister about the wording, “a skilled person”. I am worried that “a skilled person” is a very vague term. I have been struck, all through the course of this Bill, by the comparison with the Investigatory Powers Act and the need to think carefully about how we balance the importance of privacy with the imperative of protecting our children and being able to track down the most evil and wicked perpetrators online. That is very similar to the debates that we had here several years ago on the Investigatory Powers Act.
The IPA created the Technical Advisory Board. It is not a decision-making body. Its purpose is to advise the Investigatory Powers Commissioner and judicial commissioners on the impact of changing technology and the development of techniques to use investigatory powers while maintaining privacy. It is an expert panel constituted to advise the regulator—in this case, the judicial commissioner—specifically on technology interventions that must balance this really difficult trade-off between privacy and child protection. Why have we not followed the same recipe? Rather than having a skilled person, why would we not have a technology advisory panel of similar standing, where it is clear to all who the members are? Those members would be required to produce a regular report. It might not need to be as regular as the IPA one, but it would just take what the Government have already laid one step further towards institutionalising the independent check that is really important if these Ofcom powers were ever to be used.
My Lords, I added my name to some amendments on this issue in Committee. I have not done so on Report, not least because I have been so occupied with other things and have not had the time to focus on this. However, I remain concerned about this part of the Bill. I am sympathetic to my noble friend Lord Moylan’s Amendment 255, but listening to this debate and studying all the amendments in this group, I am a little confused and so have some simple questions.
First, I heard my noble friend the Minister say that the Government have no intention to require the platforms to carry out general monitoring, but is that now specific in any of the amendments that he has tabled? Regarding the amendments which would bring further safeguards around the oversight of Ofcom’s use of this power, like my noble friend Lady Harding, I have always been concerned that the oversight approach should be in line with that for the Investigatory Powers Act and could never understand why it was not in the original version of the Bill. Like her, I am pleased that the Government have tabled some amendments, but I am not yet convinced that they go far enough.
That leads me to the amendments that have been tabled by the noble Lords, Lord Stevenson and Lord Clement-Jones, and particularly that in the name of the noble Lord, Lord Allan of Hallam. As his noble friend Lord Clement-Jones has added his name to it, perhaps he could answer my question when he gets up. Would the safeguards that are outlined there—the introduction of the Information Commissioner—meet the concerns of the big tech companies? Do we know whether it would meet their needs and therefore lead them not to feel it necessary to withdraw their services from the UK? I am keen to understand that.
There is another thing that might be of benefit for anyone listening to this debate who is not steeped in the detail of this Bill, and I look to any of those winding up to answer it—including my noble friend the Minister. Is this an end to end-to-end encryption? Is that what is happening in this Bill? Or is this about ensuring that what is already permissible in terms of the authorities being able to use their powers to go after suspected criminals is somehow codified in this Bill to make sure it has proper safeguards around it? That is still not clear. It would be very helpful to get that clarity from my noble friend, or others.
My Lords, it is a pleasure to follow the noble Baroness, Lady Stowell. My noble friend has spoken very cogently to Amendment 258ZA, and I say in answer to the question posed by the noble Baroness that I do not think this is designed to make big tech companies content. What it is designed to do is bring this out into the open and make it contestable; to see whether or not privacy is being invaded in these circumstances. To that extent it airs the issues and goes quite a long way towards allaying the concerns of those 80 organisations that we have heard from.
I am not going to repeat all the arguments of my noble friend, but many noble Lords, not least on the opposite Benches, have taken us through some of the potential security and privacy concerns which were also raised by my noble friends, and other reasons for us on these Benches putting forward these amendments. We recognise those concerns, and indeed we recognise concerns on both sides. We have all received briefs from the NSPCC and the IWF, but I do not believe that what is being proposed in our amendments, or indeed in the amendments put forward by the noble Lord, Lord Stevenson, is designed in any way to prevent Ofcom doing its duty in relation to child sexual abuse and exploitation material in private messaging. We believe that review by the ICO to ensure that there is no invasion of privacy is a very useful mechanism.
We have all tried to find solutions and the Minister has put forward his stab at this with the skilled persons report. The trouble is, that does not go far enough, as the noble Baroness, Lady Stowell, said. Effectively, Ofcom can choose the skilled person and what the skilled person is asked to advise on. It is not necessarily comprehensive and that is essentially the major flaw.
As regards the amendments put forward by the noble Lord, Lord Stevenson, it is interesting that the Equality and Human Rights Commission itself said:
“We are concerned by the extent and seriousness of CSEA content being shared online. But these proposed measures may be a disproportionate infringement on millions of individuals’ right to privacy where those individuals are not suspected of any wrongdoing”.
It goes on to say:
“We recommend that Ofcom should be required to apply to an independent judicial commissioner—as is the case for mass surveillance under the Investigatory Powers Act”.
I am sure that is the reason why the noble Lord, Lord Stevenson, put forward his amendments; if he put them to a vote, we would follow and support. Otherwise, we will put our own amendments to the House.
My Lords, this has been—since we first got sight of the Bill and right the way through—one of the most difficult issues to try to find balance and a solution. I know that people have ridiculed my attempt to try and get people to speak less in earlier amendments. Actually, in part it was so we could have a longer debate here—so the noble Lord, Lord Moylan, should not be so cross with me, and I hope that we can continue to be friends, as we are outside the Chamber, on all points, not just this one.
Talk is not getting us to a solution on this, unfortunately. I say to the Minister: I wonder whether there is a case here for pausing a little bit longer on this, because I still do not think we have got to the bottom of where the balance lies. I want to explain why I say that, because, in a way, I follow the noble Baroness, Lady Stowell, in worrying that there are some deeper questions here that we have not quite got the answers to. Nothing in the current amendments gets us to quite the right place.
I started by thinking that, because Ofcom was being placed in the position of both being part of the regulatory process and having the right to interpose itself wherever this issue about encryption came up, it needed the safety of an external judicial review along the lines of the current RIPA system. That has led to my Amendments 256, 257 and 259, which try to distil that sensibility into a workable frame for the Bill and these issues. I will not push them to a vote. They are there because I wanted the discussion to include a proper look at what the RIPA proposal would look like in practice.
In the intervening period between Committee and today, the Government have also laid their amendments, which the noble Lord, Lord Parkinson, has introduced very well. I like a lot of them; they go a long way down the track to where we want to be on this. I particularly like the protection for journalistic content, which I think was lacking before. The way of introducing the skilled person into it, and the obtaining of an externally sourced report—even if it is internally commissioned—does bring in an outside voice, which will be helpful. The addition of privacy to that makes sure that the issues addressed will be the ones that need to be bottomed out before any decisions are taken.
It is helpful to state—again, repeated, and implicit in the amendments, but it is nice to see it again—that this is not about Ofcom as the regulator looking at people’s messages. It is clearly about making sure that there is the capacity to investigate criminality where it is clearly suspected and where evidence exists for that, and that the companies themselves have the responsibility for taking the action as a result. That is a good place to be, and I think it is right. The difficulty is that I do not think that that sensibility quite takes the trick about what society as a whole should expect from the regulator in relation to this particular activity. It leaves open the question of how much external supervision there would really be.
If we accept that it is not about Ofcom reading private messages or general monitoring—which I note the Minister has confirmed today; he was a bit reluctant to do so in Committee, but I am grateful to him for repeating it today—where is it in the statute? That is a good question: perhaps we could have an answer to it, because I do not think it does appear. It is important to reassure people that there is nothing in this set of proposals, and nothing in the Bill, that requires Ofcom to generally survey the material which is passing through the system. I still think people are concerned that this is about breaking end-to-end encryption, and that this is an attack on privacy which would be a very bad step. If that remains the situation and the Government have failed to convince, that therefore reinforces my suggestion to the Minister that at the end of this debate he might feel it necessary to spend a little more time and discussion trying to get us to where we want to go.
I am very grateful to those who have suggested that our amendments are the right way to go. As I have said, I will not be pushing them—the reasons being that I think they go a little too far, but a little more of that would not be a bad thing. The Government are almost there with that, but I think a bit more time, effort and concern about some of the suggestions would probably get us to a better place than we are at the moment. I particularly think that about those from the noble Baroness, Lady Harding, about taking the lessons from what has happened in other places and trying to systematise that so it is clear that there are external persons and we know who they are, what their backgrounds are and what their roles will be. I look forward to hearing from the Minister when he comes to respond, but, just for confirmation, I do not think this is the appropriate place to vote, and should a vote be called, we will be abstaining.
I am grateful to noble Lords for their further scrutiny of this important but complex area, and for the engagement that we have had in the days running up to it as well. We know how child sexual exploitation and abuse offenders sadly exploit private channels, and the great danger that this poses, and we know how crucial these channels are for secure communication. That is why, where necessary and proportionate, and where all the safeguards are met, it is right that Ofcom can require companies to take all technically feasible measures to remove this vile and illegal content.
The government amendments in this group will go further to ensure that a notice is well informed and targeted and does not unduly restrict users’ rights. Privacy and safety are not mutually exclusive—we can and must have both. The safety of our children depends on it.
I make it clear again that the Bill does not require companies to break or weaken end-to-end encryption on their services. Ofcom can require the use of technology on an end-to-end encrypted service only when it is technically feasible and has been assessed as meeting minimum standards of accuracy. When deciding whether to issue a notice, Ofcom will engage in continual dialogue with the company and identify reasonable, technically feasible solutions to the issues identified. As I said in opening, it is right that we require technology companies to use their considerable resources and expertise to develop the best possible protections to keep children safe in encrypted environments. They are well placed to innovate to find solutions that protect both the privacy of users and the safety of children.
Just to be clear, am I right to understand my noble friend as saying that there is currently no technology that would be technically acceptable for tech companies to do what is being asked of them? Did he say that tech companies should be looking to develop the technology to do what may be required of them but that it is not currently available to them?
For clarification, if the answer to that is that the technology does not exist—which I believe is correct, although there are various snake oil salespeople out there claiming that it does, as the noble Baroness, Lady Fox of Buckley, said—my noble friend seems to be saying that the providers and services should develop it. This seems rather circular, as the Bill says that they must adopt an approved technology, which suggests a technology that has been imposed on them. What if they cannot and still get such a notice? Is it possible that these powers will never be capable of being used, especially if they do not co-operate?
To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.
While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.
Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation and abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.
I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.
The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.
Will the Minister write to noble Lords who have been here in Committee and on Report in response to the fact that it is not just encryption companies saying that the demands of this clause will lead to the breaching of encryption, even though the Minister and the Government keep saying that it will not? As I have indicated, a wide range of scientists and technologists are saying that, whatever is said, demanding that Ofcom insists that technology notices are used in this way will inadvertently lead to the breaking of encryption. It would be useful if the Government at least explained scientifically and technologically why those experts are wrong and they are right.
I am very happy to put in writing what I have said from the Dispatch Box. The noble Baroness may find that it is the same, but I will happily set it out in further detail.
I should make it clear that the Bill does not permit law enforcement agencies to access information held on platforms, including access to private channels. The National Crime Agency will be responsible for receiving reports from in-scope services via secure transmission, processing these reports and, where appropriate, disseminating them to other UK law enforcement bodies and our international counterparts. The National Crime Agency will process only information provided to it by the company; where it determines that the content is child sexual abuse content and meets the threshold for criminality, it can request further information from the company using existing powers.
I am glad to hear that my noble friend Lord Moylan does not intend to divide on his amendment. The restrictions it sets out are not ones we should impose on the Bill.
Amendments 256, 257 and 259 in the name of the noble Lord, Lord Stevenson of Balmacara, require a notice to be approved by a judicial commissioner appointed under the Investigatory Powers Act 2016 and remove Ofcom’s power to require companies to make best endeavours to develop or source new technology to address child sexual exploitation and abuse content.
The Investigatory Powers Act and this Bill are very different regimes which should not be conflated. The Investigatory Powers Act comprehensively sets out the powers of public bodies, including the UK’s intelligence agencies and police, to access communications and content data, and establishes safeguards and restrictions for that. This Bill, by contrast, is a risk-based regime requiring companies to take responsibility for the harms facilitated by their own services. The powers in Clause 111 can require a company to tackle the huge volume of child sexual exploitation and abuse that is, sadly, manifested on private channels. This does not supplement or overlap with the roles or powers of the security services or the police, which are provided for by the Investigatory Powers Act.
It is right that these two regimes are overseen by two different, independent regulators. It would not be appropriate for the judicial commissioners to oversee Ofcom’s work. More importantly, it is not necessary: the Bill already contains robust safeguards requiring Ofcom to consider large quantities of information to allow for evidence-based decision-making. The government amendments in this group, which I have spoken to, further strengthen those safeguards.
In removing the “best endeavours” power, the noble Lord’s amendments would significantly reduce the capacity for a notice to be flexible and pragmatic. The power allows Ofcom to require companies to innovate and design practical, proportionate solutions that are compatible with their own service. Without this power, Ofcom could be left with the choice of requiring the use of incompatible and ineffective technologies, or doing nothing and allowing child abuse material to continue to proliferate on a service. Removing this power would not be in anyone’s interest.
I turn now to the amendments in the name of the noble Lord, Lord Allan of Hallam, which seek to introduce a public consultation before Ofcom issues a notice, and a review of notices by the Information Commissioner’s Office. I recognise that the aim of Amendment 258 is to provide transparency, but it will not always be appropriate for Ofcom to share the details of a notice with a public audience. There is a high risk that its content and context could be exploited by criminals and used to further illegal activities online. We agree, however, that Ofcom must be as transparent as possible in carrying out its functions. That is why Ofcom must report on its use of Clause 111 powers in an annual report. That will ensure that key facts about Ofcom’s decisions are placed in the public domain. In addition, Ofcom is required by the Communications Act, when carrying out its functions, to have regard to the principles under which regulatory activities should be transparent and accountable.
On Amendment 258ZA, on which the noble Lord said he may test the opinion of your Lordships’ House, while he is right to emphasise the expertise of the Information Commissioner’s Office, I hope he will not seek to divide, because the requirement in his amendment is duplicative. I agree with him that it is important that Ofcom and the ICO work closely together. Ofcom is required to consult the ICO before producing guidance on how it will use its Clause 111 powers, and it may consult the Information Commissioner’s Office before issuing a notice, where necessary—for example, if a company has made representations about a proposed requirement.
Ofcom cannot take any action which breaches data protection and privacy legislation, nor can it require services to do so. That is already set out both in the Bill and in existing regulations. Should services wish a notice to be reviewed, the Bill already provides robust routes of appeal under Clause 151 via the Upper Tribunal, which will consider whether the regulator’s decisions have been made lawfully.
I will touch on the question raised by my noble friend Lady Harding of Winscombe—
I appreciate the tone of the Minister’s comments very much, but they are not entirely reassuring me. There is a debate going on out there: there are people saying, “We’ve got these fabulous technologies that we would like Ofcom to order companies to install”, and there are companies saying, “It would be disastrous and break encryption if we had to install them”. Those two positions are in direct contest. My amendment seeks to make sure that the conflict can be properly resolved. I do not think Ofcom on its own can ever do that, because Ofcom will always be defending what it is doing and saying, “This is fine”. So there has to be some other mechanism whereby people can say it is not fine and contest it. As I say, in this debate we are ignoring the fact that both camps are already out there: people saying, “We think you should deploy this”, and companies saying, “It would be disastrous if we did”. We cannot resolve that by just saying, “Trust Ofcom”.
To meet the expectation the noble Lord voiced earlier, I will indeed point out that Ofcom can consult the ICO as a skilled person if it wishes to. It is important that we square the circle and look at these issues. The ICO will be able to be involved in the way I have set out as a skilled person.
Before I conclude, I want to address my noble friend Lady Harding’s questions on skilled persons. Given that notices will be issued on a case-by-case basis, and Ofcom will need to look at the specific service design and existing systems of a provider to work out how a particular technology would interact with them, a skilled person’s report better fits this process by requiring Ofcom to obtain tailored advice rather than general technical advice from an advisory board. The skilled person’s report will be largely focused on the technical side of Ofcom’s assessment: that is to say, how the technology would interact with the service’s design and existing systems. In this way, it offers something similar to, but more tailored than, a technical advisory board. Ofcom already has a large and expert technology group, whose role it is to advise policy teams on new and existing technologies, to anticipate the impact of technologies and so on. It already has strong links with academia and with external researchers. A technical advisory board would duplicate that function. I hope that reassures my noble friend that the points she raised have been taken into account.
So I hope the noble Lord, Lord Allan, will not feel the need to divide—
Before the Minister finishes, I posed the question about whether, given the debate and issues raised, he felt completely satisfied that we had arrived at the right solution, and whether there was a case for withdrawing the amendment at this stage and bringing it back at Third Reading, having had further discussions and debate where we could all agree. I take it his answer is “no”.
I am afraid it is “no”, and if the noble Lord, Lord Allan, does seek to divide, we will oppose his amendment. I commend the amendments standing in my name in this group to the House.
Amendment 250B agreed.
Amendments 250C to 251
Moved by
250C: Clause 94, page 86, line 34, leave out paragraph (c)
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It omits words in Clause 94 (skilled person’s reports) because that new Clause now requires OFCOM to obtain a skilled person’s report before giving a provider a notice under Clause 111.
250D: Clause 94, page 86, line 41, at end insert—
“(2A) Section (Requirement to obtain skilled person’s report) requires OFCOM to exercise the power in subsection (3) for the purpose of assisting OFCOM in connection with a notice under section 111(1).”Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It inserts a signpost in Clause 94 (skilled persons’ reports).
251: Clause 94, page 87, line 39, at end insert—
“(iiia) section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2));”Member’s explanatory statement
This amendment ensures that OFCOM are able to require a skilled person’s report about a failure or possible failure to comply with the new duties to carry out assessments (see the new Clause proposed after Clause 11 in my name).
Amendments 250C to 251 agreed.
Amendment 252
Moved by
252: Clause 94, page 88, line 2, at end insert—
“(xiia) section (Disclosure of information about use of service by deceased child users) (deceased child users);” Member’s explanatory statement
This amendment has the effect that OFCOM may require a skilled person’s report in relation to compliance with the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendment 252 agreed.
Schedule 12: OFCOM’s powers of entry, inspection and audit
Amendments 252A to 252G
Moved by
252A: Schedule 12, page 228, line 4, at end insert—
“(4A) The power to observe the carrying on of the regulated service at the premises includes the power to view, using equipment or a device on the premises, information generated in real time by the performance of a test or demonstration required by a notice given under paragraph 3.”Member’s explanatory statement
This amendment ensures that during an inspection of a service, OFCOM have the power to observe a test or demonstration of which notice has been given.
252B: Schedule 12, page 228, line 7, leave out from “paragraph” to “is” in line 9 and insert “only so far as”
Member’s explanatory statement
This is a technical amendment consequential on the preceding amendment in my name.
252C: Schedule 12, page 228, line 15, leave out “or relevant documents to be produced,” and insert “relevant documents to be produced, or a relevant test or demonstration to be performed,”
Member’s explanatory statement
This amendment, and the next two amendments in my name, concern OFCOM giving advance notice to a provider that they will want to observe a test or demonstration during an inspection.
252D: Schedule 12, page 228, line 19, leave out “documents are “relevant” if they are” and insert “a document, test or demonstration is “relevant” if it is”
Member’s explanatory statement
See the explanatory statement to the preceding amendment in my name.
252E: Schedule 12, page 228, line 23, leave out “or the documents to be produced,” and insert “the documents to be produced, or the test or demonstration to be performed,”
Member’s explanatory statement
See the explanatory statement to the preceding amendment in my name.
252F: Schedule 12, page 229, line 3, at end insert—
“(da) to assist an authorised person to view, using equipment or a device on the premises, information demonstrating in real time the operation of systems, processes or features of a specified description, including functionalities or algorithms of a specified description;“(db) to assist an authorised person to view, using equipment or a device on the premises, information generated in real time by the performance of a test or demonstration of a specified description;”Member’s explanatory statement
This amendment makes it clear that the powers of OFCOM during an audit of a service extend to using equipment on the premises to view real time information showing the operation of the service or the performance of a test or demonstration, if specified in advance in the audit notice.
252G: Schedule 12, page 233, line 38, leave out paragraph (ii)
Member’s explanatory statement
This is a drafting change removing a redundant paragraph from the Bill.
Amendments 252A to 252G agreed.
Amendment 253 not moved.
Clause 105: Disclosure of information
Amendment 254
Moved by
254: Clause 105, page 94, line 33, at end insert—
“(3A) In subsection (3), after paragraph (h) insert—“(ha) a person appointed under—(i) paragraph 1 of Schedule 3 to the Coroners and Justice Act 2009, or(ii) section 2 of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.));(hb) the procurator fiscal, within the meaning of the enactment mentioned in subsection (5)(s);”.(3B) In subsection (5)—(a) before paragraph (d) insert—“(ca) the Coroners Act (Northern Ireland) 1959;”,(b) after paragraph (na) insert—“(nb) Part 1 of the Coroners and Justice Act 2009;”, and(c) after paragraph (r) insert—“(s) the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).”.”Member’s explanatory statement
This amendment ensures that it is not necessary for OFCOM to obtain the consent of providers of internet services before disclosing information to a coroner or, in Scotland, procurator fiscal, who is investigating a person’s death.
Amendment 254 agreed.
Clause 107: Provision of information to the Secretary of State
Amendments 254A and 254B
Moved by
254A: Clause 107, page 95, line 20, leave out “(2)” and insert “(3)”
Member’s explanatory statement
This is a technical drafting change needed because section 24B of the Communications Act 2003 has been amended after this Bill was introduced.
254B: Clause 107, page 95, leave out line 21 and insert—
“(4) Subsection (2) does not apply to information—”Member’s explanatory statement
This is a technical drafting change needed because section 24B of the Communications Act 2003 has been amended after this Bill was introduced.
Amendments 254A and 254B agreed.
Clause 111: Notices to deal with terrorism content or CSEA content (or both)
Amendment 255 not moved.
Amendment 255A
Moved by
255A: Clause 111, page 98, line 8, at end insert—
“(za) section (Requirement to obtain skilled person’s report), which requires OFCOM to obtain a skilled person’s report before giving a notice under subsection (1),”Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It inserts a signpost to the requirement in that new Clause to obtain a skilled person’s report before giving a provider a notice under Clause 111.
Amendment 255A agreed.
Amendment 256 not moved.
Amendment 256A
Moved by
256A: After Clause 111, insert the following new Clause—
“Requirement to obtain skilled person’s report
(1) OFCOM may give a notice under section 111(1) to a provider only after obtaining a report from a skilled person appointed by OFCOM under section 94(3).(2) The purpose of the report is to assist OFCOM in deciding whether to give a notice under section 111(1), and to advise about the requirements that might be imposed by such a notice if it were to be given.”Member’s explanatory statement
This amendment requires OFCOM to obtain a skilled person’s report under Clause 94 before giving a notice to a provider under Clause 111.
Amendment 256A agreed.
Amendment 257 not moved.
Clause 112: Warning notices
Amendments 257A and 257B
Moved by
257A: Clause 112, page 98, line 24, at end insert—
“(za) contain a summary of the report obtained by OFCOM under section (Requirement to obtain skilled person’s report),”Member’s explanatory statement
This amendment requires a warning notice given to a provider to contain a summary of the skilled person’s report obtained by OFCOM under the new Clause proposed to be inserted in my name after Clause 111.
257B: Clause 112, page 98, line 37, at end insert—
“(za) contain a summary of the report obtained by OFCOM under section (Requirement to obtain skilled person’s report),”Member’s explanatory statement
This amendment requires a warning notice given to a provider to contain a summary of the skilled person’s report obtained by OFCOM under the new Clause proposed to be inserted in my name after Clause 111.
Amendments 257A and 257B agreed.
Clause 113: Matters relevant to a decision to give a notice under section 111(1)
Amendments 257C to 257F
Moved by
257C: Clause 113, page 99, line 32, at end insert—
“(ga) the contents of the skilled person’s report obtained as required by section (Requirement to obtain skilled person’s report);”Member’s explanatory statement
This amendment requires OFCOM to consider the contents of the skilled person’s report obtained as required by the new Clause proposed to be inserted in my name after Clause 111, as part of OFCOM’s decision about whether it is necessary and proportionate to give a notice to a provider under Clause 111.
257D: Clause 113, page 99, line 40, at end insert—
“(ia) in the case of a notice relating to a user-to-user service (or to the user-to-user part of a combined service), the extent to which the use of the specified technology would or might—(i) have an adverse impact on the availability of journalistic content on the service, or(ii) result in a breach of the confidentiality of journalistic sources;”Member’s explanatory statement
This amendment requires OFCOM to consider the impact of the use of technology on the availability of journalistic content and the protection of journalistic sources, as part of OFCOM’s decision about whether it is necessary and proportionate to give a notice to a provider under Clause 111.
257E: Clause 113, page 100, line 4, after “(i)” insert “, (ia)”
Member’s explanatory statement
This amendment is consequential on the preceding amendment of this Clause in my name.
257F: Clause 113, page 100, line 5, at end insert—
““journalistic content” has the meaning given by section 15;”Member’s explanatory statement
This amendment adds a definition of journalistic content to Clause 113.
Amendments 257C to 257F agreed.
Clause 114: Notices under section 111(1): supplementary
Amendment 258 not moved.
Amendment 258ZA
Moved by
258ZA: After Clause 114, insert the following new Clause—
“Review by the Information Commissioner of notices under Section 111(1)
(1) Where a provider believes that a notice it has been given under section 111(1) will have a material impact on the private communications of its users, it may request a review by the Information Commissioner.(2) The review must consider the compatibility of the notice with—(a) the Human Rights Act 1998,(b) the Data Protection Act 2018,(c) the Privacy and Electronic Communications (EC Directive) Regulations 2003, and(d) any other legislation the Information Commissioner considers relevant.(3) In carrying out the review, the Information Commissioner must consult—(a) OFCOM,(b) the provider,(c) UK users of the provider’s service, and (d) such other persons as the Information Commissioner considers appropriate.(4) Following a review under subsection (1) the Information Commissioner must publish a report including—(a) their determination of the compatibility of the notice with relevant legislation,(b) their reasons for making such a determination, and(c) their advice to OFCOM in respect of the drafting and implementation of the notice.”Member’s explanatory statement
This amendment would give providers a right to request an assessment by the ICO of the compatibility of a section 111 order with UK privacy legislation.
I wish to test the opinion of the House.
Clause 115: Review and further notice under section 111(1)
Amendment 258A
Moved by
258A: Clause 115, page 102, line 24, leave out “Section 112 (warning notices) does” and insert “Sections (Requirement to obtain skilled person’s report)(skilled person’s report) and 112 (warning notices) do”
Member’s explanatory statement
This amendment provides that, if OFCOM propose to issue a further notice under Clause 111, it is not necessary to obtain a further skilled person’s report under the new Clause proposed to be inserted in my name after Clause 111.
Amendment 258A agreed.
Clause 118: Interpretation of this Chapter
Amendment 259 not moved.
Clause 120: Requirements enforceable by OFCOM against providers of regulated services
Amendments 260 and 261
Moved by
260: Page 105, line 4, at end insert—
“Section (Assessment duties: user empowerment) Assessments related to duty in section 12(2)”
Member’s explanatory statement
This amendment ensures that OFCOM are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the new duties imposed by the Clause proposed after Clause 11 in my name.
261: Page 105, line 28, at end insert—
“Section (Disclosure of information about use of service by deceased child users) Information about use of service by deceased child users”
Member’s explanatory statement
This amendment ensures that OFCOM are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendments 260 and 261 agreed.
Clause 122: Confirmation decisions: requirements to take steps
Amendment 262
Moved by
262: Clause 122, page 107, line 7, leave out “for constraints on” and insert “in relation to”
Member’s explanatory statement
This amendment is consequential on the amendments of Clause 125 in my name.
Amendment 262 agreed.
Amendment 262A
Moved by
262A: Clause 122, page 107, line 17, at end insert—
“(ba) specify which of those requirements (if any) have been designated as CSEA requirements (see subsections (5A) and (5B)),”Member’s explanatory statement
This amendment is consequential on the next amendment to this Clause in my name.
My Lords, in moving Amendment 262A, I will speak also to the other government amendments in the group. These amendments address the Bill’s enforcement powers. Government Amendments 262A, 262B, 262C, 264A and 266A, Amendments 265, 266 and 267, tabled by my noble friend Lord Bethell, and Amendment 268 tabled by the noble Lord, Lord Stevenson of Balmacara, relate to senior management liability. Amendment 268C from the noble Lord, Lord Weir of Ballyholme, addresses interim service restriction orders.
In Committee, we amended the Bill to create an offence of non-compliance with steps set out in confirmation decisions that relate to specific children’s online safety duties, to ensure that providers and individuals can be held to account where their non-compliance risks serious harm to children. Since then, we have listened to concerns raised by noble Lords and others, in particular that the confirmation decision offence would not tackle child sexual exploitation and abuse. That is why the government amendments in this group will create a new offence of a failure to comply with a child sexual exploitation and abuse requirement imposed by a confirmation decision. This will mean that providers and senior managers can be held liable if they fail to comply with requirements to take specific steps as set out in Ofcom’s confirmation decision in relation to child sexual exploitation and abuse on their service.
Ofcom must designate a step in a confirmation decision as a child sexual exploitation and abuse requirement, where that step relates, whether or not exclusively, to a failure to comply with specific safety duties in respect of child sexual exploitation and abuse content. Failure to comply with such a requirement will be an offence. This approach is necessary, given that steps may relate to multiple or specific kinds of illegal content, or systems and process failures more generally. This approach will ensure that services know from the confirmation decision when they risk criminal liability, while providing sufficient legal certainty via the specified steps to ensure that the offence can be prosecuted effectively.
The penalty for this offence is up to two years in prison, a fine or both. Through Clause 182, where an offence is committed with the consent or connivance of a senior manager, or is attributable to his or her neglect, the senior manager, as well as the entity, will have committed the offence and can face up to two years in prison, a fine or both.
I thank my noble friend Lord Bethell, as well as our honourable friends Miriam Cates and Sir William Cash in another place, for their important work in raising this issue and their collaborative approach as we have worked to strengthen the Bill in this area. I am glad that we have reached a position that will help to keep children safe online and drive a change in culture in technology companies. I hope this amendment reassures them and noble Lords that the confirmation decision offence will tackle harms to children effectively by ensuring that technology executives take the necessary steps to keep children safe online. I beg to move.
My Lords, I will briefly comment positively on the Minister’s explanation of how these offences might work, particularly the association of the liability with the failure to comply with a confirmation decision, which seems entirely sensible. In an earlier stage of the debate, there was a sense that we might associate liability with more general failures to discharge a duty of care. That would have been problematic, because the duty of care is very broad and requires a lot of pieces to be put in place. Associating the offences with the confirmation decision makes absolute sense. Having been in that position, I can say that if, as an executive in a tech company, I received a confirmation decision that said, “You must do these things”, and I chose wilfully to ignore that decision, it would be entirely reasonable for me to be held potentially criminally liable for that. That association is a good step forward.
My Lords, I will speak to Amendment 268C, which is in my name and that of the noble Baroness, Lady Benjamin, who has been so proactive in this area. The amendment seeks to clarify the threshold for Ofcom to take immediate enforcement action when children are exposed to suicide, self-harm, eating disorders and pornographic materials. It would require the regulator to either take that action or at least provide an explanation to the Secretary of State within a reasonable timeframe as to why it has chosen not to.
When we pass the Bill, the public will judge it not simply on its contents but on its implementation, its enforcement and the speed of that enforcement. Regulatory regimes as a whole work only if the companies providing the material believe the regulator to be sufficiently muscular in its approach. Therefore, the test is not simply what is there but how long it will take for a notice, whenever it is issued, to lead to direct change.
I will give two scenarios to illustrate the point. Let us take the example of a video encouraging the so-called blackout challenge, or choking challenge, which went viral on social media about two years ago. For those who are unaware, it challenged children to choke themselves to the point at which they lost consciousness and to see how long they could do that. This resulted in the death of about 15 children. If a similar situation arises and a video is not removed because it is not against the terms and conditions of the service, does Ofcom allow the video to circulate for a period of, say, six months while giving a grace period for the platform to introduce age gating? What if the platform fails to implement that highly effective age verification? How long will it take to get through warnings, a provisional notice of contravention, a representation period, a confirmation decision and the implementation of required measures before the site is finally blocked? As I indicated, this is not hypothetical; it draws from a real-life example. We know that this is not simply a matter of direct harm to children; it can lead to a risk of death, and has done in the past.
What about, for example, a pornographic site that simply has a banner where a person can self-declare that they are over 18 in order to access it? I will not rehearse the dangers for children of early exposure to violent pornography, since they have been gone through a number of times: the impact on respectful relationships, which we know of from government reports, and particularly the risk of women being viewed as sex objects, with the additional sexual aggression towards women that this creates and perpetuates. Given that we are aware that large numbers of children have access to this material, surely it would be irresponsible to sacrifice another generation of children to a three-year implementation process.
I am sure the Minister will seek to give the assurance that Ofcom does indeed have the power under Clause 138 to act immediately by applying the interim service restrictions when the nature and severity of the content demands. We welcome that change, which has been put into the Bill, and the power contained within it.
What we are seeking to probe is the fact that, while the Bill provides that power, by its nature it does not provide a great deal of guidance to Ofcom and the courts on when they should consider that the threshold for interim disruption measures has been reached. Both the scenarios I have mentioned involve what the Bill designates as primary priority harms to children—that is, the most severe harms. The Bill now requires highly effective age verification or age estimation for such content precisely because we cannot allow any such risks.
Any breach risks severe consequences. Will the Minister confirm that any likely failure to comply with the duties in Clauses 11(3)(a) and 72(2) would reach the threshold of severity for Ofcom to apply for an immediate interim service restriction order rather than awaiting the conclusion of the normal processes? If the answer to that question is yes, as I hope it is, Ofcom should consider applying for an order in such circumstances. I therefore ask the Minister whether he would consider making that clear in the Bill or giving other assurances as to how that will be implemented. Furthermore, applications will have to be assessed through the courts applying the same criteria. When these things are brought before the courts, there is always a danger of them being subject to additional review or indeed an inconsistent approach, so it is important that we give the greatest amount of guidance we can through the legislation itself.
There is a clear public expectation that we will tackle these issues. Again, we are not dealing with hypothetical or unprecedented situations. We should look at what has happened in other jurisdictions. In July 2020, France introduced powers for its regulator, Arcom, to apply for blocking orders against relevant ISPs if age verification is not implemented within 15 days of notification. Similar legislation being considered in the Canadian Parliament gives 20 days. By contrast, there is a bit of a gap in the Bill because we have not set a clear timetable on expected compliance.
I stress that nothing in the amendment seeks to undermine the discretion and operational independence of Ofcom. There will be times when the regulator is best placed to understand the situation and act accordingly—perhaps there has been a technical failure—but it is important that Ofcom be held accountable for the decisions it makes on enforcement, so that it needs either to act or to explain why it is not acting. That is what the second paragraph of the amendment seeks to do. It states that, where Ofcom chooses not to apply for interim orders even though it is likely that, after 14 days, platforms are still allowing children to access primary priority content or pornographic content, it must provide written justification to the Secretary of State within a further period of seven days. So it does not require Ofcom to act but requires it at least to provide a justification for its decision.
Although we have reason to hope that Ofcom will act more swiftly under the Online Safety Bill, we are trying to judge this on the basis of previous experience. There is disappointment at times across the House at the slow progress in enforcing the video-sharing platform regime. It is nearly three years since that regime was introduced but we have still not seen the outcome of a single investigation against a platform. Greater communication and clarity throughout the process would go a huge way towards rebuilding that trust. I look forward to the Minister’s response, and I seek the assurances that lie at the heart of the amendment. On that basis, I commend the amendment to the House.
My Lords, I want to say “Hallelujah”. With this Bill, we have reached a landmark moment after the disappointments and obstacles that we have had over the last six years. It has been a marathon but we are now in the final straight with the finishing line in sight, after the extraordinary efforts by noble Lords on all sides of the House. I thank the Secretary of State for her commitment to this ground-breaking Bill, and the Minister and his officials for the effort they have put into it. The Minister is one of my “Play School” babies, who has done his utmost to make a difference in changing the online world. That makes me very happy.
We know that the eyes of the world are watching us because legislators around the world are looking for ways to extend the rule of law into the online world, which has become the Wild West of the 21st century, so it is critical that in our haste to reach the finishing post we do not neglect the question of enforcement. That is why I have put my name to Amendment 268C in the name of the noble Lord, Lord Weir: without ensuring that Ofcom is given effective powers for this task of unprecedented scale, the Bill we are passing may yet become a paper tiger.
The impact assessment for the Bill estimated that 25,000 websites would be in scope. Only last week, in an encouraging report by the National Audit Office on Ofcom’s readiness, we learned that the regulator’s own research has increased that estimate to 100,000, and the figure could be significantly higher. The report went on to point out that the great majority of those websites will be based overseas and will not have been regulated by Ofcom before.
The noble Lord, Lord Bethell, raised his concerns on the final day of Committee, seeking to amend the Bill to make it clear that Ofcom could take a schedule of a thousand sites to court and get them all blocked in one go. I was reassured when the Minister repeated the undertaking given by his counterpart in Committee in the other place that the Civil Procedure Rules already allow such multiparty claims. Will the Minister clarify once again that such enforcement at scale is possible and would not expose Ofcom to judicial review? That would give me peace of mind.
The question that remains for many is whether Ofcom will act promptly enough when children are at risk. I am being cautious because my experience in this area with regulators has led me not to assume that simply because this Parliament passes a law, it will be implemented. We all know the sorry tale of Part 3 of the Digital Economy Act, when Ministers took it upon themselves not to decide when it should come into force, but to ask whether it should at all. When they announced that that should be never, the High Court took a dim view and allowed judicial review to proceed. Interestingly, the repeal of Part 3 and the clauses that replaced it may not have featured in this Bill were it not for that case—I always say that everything happens for a reason. The amendment is a reminder to Ofcom that Parliament expects it to act, and to do so from the day when the law comes into force, not after a year’s grace period, six months or more of monitoring or a similar period of supervision before it contemplates any form of enforcement.
Many of the sites we are dealing with will not comply because this is the law; they will do so only when the business case makes compliance cheaper than the consequences of non-compliance, so this amendment is a gentle but necessary provision. If for any reason Ofcom does not think that exposing a significant number of children in this country to suicide, self-harm, eating disorder or pornographic content—which is a universal plague—merits action, it will need to write a letter to the Secretary of State explaining why.
We have come too far to risk the Bill not being implemented in the most robust way, so I hope my noble friends will join me in supporting this belt-and-braces amendment. I look forward to the Minister’s response.
My Lords, we welcome the government amendments in this group to bring child sexual exploitation and abuse failures into the scope of the senior manager liability and enforcement regime but consider that they do not go far enough. On the government amendments, I have a question for the Minister about whether, through Clause 122, it would be possible to require a company that was subject to action to do some media literacy as part of its harm reduction; in other words, would it be possible for Ofcom to use its media literacy powers as part of the enforcement process? I offer that as a helpful suggestion.
We share the concerns expressed previously by the noble Lord, Lord Bethell, about the scope of the senior manager liability regime, which does not cover all the child safety duties in the Bill. We consider that Amendment 268, in the name of my noble friend Lord Stevenson, would provide greater flexibility, giving the possibility of expanding the list of duties covered in the future. I have a couple of brief questions to add to my first question. Will the Minister comment on how the operation of the senior manager liability regime will be kept under review? This has, of course, been something of a contentious issue in the other place, so could the Minister perhaps tell your Lordships’ House how confident he is that the current position is supported there? I look forward to hearing from the Minister.
I did not quite finish writing down the noble Baroness’s questions. I will do my best to answer them, but I may need to follow up in writing because she asked a number at the end, which is perfectly reasonable. On her question about whether confirmation decision steps could include media literacy, yes, that is a good idea; they could.
Amendment 268, tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to enable the Secretary of State, through regulation, to add to the list of duties which are linked to the confirmation decision offence. We are very concerned at the prospect of allowing an unconstrained expansion of the confirmation decision offence. In particular, as I have already set out, we would be concerned about expansion to cover duties related to search services, and likewise about the unconstrained addition of any other duties related to user-to-user services.
We have chosen specific duties which will effectively tackle key issues related to child safety online and child sexual exploitation and abuse, while ensuring that the confirmation decision offence remains targeted. Non-compliance with a requirement imposed by a confirmation decision in relation to such duties warrants the prospect of criminal enforcement on top of Ofcom’s extensive civil enforcement powers. Making excessive changes to the offence risks shifting the regime towards a more punitive and disproportionate enforcement model, which would represent a significant change to the framework as a whole. Furthermore, expansion of the confirmation decision offence could lead to services taking an excessively cautious approach to content moderation to avoid the prospect of criminal liability. We are also concerned that such excessive expansion could significantly increase the burden on Ofcom.
I am grateful to the noble Lord, Lord Weir of Ballyholme, and the noble Baroness, Lady Benjamin, for the way they set out their Amendment 268C. We are concerned about this proposal because it is important that Ofcom can respond to issues on a case-by-case basis: it may not always be appropriate or proportionate to use a specific enforcement power in response to a suspected breach. Interim service restriction orders are some of the strongest enforcement powers in the Bill and will have a significant impact on the service in question. Their use may be disproportionate in cases where there is only a minor breach, or where a service is taking steps to deal with a breach following a provisional notice of contravention.
In contrast to applications for service restriction orders, which require that there is a continuing failure to comply with an enforceable duty, interim service restriction orders require only that it is likely that there is a failure. This provision is included so that steps can be taken quickly where the level of risk of harm to people relating to the likely failure, and the nature and severity of that harm, are such that it would not be appropriate to wait to establish the failure before applying for the order. While the duties specified by Amendment 268C are important, putting pressure on Ofcom pre-emptively to take a specific course of enforcement action which is aimed at addressing only particularly urgent, high-risk scenarios would be counter to the intention of the rest of the framework; that is, to enable an efficient, proportionate and targeted approach.
It is important that Ofcom takes proportionate steps. Clause 11(3)(a) imposes a duty to operate a service using proportionate systems and processes designed to prevent children of any age encountering, by means of the service, primary priority content that is harmful to them. As such, the prospective failure in relation to the duty may concern a particular aspect of the systems and processes that are in place. While these are important, a particular failure in relation to this systems-and-processes-focused duty might not necessarily warrant the drastic action of an interim court order requiring ancillary service providers to withdraw their services from a potentially non-compliant regulated service.
We are concerned that removing the proportionality ground from the court application for an interim service restriction order, and requiring Ofcom to justify to the Secretary of State its decision not to apply for an interim disruption order, would, in effect, pressure Ofcom to take an excessively punitive course of action. Likewise, in some cases it would be highly irregular to expect Ofcom to justify to the Secretary of State, in each particular case, its decision not to apply for an interim disruption order. I am, however, happy to reassure the noble Baroness, Lady Benjamin, again that business disruption measures will be able to operate effectively at scale and at sufficient speed. I hope that this provides enough reassurance to her and to the noble Lord, Lord Weir, that they are willing not to move their amendment, and I am grateful for the support that they, and others, voiced for the government amendments in this group.
Amendment 262A agreed.
Amendment 262AA not moved.
Amendments 262B and 262C
Moved by
262B: Clause 122, page 107, line 35, at end insert—
“(5A) If the condition in subsection (5B) is met in relation to a requirement imposed by a confirmation decision which is of a kind described in subsection (1), OFCOM must designate the requirement as a “CSEA requirement” for the purposes of section 127(2A) (offence of failure to comply with confirmation decision).(5B) The condition referred to in subsection (5A) is that the requirement is imposed (whether or not exclusively) in relation to either or both of the following—(a) a failure to comply with section 9(2)(a) or (3)(a) in respect of CSEA content, or in respect of priority illegal content which includes CSEA content; (b) a failure to comply with section 9(2)(b) in respect of an offence specified in Schedule 6 (CSEA offences), or in respect of priority offences which include such an offence.”Member’s explanatory statement
This amendment provides that where a confirmation decision imposes a requirement to take steps in relation to a failure to comply with a duty under Clause 9(2)(a), (2)(b) or (3)(a) in respect of CSEA content or an offence under Schedule 6, OFCOM must designate the requirement as a CSEA requirement with the result that failure to comply with it is an offence (see the amendment inserting subsection (2A) into Clause 127 in my name).
262C: Clause 122, page 107, line 44, at end insert—
““CSEA content”, “priority illegal content” and “priority offence” have the same meaning as in Part 3 (see section 53);”Member’s explanatory statement
This amendment is consequential on the preceding amendment to this Clause in my name.
Amendments 262B and 262C agreed.
Clause 125: Confirmation decisions: proactive technology
Amendments 263 and 264
Moved by
263: Clause 125, page 109, line 27, leave out “constraints on OFCOM’s power” and insert “what powers OFCOM have”
Member’s explanatory statement
This amendment is consequential on the next amendment in my name.
264: Clause 125, page 109, line 30, at end insert—
“(1A) A proactive technology requirement may be imposed in a confirmation decision if—(a) the decision is given to the provider of an internet service within section 71(2), and(b) the requirement is imposed for the purpose of complying with, or remedying the failure to comply with, the duty set out in section 72(2) (provider pornographic content).(1B) The following provisions of this section set out constraints on OFCOM’s power to include a proactive technology requirement in a confirmation decision in any case not within subsection (1A).”Member’s explanatory statement
This amendment has the effect that OFCOM may, in a confirmation decision, require a provider to use proactive technology if the purpose is to deal with non-compliance with Clause 72(2) (preventing children encountering provider pornographic content).
Amendments 263 and 264 agreed.
Clause 127: Confirmation decisions: offence
Amendment 264A
Moved by
264A: Clause 127, page 112, line 22, leave out “relates (whether or not exclusively) to” and insert “is imposed (whether or not exclusively) in relation to a failure to comply with”
Member’s explanatory statement
This is a technical amendment which adjusts the language of this provision.
Amendment 264A agreed.
Amendments 265 and 266 not moved.
Amendment 266A
Moved by
266A: Clause 127, page 112, line 27, at end insert—
“(2A) A person to whom a confirmation decision is given commits an offence if, without reasonable excuse, the person fails to comply with a CSEA requirement imposed by the decision (see section 122(5A) and (5B)).”Member’s explanatory statement
This amendment provides that a person commits an offence if the person fails to comply, without reasonable excuse, with a CSEA requirement imposed by a confirmation decision given to the person (see the amendment inserting new subsections (5A) and (5B) into Clause 122 in my name).
Amendment 266A agreed.
Amendments 267 and 268 not moved.
Schedule 13: Penalties imposed by OFCOM under Chapter 6 of Part 7
Amendments 268A and 268B
Moved by
268A: Schedule 13, page 236, line 12, leave out sub-paragraph (9) and insert—
“(9) Regulations made by OFCOM under section (Regulations by OFCOM about qualifying worldwide revenue etc)(1)(a) (including regulations making provision of a kind mentioned in section (Regulations by OFCOM about qualifying worldwide revenue etc)(3), (4) or (5)) apply for the purpose of determining the qualifying worldwide revenue of a provider of a regulated service for an accounting period as mentioned in this paragraph as they apply for the purpose of determining the qualifying worldwide revenue of a provider of a regulated service for a qualifying period for the purposes of Part 6.”Member’s explanatory statement
This amendment provides that regulations under the new Clause 76 proposed in my name about “qualifying worldwide revenue” for the purposes of Part 6 of the Bill (fees) also apply for the purposes of financial penalties under paragraph 4 of Schedule 13.
268B: Schedule 13, page 237, line 18, at end insert—
“(9) OFCOM may by regulations make provision about how the qualifying worldwide revenue of a group of entities is to be determined for the purposes of this paragraph.(10) Before making regulations under sub-paragraph (9) OFCOM must consult—(a) the Secretary of State,(b) the Treasury, and(c) such other persons as OFCOM consider appropriate.(11) Regulations under sub-paragraph (9) may make provision subject to such exemptions and exceptions as OFCOM consider appropriate.”Member’s explanatory statement
This amendment provides a power for OFCOM to make regulations setting out what is meant in paragraph 5 of Schedule 13 by references to the qualifying worldwide revenue of a group of entities.
Amendments 268A and 268B agreed.
Clause 134: Interim service restriction orders
Amendment 268C not moved.
Amendment 269 not moved.
Clause 141: Advisory committee on disinformation and misinformation
Amendments 269A and 269AA not moved.
Amendment 269B
Moved by
269B: Clause 141, page 128, line 19, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
Amendment 269B agreed.
Amendments 269C and 269D not moved.
Clause 143: Research about users’ experiences of regulated services
Amendment 270 not moved.
Amendment 270A
Moved by
270A: After Clause 144, insert the following new Clause—
“Establishment of the Advocacy Body for Children
(1) There is to be a body corporate (“the Advocacy Body for Children”) to represent the interests of child users of regulated services.(2) A “child user”—(a) means any person aged 17 years or under who uses or is likely to use regulated internet services, and(b) includes both any existing child user and any future child user.(3) The functions of the Advocacy Body for Children must include, in relation to regulated services—(a) representing the interests of child users;(b) the protection and promotion of those interests;(c) monitoring implications of this Act’s implementation for those interests;(d) consideration of children’s rights under the United Nations Convention on the Rights of the Child, including (but not limited to) their participation rights;(e) any other matter connected with those interests.(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—(a) safety duties about illegal content, in particular CSEA content,(b) safety duties protecting children,(c) children’s access assessment duties, and(d) other enforceable requirements relating to children. (5) The Advocacy Body for Children must—(a) have due regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010,(b) assess emerging threats to child users of regulated services and bring information regarding those threats to OFCOM, and(c) publish an annual report related to the interests of child users.(6) The Advocacy Body for Children may undertake research on its own account.(7) The Advocacy Body for Children is to be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.(8) To establish the Advocacy Body for Children, OFCOM must—(a) appoint an organisation or organisations known to represent all children in the United Kingdom to be designated with the functions under this section, or(b) create an organisation to carry out the designated functions.(9) The governance functions of the Advocacy Body for Children must—(a) with the exception of the approval of its budget, remain independent of OFCOM, and(b) include representation of child users by young people under the age of 25 years.(10) The budget of the Advocacy Body for Children will be subject to annual approval by the board of OFCOM.(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body for Children, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 75).”Member’s explanatory statement
This new Clause would require Ofcom to establish a new advocacy body for child users of regulated internet services to represent, protect and promote their interests.
My Lords, I am grateful to the noble Baroness, Lady Newlove, and the noble Lord, Lord Clement-Jones, for adding their names to Amendment 270A, and to the NSPCC for its assistance in tabling this amendment and helping me to think about it.
The Online Safety Bill has the ambition, as we have heard many times, of making the UK the safest place for a child to be online. Yet, as drafted, it could pass into legislation without a system to ensure that children’s voices themselves can be heard. This is a huge gap. Children are experts in their own lives, with a first-hand understanding of the risks that they face online. It is by speaking to, and hearing from, children directly that we can best understand the harms they face online—what needs to change and how the regulation is working in practice.
User advocates are commonplace in most regulated environments and are proven to be effective. Leading children’s charities such as 5Rights, Barnardo’s and YoungMinds, as well as organisations set up by bereaved parents campaigning for child safety online, such as the Molly Rose Foundation and the Breck Foundation, have joined the NSPCC in calling for the introduction of this advocacy body for children, as set out in the amendment.
I do not wish to detain anyone. The Minister’s response when this was raised in Committee was, in essence, that this should go to the Children’s Commissioner for England. I am grateful to her for tracking me down in a Pret A Manger in Russell Square on Monday and having a chat. She reasonably pointed out that much of the amendment reads a bit like her job description, but she also could see that it is desirable to have an organisation such as the NSPCC set up a UK-wide helpline. There are children’s commissioners for Scotland, Wales and Northern Ireland who are supportive of a national advocacy body for children. She was suggesting—if the Minister agrees that this seems like a good solution—that they could commission a national helpline that works across the United Kingdom, and then advises a group that she could convene, including the children’s commissioners from the other nations of the United Kingdom. If that seems a good solution to the Minister, I do not need to press the amendment, we are all happy and we can get on with the next group. I beg to move.
My Lords, I just want to make some brief comments in support of the principle of what the noble Lord, Lord Knight, is aiming at in this amendment.
The Bill is going to have a profound impact on children in the United Kingdom. We hope that the most profound impact will be that it will significantly advance their interests in terms of safety online. But it will also potentially have a significant impact on what they can access online and the functionality of different services. They are going to experience new forms of age assurance, about which they may have very strong views. For example, the use of their biometric data to estimate their age will be there to protect them, but they may still have strong views about that.
I have said many times that there may be some measures in the Bill that will encourage services to become 18-plus only. That is not adult in the sense of adult content. Ordinary user-to-user social media services may look at the obligations and say, “Frankly, we would much rather restrict ourselves to users from the UK who identify as being 18-plus, rather than have to take on board all the associated liabilities in respect of children”—not because they are irresponsible, but precisely because they are responsible, and they can see that there is a lot of work to do in order to be legally and safely available to those under 18. For all those reasons, it is really important that the child advocacy body looks at things such as the United Nations Convention on the Rights of the Child and the rights of children to access information, and that it is able to take a view on them.
The reason I think that is important—as any politician who has been out and spoken in schools will know—is that very often children are surprising in terms of what they see as their priorities. We make assumptions about their priorities, which can often be entirely wrong. There has been some really good work done on this. There was a project called EU Kids Online, back in the days of the EU, which used to look at children right across the European Union and ask them what their experience of being online was like and what was important to them. There are groups such as Childnet International, which for years has been convening groups of children and taking them to places such as the Internet Governance Forum. That always generates a lot of information that we here would not have thought of, about what children feel is really important to them about their online experience.
For all those reasons, it really would be helpful to institutionalise this in the new regime as some kind of body that looks in the round at children’s interests—their interests to stay safe, but also their interests to be able to access a wide variety of online services and to use the internet as they want to use it. I hope that that strengthens the case the noble Lord, Lord Knight, has made for such a body to exist in some kind of coalition-like format.
My Lords, I am afraid that I have some reservations about this amendment. I was trying not to, but I have. The point that the noble Lord, Lord Allan of Hallam, made about the importance of listening to young people is essential—in general, not being dictated to by them, but understanding the particular ways that they live their lives; the lived experience, to use the jargon. Particularly in relation to a Bill that spends its whole time saying it is designed to protect young people from harm, it might be worth having a word with them and seeing what they say. I mean in an ongoing way—I am not being glib. That seems very sensible.
I suppose my concern is that this becomes a quango. We have to ask who is on it, whether it becomes just another NGO of some kind. I am always concerned about these kinds of organisations when they speak “on behalf of”. If you have an advocacy body for children that says, “We speak on behalf of children”, that makes me very anxious. You can see that that can be a politically very powerful role, because it seems to have the authority of representing the young, whereas actually it can be entirely fictitious and certainly not democratic or accountable.
The key thing we discussed in Committee, which the noble Lord, Lord Knight of Weymouth, is very keen on—and I am too—is that we do not inadvertently deny young people important access rights to the internet in our attempt to protect them. That is why some of these points are here. The noble Baroness, Lady Kidron, was very keen on that. She wants to protect them but does not want to end up with them being denied access to important parts of the internet. That is all good, but I just think this body is wrong.
The only other thing to draw noble Lords’ attention to—I am not trying to be controversial, but it is worth noting—is that child advocacy is currently in a very toxic state because of some of the issues around who represents children. As we speak, there is a debate about, for example, whether the NSPCC has been captured by Stonewall. I make no comment because I do not know; I am just noting it. We have had situations where a child advocacy group such as Mermaids is now discredited because it is seen to have been promoting chest binders for young people, to have gone down the gender ideology route, which some people would argue is child abuse of a sort, advocating that young women remove their breasts—have double mastectomies. This is all online, by the way.
I know that some people would say, “Oh, you’re always going on about that”, but I raise it because it is a very real and current discussion. I know a lot of people who work in education, with young people or in children’s rights organisations, and they keep telling me that they are tearing themselves apart. I just wondered whether the noble Lord, Lord Knight, might note that there is a danger of walking into a minefield here—which I know he does not mean to walk into—by setting up an organisation that could end up being the subject of major culture wars rows or, even worse, one of those dreaded quangos that pretends it is representing people but does not.
My Lords, I think the upshot of this brief debate is that the noble Lord, Lord Knight—how he was tracked down in a Pret A Manger, I have no idea; he is normally too fast-moving for that—in his usual constructive and creative way is asking the Government to engage constructively to find a solution, which he discussed in that Pret A Manger, involving a national helpline, the NSPCC and the Children’s Commissioner, for the very reasons that he and my noble friend Lord Allan have put forward. In no way would this be some kind of quango, in the words of the noble Baroness, Lady Fox.
This is really important stuff. It could be quite a game-changer in the way that the NSPCC and the Children’s Commissioner collaborate on tackling the issues around social media, the impact of the new rights under the Bill and so on. I very much hope that the Government will be able to engage positively on this and help to bring the parties together to, in a sense, deliver something which is not in the Bill but could be of huge importance.
My Lords, first, I reassure noble Lords that the Government are fully committed to making sure that the interests of children are both represented and protected. We believe, however, that this is already achieved through the provisions in the Bill.
Rather than creating a single advocacy body to research harms to children and advocate on their behalf, as the noble Lord’s amendment suggests, the Bill achieves the same effect through a combination of Ofcom’s research functions, the consultation requirements and the super-complaints provisions. Ofcom will be fully resourced with the capacity and technological ability to assess and understand emerging harms and will be required to research children’s experiences online on an ongoing basis.
For the first time, there will be a statutory body in place charged with protecting children from harm online. As well as its enforcement functions, Ofcom’s research will ensure that the framework remains up to date and that Ofcom itself has the latest, in-depth information to aid its decision-making. This will ensure that new harms are not just identified in retrospect when children are already affected by them and complaints are made; instead, the regulator will be looking out for new issues and working proactively to understand concerns as they develop.
Children’s perspectives will play a central role in the development of the framework, as Ofcom will build on its strong track record of qualitative research to ensure that children are directly engaged. For example, Ofcom’s ongoing programme, Children’s Media Lives, involves engaging closely with children and tracking their views and experiences year on year.
Alongside its own research functions, super-complaints will ensure that eligible bodies can make complaints on systemic issues, keeping the regulator up to date with issues as they emerge. This means that if Ofcom does not identify a systemic issue affecting children for any reason, it can be raised and then dealt with appropriately. Ofcom will be required to respond to the super-complaint, ensuring that its subsequent decisions are understood and can be scrutinised. Complaints by users will also play a vital role in Ofcom’s horizon scanning and information gathering, providing a key means by which new issues can be raised.
The extensive requirements for Ofcom to consult on codes of practice and guidance will further ensure that it consistently engages with groups focused on the interests of children as the codes and guidance are developed and revised. Children’s interests are embedded in the implementation and delivery of this framework.
The Children’s Commissioner will play a key and ongoing role. She will be consulted on codes of practice and any further changes to those codes. The Government are confident that she will use her statutory duties and powers effectively to understand children’s experiences of the digital world. Her primary function as Children’s Commissioner for England is to promote and protect the rights of children in England, and to promote and protect the rights of children across the United Kingdom where those rights are or may be affected by reserved matters. As the codes of practice and the wider Bill relate to a reserved area of law—namely, internet services—the Children’s Commissioner for England will be able to represent the interests of children from England, Scotland, Wales and Northern Ireland when she is consulted on the preparation of codes of practice. That will ensure that children’s voices are represented right across the UK. The Children’s Commissioner for England and her office also regularly speak to the other commissioners about ongoing work on devolved and reserved matters. Whether she does that in branches of Pret A Manger, I do not know, but she certainly works with her counterparts across the UK.
I am very happy to take back the idea that the noble Lord has raised and discuss it with the commissioner. There are many means by which she can carry out her duties, so I am very happy to take that forward. I cannot necessarily commit to putting it in legislation, but I shall certainly commit to discussing it with her. On the proposals in the noble Lord’s amendment, we are concerned that a separate child user advocacy body would duplicate the functions that she already has, so I hope with that commitment he will be happy to withdraw.
My Lords, I am grateful to those who have spoken in this quick debate and for the support from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Fox, about children’s voices being heard. I think that we are getting to the point where there will not be a quango or indeed a minefield, so that makes us all happy. The Minister almost derailed me, because so much of his speaking note was about the interests of children, and I am more interested in the voice of children being heard directly rather than in people acting on their behalf and representing their interests; but his final comments, about being happy to take the idea forward, mean that I am very happy to withdraw my amendment.
Amendment 270A withdrawn.
Amendment 271
Moved by
271: After Clause 145, insert the following new Clause—
“OFCOM’s reports about use of age assurance
(1) OFCOM must produce and publish a report assessing—(a) how providers of regulated services have used age assurance for the purpose of compliance with their duties set out in this Act,(b) how effective the use of age assurance has been for that purpose, and(c) whether there are factors that have prevented or hindered the effective use of age assurance, or a particular kind of age assurance, for that purpose,(and in this section, references to a report are to a report described in this subsection).(2) A report must, in particular, consider whether the following have prevented or hindered the effective use of age assurance—(a) the costs to providers of using it, and(b) the need to protect users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of a regulated service (including, but not limited to, any such provision or rule concerning the processing of personal data).(3) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of 18 months beginning with the day on which sections 11 and 72(2) come into force (or if those provisions come into force on different days, the period of 18 months beginning with the later of those days).(4) In preparing a report, OFCOM must consult—(a) the Information Commissioner, and(b) such other persons as OFCOM consider appropriate.(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.(6) The Secretary of State may require OFCOM to produce and publish a further report in response to—(a) the development of age assurance technology, or(b) evidence of the reduced effectiveness of such technology.(7) But such a requirement may not be imposed—(a) within the period of three years beginning with the date on which the first report is published, or(b) more frequently than once every three years.(8) For further provision about reports under this section, see section 149.(9) In this section “age assurance” means age verification or age estimation.”Member’s explanatory statement
This new Clause requires OFCOM to produce and publish a report about the use of age assurance by providers of regulated services.
Amendment 271 agreed.
Clause 147: OFCOM’s transparency reports
Amendment 272 not moved.
Amendments 272A and 272AA
Moved by
272A: After Clause 147, insert the following new Clause—
“OFCOM’s report about use of app stores by children
(1) OFCOM must produce a report about the use of app stores by children.(2) In particular, the report must—(a) assess what role app stores play in children encountering content that is harmful to children, search content that is harmful to children or regulated provider pornographic content by means of regulated apps which the app stores make available,(b) assess the extent to which age assurance is currently used by providers of app stores, and how effective it is, and(c) explore whether children’s online safety would be better protected by the greater use of age assurance or particular kinds of age assurance by such providers, or by other measures.(3) OFCOM must publish the report during the period beginning two years, and ending three years, after the day on which sections 11 and 25 come into force (or if those sections come into force on different days, the later of those days).(4) For further provision about the report under this section, see section 149.(5) In this section—“app” includes an app for use on any kind of device, and “app store” is to be read accordingly;“content that is harmful to children” has the same meaning as in Part 3 (see section 54);“regulated app” means an app for a regulated service;“regulated provider pornographic content” has the same meaning as in Part 5 (see section 70);“search content” has the same meaning as in Part 3 (see section 51).(6) In this section references to children are to children in the United Kingdom.”Member’s explanatory statement
This amendment requires OFCOM to produce a report about the use of app stores by children, including consideration of whether children would be better protected by greater use of age assurance.
272AA: After Clause 147, insert the following new Clause—
“OFCOM’s report about reporting and complaints procedures
(1) OFCOM must produce a report assessing the measures taken or in use by providers of Part 3 services to enable users and others to—(a) report particular kinds of content present on such services, and(b) make complaints to providers of such services.(2) OFCOM’s report must take into account the experiences of users and others in reporting content and making complaints to providers of Part 3 services, including—(a) how clear the procedures are for reporting content and making complaints,(b) how easy it is to do those things, and(c) whether providers are taking appropriate and timely action in response to reports and complaints that are made. (3) The report must include advice from OFCOM about whether they consider that the Secretary of State should make regulations under section (Power to impose duty about alternative dispute resolution procedure)(duty about alternative dispute resolution procedure).(4) In the report, OFCOM may make recommendations that they consider would improve the experiences of users and others in reporting content or making complaints to providers of Part 3 services, or would deliver better outcomes in relation to reports or complaints that are made.(5) In preparing the report under this section, OFCOM must consult—(a) the Secretary of State,(b) persons who appear to OFCOM to represent the interests of United Kingdom users of Part 3 services,(c) persons who appear to OFCOM to represent the interests of children (generally or with particular reference to online safety matters),(d) the Information Commissioner, and(e) such other persons as OFCOM consider appropriate.(6) The report may draw on OFCOM’s research under section 14 of the Communications Act (see subsection (6B) of that section).(7) The report is not required to address any matters which are the subject of a report by OFCOM under section 146 (report about the availability and treatment of news publisher content and journalistic content).(8) OFCOM must publish the report within the period of two years beginning with the day on which this section comes into force.(9) OFCOM must send a copy of the report to the Secretary of State, and the Secretary of State must lay it before Parliament.(10) The Secretary of State must publish a statement responding to the report within the period of three months beginning with the day on which the report is published, and the statement must include a response to OFCOM’s advice about whether to make regulations under section (Power to impose duty about alternative dispute resolution procedure).(11) The statement must be published in such manner as the Secretary of State considers appropriate for bringing it to the attention of persons who may be affected by it.(12) For further provision about the report under this section, see section 149.(13) References in this section to “users and others” are to United Kingdom users and individuals in the United Kingdom.”Member’s explanatory statement
This amendment requires OFCOM to produce a report about the content reporting and complaints procedures used by providers of Part 3 services, including user experiences of those procedures. OFCOM must specifically advise whether they consider that regulations ought to be made under the new Clause proposed to be inserted in my name after Clause 194 (duty about alternative dispute resolution procedure).
Amendments 272A and 272AA agreed.
Clause 148: OFCOM’s report about researchers’ access to information
Amendment 272AB not moved.
Amendment 272B
Moved by
272B: Clause 148, page 132, line 11, leave out “two years” and insert “18 months”
Member’s explanatory statement
This amendment provides that the report that OFCOM must publish under Clause 148 (report about researchers’ access to information) must be published within 18 months of Clause 148 coming into force (rather than two years).
Amendment 272B agreed.
Amendment 272BA not moved.
Amendments 272C and 272D
Moved by
272C: Clause 148, page 132, line 16, leave out “Following the publication of the report, OFCOM may” and insert “OFCOM must”
Member’s explanatory statement
This amendment provides that OFCOM must (rather than may) produce guidance about matters dealt with by the report published under Clause 148.
272D: Clause 148, page 132, line 19, leave out subsections (8) and (9) and insert—
“(8) Before producing the guidance (including revised guidance) OFCOM must consult the persons mentioned in subsection (3).(9) OFCOM must publish the guidance (and any revised guidance).(10) OFCOM must include in each transparency report under section 147 an assessment of the effectiveness of the guidance.”Member’s explanatory statement
This amendment is consequential on the amendment in my name making the production of guidance under Clause 148(7) mandatory.
Amendments 272C and 272D agreed.
Amendment 272E not moved.
Amendment 273
Moved by
273: After Clause 148, insert the following new Clause—
“OFCOM’s report in connection with investigation into a death
(1) Subsection (2) applies if OFCOM receive—(a) a notice from a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into the death of a person;(b) a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, the death of a person;(c) a notice from a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—(i) an investigation to determine whether an inquest into the death of a person is necessary, or(ii) an inquest in relation to the death of a person.(2) OFCOM may produce a report for use by the coroner or procurator fiscal, dealing with any matters that they consider may be relevant.(3) In subsection (1)(b) “inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).” Member’s explanatory statement
This amendment makes it clear that OFCOM may produce a report in connection with a person’s death, if the coroner gives OFCOM a notice or, in Scotland, the procurator fiscal requests information, for that purpose.
Amendment 273 agreed.
Amendments 273A and 273B had been withdrawn from the Marshalled List.
Clause 149: OFCOM’s reports
Amendments 274 to 274AA
Moved by
274: Clause 149, page 132, line 41, at end insert—
“(aa) a report under section (OFCOM’s reports about use of age assurance) (report about use of age assurance),”Member’s explanatory statement
This amendment is consequential on the new Clause to be inserted after Clause 145 in my name. It ensures that the usual confidentiality provisions apply to matters contained in OFCOM’s report about the use of age assurance.
274A: Clause 149, page 133, line 1, at end insert—
“(ca) a report under section (OFCOM’s report about use of app stores by children) (report about use of app stores by children),”Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 147 in my name. It ensures that the usual confidentiality provisions apply to matters contained in OFCOM’s report about the use of app stores by children.
274AA: Clause 149, page 133, line 1, at end insert—
“(ca) a report under section (OFCOM’s report about reporting and complaints procedures) (report about reporting and complaints procedures),”Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 147 in my name about OFCOM’s report concerning reporting and complaints procedures used by providers of Part 3 services. The amendment ensures that the usual confidentiality provisions apply to matters contained in that report.
Amendments 274 to 274AA agreed.
Amendment 274B
Moved by
274B: After Clause 149, insert the following new Clause—
“CHAPTER 8
MEDIA LITERACY
Media literacy
(1) Section 11 of the Communications Act is amended in accordance with subsections (2) to (5).(2) Before subsection (1) insert—“(A1) In this section—(a) subsection (1) imposes duties on OFCOM which apply in relation to material published by means of the electronic media (including by means of regulated services), and(b) subsections (1A) to (1E) expand on those duties, and impose further duties on OFCOM, in relation to regulated services only.”(3) After subsection (1) insert— “(1A) OFCOM must take such steps, and enter into such arrangements, as they consider most likely to be effective in heightening the public’s awareness and understanding of ways in which they can protect themselves and others when using regulated services, in particular by helping them to—(a) understand the nature and impact of harmful content and the harmful ways in which regulated services may be used, especially content and activity disproportionately affecting particular groups, including women and girls;(b) reduce their and others’ exposure to harmful content and to the use of regulated services in harmful ways, especially content and activity disproportionately affecting particular groups, including women and girls;(c) use or apply—(i) features included in a regulated service, including features mentioned in section 12(2) of the Online Safety Act 2023, and(ii) tools or apps, including tools such as browser extensions,so as to mitigate the harms mentioned in paragraph (b);(d) establish the reliability, accuracy and authenticity of content;(e) understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it;(f) understand how their personal information may be protected.(1B) OFCOM must take such steps, and enter into such arrangements, as they consider most likely to encourage the development and use of technologies and systems for supporting users of regulated services to protect themselves and others as mentioned in paragraph (a), (b), (c), (d) or (e) of subsection (1A), including technologies and systems which—(a) provide further context to users about content they encounter;(b) help users to identify, and provide further context about, content of democratic importance present on regulated user-to-user services;(c) signpost users to resources, tools or information raising awareness about how to use regulated services so as to mitigate the harms mentioned in subsection (1A)(b).(1C) OFCOM’s duty under subsection (1A) is to be performed in the following ways (among others)—(a) pursuing activities and initiatives,(b) commissioning others to pursue activities and initiatives,(c) taking steps designed to encourage others to pursue activities and initiatives, and(d) making arrangements for the carrying out of research (see section 14(6)(a)).(1D) OFCOM must draw up, and from time to time review and revise, a statement recommending ways in which others, including providers of regulated services, might develop, pursue and evaluate activities or initiatives relevant to media literacy in relation to regulated services.(1E) OFCOM must publish the statement and any revised statement in such manner as they consider appropriate for bringing it to the attention of the persons who, in their opinion, are likely to be affected by it.”(4) After subsection (2) insert— “(3) In this section and in section 11A,“regulated service” means—(a) a regulated user-to-user service, or(b) a regulated search service.“Regulated user-to-user service” and “regulated search service” have the same meaning 
as in the Online Safety Act 2023 (see section 3 of that Act).(4) In this section—(a) “content”, in relation to regulated services, means regulated user-generated content, search content or fraudulent advertisements;(b) the following terms have the same meaning as in the Online Safety Act 2023—“content of democratic importance” (see section 13 of that Act);“fraudulent advertisement” (see sections 33 and 34 of that Act);“harm” (see section 209 of that Act) (and “harmful” is to be interpreted consistently with that section);“provider”(see section 202 of that Act);“regulated user-generated content” (see section 49 of that Act);“search content” (see section 51 of that Act).”(5) In the heading, for “Duty” substitute “Duties”.(6) In section 14 of the Communications Act (consumer research), in subsection (6)(a), after “11(1)” insert “, (1A) and (1B)”.”Member’s explanatory statement
This amendment inserts provisions into section 11 of the Communications Act 2003 (OFCOM’s duties to promote media literacy). The new provisions expand on the existing duties so far as they relate to regulated user-to-user and search services, and impose new duties on OFCOM aimed at enhancing users’ media literacy.
I beg to move Amendment 274B.
Amendments 274BA and 274BB (to Amendment 274B) not moved.
Amendment 274B agreed.
Amendment 274C
Moved by
274C: After Clause 149, insert the following new Clause—
“Media literacy strategy and media literacy statement
After section 11 of the Communications Act insert—“11A Regulated services: media literacy strategy and media literacy statement(1) OFCOM must prepare and publish a media literacy strategy within the period of one year beginning with the day on which the Online Safety Act 2023 is passed.(2) A media literacy strategy is a plan setting out how OFCOM propose to exercise their functions under section 11 in the period covered by the plan, which must be not more than three years.(3) In particular, a media literacy strategy must state OFCOM’s objectives and priorities for the period it covers.(4) Before the end of the period covered by a media literacy strategy, OFCOM must prepare and publish a media literacy strategy for a further period, ensuring that each successive strategy covers a period beginning immediately after the end of the last one. (5) In preparing or revising a media literacy strategy, OFCOM must consult such persons as they consider appropriate.(6) OFCOM’s annual report must contain a media literacy statement.(7) A media literacy statement is a statement by OFCOM—(a) summarising what they have done in the financial year to which the report relates in the exercise of their functions under section 11, and(b) assessing what progress has been made towards achieving the objectives and priorities set out in their media literacy strategy in that year.(8) A media literacy statement must include a summary and an evaluation of the activities and initiatives pursued or commissioned by OFCOM in the exercise of their functions under section 11 in the financial year to which the report relates.(9) The first annual report that is required to contain a media literacy statement is the report for the financial year during which OFCOM’s first media literacy strategy is published, and that first statement is to relate to the period from publication day until the end of that financial year.(10) But if OFCOM’s first media literacy strategy is published during the second half of a financial year—(a) the first annual report that is required to contain a media literacy statement is the report for the next financial year, and(b) that first statement is to relate to the period from publication day until the end of that financial year.(11) References in this section to OFCOM’s functions under section 11 are to those functions so far as they relate to regulated services.(12) In this section—“annual report” means OFCOM’s annual report under paragraph 12 of the Schedule to the Office of Communications Act 2002;“financial year” means a year ending with 31 March.””Member’s explanatory statement
This amendment requires OFCOM to produce a media literacy strategy every three years (or more frequently), and to include, in their annual report, a statement summarising and evaluating their media literacy activities, so far as they relate to regulated services, during the year.
Amendment 274C agreed.
Amendments 275 and 275A not moved.
Clause 202: “Provider” of internet service
Amendment 276
Moved by
276: Clause 202, page 171, line 2, at end insert—
“(15) For the purposes of subsections (8) and (9), a person who makes available on a service an automated tool or algorithm by means of which content is generated is to be regarded as having control over content so generated.”Member’s explanatory statement
This amendment is about who counts as the provider of a service (other than a user-to-user or search service) that hosts provider pornographic content for the purposes of the Bill. The amendment makes it clear that a person who controls a generative tool on the service, such as a generative AI bot, is regarded as controlling the content generated by that tool.
Amendment 276 agreed.
Amendment 277
Moved by
277: After Clause 205, insert the following new Clause—
““Age verification” and “age estimation”
(1) This section applies for the purposes of this Act.(2) “Age verification” means any measure designed to verify the exact age of users of a regulated service.(3) “Age estimation” means any measure designed to estimate the age or age-range of users of a regulated service.(4) A measure which requires a user to self-declare their age (without more) is not to be regarded as age verification or age estimation.”Member’s explanatory statement
This new Clause defines age verification and age estimation, and makes it clear that mere self-declaration of age does not count as either.
Amendment 277 agreed.
Clause 206: “Proactive technology”
Amendments 278 to 280
Moved by
278: Clause 206, page 172, line 34, leave out “assessing or establishing” and insert “verifying or estimating”
Member’s explanatory statement
This amendment is made to ensure consistency of language in the Bill when referring to age verification and age estimation.
279: Clause 206, page 173, line 11, at end insert—
“(c) in relation to an internet service within section 71(2), content that is provider pornographic content in relation to the service.”Member’s explanatory statement
This amendment is about what counts as “relevant content” for the purposes of defining “proactive technology” for the purposes of the Bill. The effect is for provider pornographic content to now be included.
280: Clause 206, page 173, line 15, leave out “Part 3” and insert “regulated”
Member’s explanatory statement
This amendment revises the definition of “user data” for the purposes of defining “proactive technology” for the purposes of the Bill. The effect is for user data to now include data created etc. by providers of all services regulated by the Bill (including providers subject to the Part 5 pornography duties).
Amendments 278 to 280 agreed.
Clause 208: “Functionality”
Amendments 281 to 281B not moved.
Amendment 281BA
Moved by
281BA: Clause 208, page 175, line 5, at end insert—
“(3A) In this Act “functionality”, in relation to a regulated service, includes the design of systems and processes that engage or impact on users, particularly algorithms.”Member’s explanatory statement
This amendment clarifies the role that system design can play in impacting outcomes for users, in light of the requirement for systems to be safe by design.
My Lords, I note that the noble Lord, Lord Stevenson, is no longer in his place, but I promise to still try to live by his admonition to all of us to speak briefly.
I will speak to Amendments 281BA, 281FA, 286A and 281F, which has already been debated but is central to this issue. These amendments aim to fix a problem we repeatedly raised in Committee and on Report. They are also in the name of the noble Baroness, Lady Kidron, and the noble Lords, Lord Stevenson and Lord Clement-Jones, and build on amendments in Committee laid by the noble Lord, Lord Russell, my noble friend Lord Bethell and the right reverend Prelate the Bishop of Oxford. This issue has broad support across the whole House.
The problem these amendments seek to solve is that, while the Government have consistently asserted that this is a systems and processes Bill, the Bill is constructed in a manner that focuses on content. Because this is a rerun of previous debates, I will try to keep my remarks short, but I want to be clear about why this is a real issue.
I am expecting my noble friend the Minister to say, as he has done before, that this is all covered; we are just seeing shadows, we are reading the Bill wrong and the harms that we are most concerned about are genuinely in the Bill. But I really struggle to understand why, if they are in the Bill, stating them clearly on the face of the Bill creates the legal uncertainty that seems to be the Government’s favourite problem with each of the amendments we have been raising today.
My noble friend—sorry, my friend—the noble Baroness, Lady Kidron, commissioned a legal opinion that looked at the statements from the Government and compared them with the text in the Bill. That opinion, like that of the many noble Lords I have just mentioned, is that the current language in the Bill about features and functionalities pertains only so far as it relates to harmful content. All roads in this game of Mornington Crescent lead back to content.
Harmful content is set out in a schedule to the Bill, and this set of amendments ensures that the design of services, irrespective of content, is required to be safe by design. If the Government were correct in their assertion that this is already covered, then these amendments really should not pose any threat at all, and I have yet to hear the Government enunciate what the real legal uncertainty actually is in stating that harm can come from functionality, not just from content.
Secondly, I think that, in the process of this Report stage, we may have risked making the Bill worse, not better, by creating a real confusion. At the front of the Bill, the new purposive Clause 1 sets out really clearly that regulated companies have to be safe by design. But when you actually work your way through the Bill, unfortunately, at each subsequent point it refers back only to content; it does not refer back to the functionality that can in and of itself be harmful. There is no reference to functionality in the risk assessments, the child safety duties or the definitions of harm—no reference to the systems and processes that may in themselves be designed well or badly, irrespective of content.
I note that both the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, on our last day of Report, made reference to the work of Professor Bowden-Jones, which included the spectre of children so addicted to reward loops of games and other media that they needed to attend a gambling or gaming clinic, and of a child who left their house in the dead of night to access wifi from a random hotspot when their desperate parents had switched off the home wifi.
As we have said several times before, it is not an accident that these platforms do this; it is a direct result of their business models, which encourage dwell time, which in turn drives addiction, irrespective of content. It is very well established by psychiatrists, many of whom have already been quoted, including, for example, Dr Norman Doidge, a pre-eminent Canadian psychiatrist, that the plasticity of the developing teenage brain makes teenagers particularly susceptible to this sort of addiction. Once a teenager is caught in that loop, it stays with them for life. So this is a very real and present risk to today’s teenagers, to—unfortunately—the last decade’s teenagers and to all teenagers as we look ahead. That is why my co-signatories and I felt that we had to keep pressing this.
Thirdly, both in Committee and on Report, we kept asking the Government to be more mindful of the future. This morning, I am sure like many other noble Lords, I woke up to the dulcet tones of my friend, the noble Baroness, Lady Kidron, talking about Meta’s overnight announcement about their large language model. I fear that it would be extraordinary hubris to keep insisting that content on its own is going to be the defining harm that our children face going forward. We simply do not know, and it is so important that we leave open the possibility for functionality that we cannot even imagine today to be launched tomorrow, let alone in five or 10 years.
It is a huge mistake not to make sure that this Bill captures non-content harm and functionality irrespective of any form of content. Because of the lateness of the hour and the urgency of catching trains to get home before the train strike, I will not go through what each of the amendments does, save to say that they introduce specific elements in the back half of the Bill to ensure that non-content harm is captured. We are in a bit of a mess in this Bill, because the front half of the Bill now does include non-content harms, in both Amendment 35 and Amendment 240. So we do need to make sure that in the end we produce a Bill that is internally consistent and genuinely captures the purpose set out in the new Clause 1 all the way through the Bill.
I would like to ask my noble friend a couple of questions. First: what is the legal uncertainty that he is so worried about, which I am fully expecting him to set out as the reason why he cannot accept the amendments? There is a charitable interpretation, which is that we are all worried about creating legal uncertainty. My co-signatories and I are worried about the legal uncertainty we are creating by not naming functionality as harm. If I am being charitable, I think the Government are worried—and this is what I do not understand—that by naming these non-content harms, we somehow create a new loophole that would enable a platform to continue to cause harm and Ofcom not to be able to regulate. I hope we are united in trying to stop that, and if so, I really hope that my noble friend can offer an explanation. This is not for want of our having had many conversations about this, and we may need to have many more. I hope that that charitable interpretation is right: that we are all trying to do the same thing but we do not really understand how this complex Bill works.
There is, unfortunately, a less charitable interpretation, which would lead one to worry that the Bill is actually just about content. I ask my noble friend to confirm that this is not just a content Bill. One of the things that most scared me was Ofcom’s insistence in front of the Communications and Digital Select Committee last week that, if the amendments were allowed, it would create a huge amount of additional work for it. I note that the Government have been briefing that today: that the amendments would lead to substantial delay because of the extra work Ofcom would need to do. That makes me worried that Ofcom has not properly thought about the consequences of non-content harm—harm generated by functionality—if it really will take it so long. That is the much less charitable interpretation of why I am expecting my noble friend to reject the amendments. I should like to understand those two questions: what is really the legal uncertainty that the Government are worried about; and why, if this is all covered, would it take so long?
I am channelling my friend the noble Baroness, Lady Kidron, here, but this is such an important part of protecting our children that if it really is going to take some extra months to prepare to do it properly, we should be willing to do that. We have a few months ahead of us over the summer holidays, and we know that Ofcom has done a brilliant job in getting ahead of the legislation. If the problem is simply that there might be some extra work—provided that really is the reason, rather than the Government not wanting it to be anything other than a content Bill—we should accept that it will take a bit of time. I should like to understand the answer to that.
It is late, and it has been a long Report stage. I will listen very carefully to what my noble friend the Minister has to say. I really hope that the Bill can continue to progress in this collaborative way. It would be an awful shame if, at the end of a long Report stage, we did not recognise that we are trying to solve the same problem and find a way through. I beg to move.
My Lords, the hour is late and I will not detain the House for long. However, I hope that the fact that we are all still sitting here at the end of a long Report stage, because we care very much about the Bill and what we are trying to achieve, will be noted by my noble friend the Minister, his officials and others who are watching. I thank my noble friend Lady Harding for so ably introducing the amendments, which I absolutely support. I was, perhaps for the first time, going to agree with something the noble Baroness, Lady Fox, said a day or so ago: that one thing we and Ofcom need to do much better is to understand the transparency of the algorithms. It is not just algorithms—this is where my knowledge ends—but other design features that make these sites addictive and harmful, and which are outside content. The Bill will not be capable of addressing even the next five years, let alone beyond that, if we do not reflect the fact that, as my noble friend Lady Harding said, it has already been amended so that one way its objectives are to be achieved is by services being required to focus on safety by design.
I hope very much that my noble friend will take up the invitation, because everybody is tired and has been looking at this Bill for so many hours and months that we are probably all word-blind. We could all do with standing back and thinking, “With the amendments made, how does it all hang together so that ultimately, we keep those we want to keep safe as safe as we possibly can?” On that basis, I support these amendments and look forward to hearing further from the Government about how they hope to keep safe those we all wish to keep safe.
My Lords, I rise to support the amendment in the name of the noble Baroness, Lady Kidron. She has been such a forceful voice throughout the passage of this Bill, driven by her passion to protect children, and no more so than with the amendment in her name. That is why I feel compelled to speak up to support her. So far, we have all worked with the Government to see the safe passage of the Online Safety Bill, with strong protections for children. These amendments would be yet another excellent and unique opportunity to protect children. This is what we have been fighting for for years, and it is so uplifting that the Government have listened to us throughout the passage of this Bill—so why stop now? If the Government are saying that the Bill is being clear about harms, they should have no objection to making it explicit.
These amendments press for safety by design to be embedded in later clauses of the Bill and go hand in hand with the earlier amendment that the House so clearly supported. It is clear that the design of services and algorithms is responsible for orchestrating and manipulating the behaviour, feelings, emotions and thoughts of children who, because they are at a vulnerable stage in their development, are easily influenced. We have all witnessed the disastrous impact of the new technology which is fast encroaching upon us, and our children will not be spared from it. So it is imperative that Ofcom have the tools with which to consider and interrogate system design separately from content because, as has been said, it is not only content that is harmful: design is too. We therefore need to take a holistic approach and leave nowhere to hide for the tech companies when it comes to harms affecting our children.
As I have said before, these amendments would send a loud and clear message to the industry that it is responsible for the design of its products and has to think of the consequences for our children’s mental health and well-being when considering design. What better way to do that than for the Government to accept these amendments, in order to show that they are on the side of our children, not the global tech companies, when it comes to protecting them from harm? They need to put measures in place to ensure that the way a service is designed is subject to the online safety regime we have all fought for over the years and during the passage of this Bill.
If the Government do not accept the amendment, perhaps the issue of harmful design could be included in the welcome proposed review of pornography. It would be good to hear the Minister’s thoughts on this idea—but I am not giving him a let-off. I hope he will listen to the strength of feeling and that the Government will reconsider their position, support the amendment and complete the one main task they set out to complete with this Bill, which is to protect children from harm no matter where it rears its ugly head online.
My Lords, I rise briefly to support my noble friend Lady Harding and to associate myself with everything she has just said. It strikes me that if we do not acknowledge that there is harm from functionality, not just content, we are not looking to the future, because addressing functionality protects vulnerable people before the harm has happened, whereas addressing content relies on us having to take it down afterwards. I want to stress that algorithms and functionality disproportionately harm not just vulnerable children but vulnerable adults as well. I do not understand why, since we agreed to safety by design at the beginning of the Bill, it is not running throughout it, rather than just in the introduction. I want to lend my support to these amendments this evening.
My Lords, I can be very brief. My noble friend Lady Benjamin and the noble Baronesses, Lady Harding, Lady Morgan and Lady Fraser, have all very eloquently described why these amendments in this group are needed.
It is ironic that we are still having this debate right at the end of Report. It has been a running theme throughout the passage of the Bill, both in Committee and on Report, and of course it ran right through our Joint Committee work. It is the whole question of safety by design, harm from functionalities and, as the noble Baroness, Lady Morgan, said, understanding the operation of the algorithm. And there is still the question: does the Bill adequately cover what we are trying to achieve?
As the noble Baroness, Lady Harding, said, Clause 1 now does set out the requirement for safety by design. So, in the spirit of amity, I suggested to the Minister that he might run a check on the Bill during his free time over the next few weeks to make sure that it really does cover it. But, in a sense, there is a serious point here. Before Third Reading there is a real opportunity to run a slide rule over the Bill to see whether the present wording really is fit for purpose. So many of us around this House who have lived and breathed this Bill do not believe that it yet is. The exhortation by the ethereal presences of the noble Baronesses, Lady Kidron and Lady Harding, to keep pressing to make sure that the Bill is future-proofed and contains the right ingredients is absolutely right.
I very much hope that once again the Minister will go through the hoops and explain whether this Bill really captures functionality and design and not just content, and whether it adequately covers the points set out in the purpose of the Bill which is now there.
My Lords, as we have heard, the noble Baroness, Lady Harding, made a very clear case in support of these amendments, tabled in the name of the noble Baroness, Lady Kidron, and supported by noble Lords from across the House. The noble Baroness, Lady Morgan, gave wise counsel to the Minister, as did the noble Lord, Lord Clement-Jones, that it is worth stepping back and seeing where we are in order to ensure that the Bill is in the right place. I urge the Minister to find the time and the energy that I know he has—he certainly has the energy and I am sure he will match it with the time—to speak to noble Lords over the coming Recess to agree a way to incorporate systems and functionality into the Bill, for all the reasons we have heard.
On Monday, my noble friend Lord Knight spoke of the need for a review about loot boxes and video games. When we checked Hansard, we saw that the Minister had promised that such a review would be offered in the coming months. In an unusual turn of events, the Minister beat that timescale. We did not have to hear the words “shortly”, “in the summer” or “spring” or anything like that, because it was announced the very next day that the department would keep legislative options under review.
I make that point simply to thank the Minister for the immediate response to my noble friend Lord Knight. But, if we are to have such a review, does this not point very much to the fact that functionality and systems should be included in the Bill? The Minister has a very nice hook to hang this on and I hope that he will do so.
My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.
The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.
My noble friend Lady Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords as we conclude Report that we have not made it worse by so doing. The Bill will require services to take a safety by design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.
We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.
Secondly, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems and processes, and algorithms, mentioned elsewhere in the Bill, cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.
Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.
Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system and process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user—it is the fact that harmful content may be pushed to a user, or content pushed in such a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.
Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—
Can I just double-check what my noble friend has just said? I was lulled into a possibly false sense of security until we got to the point where he said “harmful” and then the dreaded word “content”. Does he accept that there can be harm without there needing to be content?
This is the philosophical question on which we still disagree. Features and functionality can be harmful but, to manifest that harm, there must be some content which they are functionally, or through their feature, presenting to the user. We therefore keep talking about content, even when we are talking about features and functionality. A feature on its own which has no content is not what the noble Baroness, Lady Kidron, my noble friend Lady Harding and others are envisaging, but to follow the logic of the point they are making, it requires some content for the feature or functionality to cause its harm.
But the content may not be harmful.
Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.
This is the key question. For example, let us take a feature that is pushing something at you constantly; if it was pushing poison at you then it would obviously be harmful, but if it was pushing marshmallows then they would be singularly not harmful but cumulatively harmful. Is the Minister saying that the second scenario is still a problem and that the surfeit of marshmallows is problematic and will still be captured, even if each individual marshmallow is not harmful?
Yes, because the cumulative harm—the accumulation of marshmallows in that example—has been addressed.
Noble Lords should also be aware that the drafting of Amendment 281FA has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—for example, from the
“age or characteristics of the likely user group”.
In effect, being a child or possessing a particular characteristic may be harmful. This may not be the intention of the noble Baronesses who tabled the amendment, but it highlights the important distinction between something being a risk factor that influences the risk of harm occurring and something being harmful.
The Government are clear that these aspects should properly be treated as risk factors. Other parts of the Bill already make it clear that the ways in which a service is designed and used may impact on the risk of harm suffered by users. I point again to paragraphs (e) to (h) of Clause 10(6); paragraph (e) talks about the level of risk of functionalities of the service, paragraph (f) talks about the different ways in which the service is used, and so on.
We have addressed these points in the Bill, though clearly not to the satisfaction of my noble friend, the noble Baroness, Lady Kidron, and others. As we conclude Report, I recognise that we have not yet convinced everyone that our approach achieves what we all seek, though I am grateful for my noble friend’s recognition that we all share the same aim in this endeavour. As I explained to the noble Baroness, Lady Kidron, on her Amendment 35, I was asking her not to press it because, if she did, the matter would have been dealt with on Report and we would not be able to return to it at Third Reading.
As the Bill heads towards another place with this philosophical disagreement still bubbling away, I am very happy to commit to continuing to talk to your Lordships—particularly when the Bill is in another place, so that noble Lords can follow the debates there. I am conscious that my right honourable friend Michelle Donelan, who has had a busy maternity leave and has spoken to a number of your Lordships while on leave, returns tomorrow in preparation for the Bill heading to her House. I am sure she will be very happy to speak even more when she is back fully at work, but we will both be happy to continue to do so.
I think it is appropriate, in some ways, that we end on this issue, which remains an area of difference. With that promise to continue these discussions as the Bill moves towards another place, I hope that my noble friend will be content not to press these amendments, recognising particularly that the noble Baroness, Lady Kidron, has already inserted this thinking into the Bill for consideration in the other House.
My Lords, being the understudy for the noble Baroness, Lady Kidron, is quite a stressful thing. I am, however, reliably informed that she is currently offline in the White House, but I know that she will scrutinise everything I say afterwards and that I will receive a detailed school report tomorrow.
I am extremely grateful to my noble friend the Minister for how he has just summed up, but I would point out two things in response. The first is the circularity of the legal uncertainty. What I think I have heard is that we are trying to insert into the Bill some clarity because we do not think it is clear, but the Government’s concern is that by inserting clarity, we then imply that there was not clarity in the rest of the Bill, which then creates the legal uncertainty—and round we go. I am not convinced that we have really solved that problem, but I may be one step further towards understanding why the Government think that it is a problem. I think we have to keep exploring that and properly bottom it out.
My second point is about what I think will for evermore be known as the marshmallow problem. We have just rehearsed across the House a really heartfelt concern that just because we cannot imagine it today, it does not mean that there will not be functionality that causes enormous harm which does not link back to a marshmallow, multiple marshmallows or any other form of content.
Those two big issues are the ones we need to keep discussing: what is really causing the legal uncertainty, and how we can be confident that unimaginable harms from unimaginable functionality are genuinely going to be captured in the Bill. Provided that we can continue those discussions, maybe it is entirely fitting at the end of what I think has been an extraordinarily collaborative Report, Committee and whole process of the Bill going through this House—which I have felt incredibly proud and privileged to be a part of—that we end with a commitment to continue that collaborative process. With that, I beg leave to withdraw the amendment.
Amendment 281BA withdrawn.
Clause 209: “Harm” etc
Amendments 281C to 281E
Moved by
281C: Clause 209, page 175, line 17, leave out from “dissemination” to end of line 18
Member’s explanatory statement
This amendment is consequential on the next amendment to this Clause in my name.
281D: Clause 209, page 175, line 18, at end insert—
“(3A) References to harm presented by content, and any other references to harm in relation to content, include references to cumulative harm arising or that may arise in the following circumstances—(a) where content, or content of a particular kind, is repeatedly encountered by an individual (including, but not limited to, where content, or a kind of content, is sent to an individual by one user or by different users or encountered as a result of algorithms used by, or functionalities of, a service);(b) where content of a particular kind is encountered by an individual in combination with content of a different kind (including, but not limited to, where a kind of content is sent to an individual by one user or by different users or encountered as a result of algorithms used by, or functionalities of, a service).”Member’s explanatory statement
This amendment makes clear that references to harm presented by content include cumulative harm that arises or that may arise in the circumstances mentioned and, in particular, covers the case where this occurs as a result of algorithms used by, or functionalities of, a service.
281E: Clause 209, page 175, line 29, at end insert—
“(4A) References to a risk of harm in relation to functionalities, and references to the risk of functionalities facilitating users encountering particular kinds of content (however expressed), include references to risks arising or that may arise due to multiple functionalities which, used in combination, increase the likelihood of harm arising (for example, as mentioned in subsection (3A)).”Member’s explanatory statement
This amendment makes clear that references to a risk of harm in relation to functionalities and references to the risk of functionalities facilitating users encountering particular kinds of content include references to risks from a combination of those functionalities.
Amendments 281C to 281E agreed.
Amendments 281F and 281FA not moved.
Amendment 281G
Moved by
281G: Clause 209, page 175, line 33, leave out “and (4)” and insert “to (4)”
Member’s explanatory statement
This amendment is consequential on the amendment in my name inserting new subsection (3A) into this Clause.
Amendment 281G agreed.
Clause 210: “Online safety functions” and “online safety matters”
Amendments 281H to 283
Moved by
281H: Clause 210, page 176, line 12, leave out “section 11 (duty” and insert “sections 11 and 11A (duties”
Member’s explanatory statement
This amendment provides that the term “online safety functions” includes OFCOM’s functions under section 11A of the Communications Act 2003 (inserted by the new Clause proposed to be inserted after Clause 149 in my name) regarding OFCOM’s media literacy strategy (as well as OFCOM’s functions under section 11 of that Act).
282: Clause 210, page 176, line 21, at end insert—
“(2A) References to OFCOM’s “online safety functions” also include references to OFCOM’s duty to comply with any of the following, so far as relating to the use of a regulated service by a person who has died—(a) a notice from a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into a person’s death;(b) a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, a person’s death;(c) a notice from a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—(i) an investigation to determine whether an inquest into a person’s death is necessary, or(ii) an inquest in relation to a person’s death.”Member’s explanatory statement
This amendment makes it clear that OFCOM’s online safety functions include the duty of complying with a coroner’s notice or, in Scotland, a request from the procurator fiscal, in connection with the use of a regulated service by a person who has died.
283: Clause 210, page 176, line 23, at end insert—
“(4) In subsection (2A)(b) “inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).”Member’s explanatory statement
This amendment defines a term used in the preceding amendment in my name.
Amendments 281H to 283 agreed.
Clause 211: Interpretation: general
Amendments 284 and 285
Moved by
284: Clause 211, page 176, leave out lines 27 and 28
Member’s explanatory statement
This amendment removes a definition of “age assurance” from Clause 211 as that term is now defined separately where used.
285: Clause 211, page 176, line 29, at end insert—
““automated tool” includes bot;”Member’s explanatory statement
This amendment makes it clear that references in the Bill to automated tools include bots.
Amendments 284 and 285 agreed.
Amendment 286
Moved by
286: Clause 211, page 177, line 7, at end insert—
““freedom of expression”: any reference to freedom of expression (except in sections 36(6)(f) and 69(2)(d)) is to the freedom to receive and impart ideas, opinions or information (referred to in Article 10(1) of the Convention) by means of speech, writing or images;”Member’s explanatory statement
This amendment inserts a definition of freedom of expression into the Bill.
Amendment 286 agreed.
Amendment 286A not moved.
Amendments 287 to 290
Moved by
287: Clause 211, page 177, line 10, after “91(1)” insert “or (Information in connection with an investigation into the death of a child)(1)”
Member’s explanatory statement
This amendment revises the definition of “information notice” so that it includes a notice under the new Clause proposed in my name concerning OFCOM’s power to obtain information in connection with an investigation into the death of a child.
288: Clause 211, page 177, line 31, at end insert—
““pornographic content” means content of such a nature that it is reasonable to assume that it was produced solely or principally for the purpose of sexual arousal;”Member’s explanatory statement
This amendment adds a definition of “pornographic content” to Clause 211 of the Bill.
288A: Clause 211, page 178, line 3, at end insert—
“(2A) References in this Act to an individual with a certain characteristic include references to an individual with a combination of characteristics.”Member’s explanatory statement
This amendment makes clear that references in the Bill to an individual with a certain characteristic include an individual with a combination of characteristics.
288B: Clause 211, page 178, line 9, leave out “description” and insert “kind”
Member’s explanatory statement
This amendment ensures consistency of language in referring to kinds of content.
288C: Clause 211, page 178, line 11, leave out “description” and insert “kind”
Member’s explanatory statement
This amendment ensures consistency of language in referring to kinds of content.