Report (4th Day)
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee, 15th Report from the Constitution Committee. Scottish and Welsh Legislative Consent granted.
Amendment 186A
Moved by
Lord Moylan
186A: Before Clause 64, insert the following new Clause—
“Terms of service as a contract
The terms of service under which a Category 1 service is provided to a person who is a consumer for the purposes of the Consumer Rights Act 2015 must be treated as being a contract for a trader to provide a service to a consumer.”
Member’s explanatory statement
The purpose of this amendment is to ensure that providers’ terms of service are treated as consumer contracts, and to give users recourse to the remedies under the Consumer Rights Act 2015 in the event of breach.
My Lords, in speaking to my Amendment 186A, I hope that noble Lords will forgive me for not speaking in detail to the many other amendments in this group correctly branded “miscellaneous” by those who compile our lists for us. Many of them are minor and technical, especially the government amendments. However, that is not true of all of them: Amendment 253 in the name of the noble Lord, Lord Clement-Jones, is a substantial amendment relating to regulatory co-operation, while Amendment 275A, in the name of the noble Baroness, Lady Finlay of Llandaff, is also of some interest, relating to the reports that Ofcom is being asked to produce on technological developments.
Nor is Amendment 191A lacking in importance and substance, although—I hope I will be forgiven for saying this, not in a snarky sort of way—for those of us who are worried about the enormous powers being given to Ofcom as a result of the Bill, the idea that it should be required by statute to give guidance to coroners, who are part of the courts system, seems to me strange and worth examining more closely. There might be a more seemly way of achieving the effect that the noble Baroness, Lady Kidron, understandably wants to achieve.
I turn to my own Amendment 186A, which, I hope, ought to be relatively straightforward. It concerns the terms of service of a contract with a category 1 service provider, and it is intended to improve the rights that consumers or users of that service have. The Government want users of those services to be able to enforce their rights under contract against the service providers, as set out in Clause 65, and this is entirely welcome. However, it is well known that bringing claims in contract can be an expensive and onerous business, as I have pointed out in the past, particularly when the service is provided on the one-sided terms of the service provider—often, of course, drafted under the legal system of a foreign jurisdiction.
Parliament recognised this difficulty when it enacted the Consumer Rights Act 2015. Its purpose was to make it easier for consumers to deal fairly with traders and get redress when they are treated unfairly. However, category 1 providers such as Twitter and Facebook are not governed by the Consumer Rights Act. In the language of the Act, they do not supply digital content
“for a price paid by the consumer”,
a requirement of the Act. The purpose of Clause 65 is therefore only partially achieved in the Bill as it stands; a key element in enforcing consumer rights is missing.
The European Union, to its credit, rectified this oversight in its directive about modernising consumer law in 2019. It provides that a consumer who receives content or a service in return for providing their personal data should be in the same position as the consumer who pays a price, and the same consumer protections apply to both. The effect of my amendment is that we would follow suit—if we are serious about protecting and empowering users, that is what we should do. We should do away with the arbitrary fiction that category 1 users are not consumers deserving of protection simply because they do not pay a monetary price.
I have heard it said that bringing up this matter on Report, when it should have been brought up in Committee—as I admit it should—leaves it perhaps a little too late for the Bill team and the Minister to absorb this proposal. However, I do not fully accept that, since Third Reading of the Bill has been set for 4 September, which gives the Government ample time over the summer to get their ducks in a row.
I should have thought that this amendment would command support on all sides of the House and from all noble Lords who have participated in this Bill so far. Although I have had no indication from my noble friend the Minister as to whether he will accept this amendment, I hope he will feel that it is a relatively straightforward thing to do, entirely in line with his purpose, and something that would strengthen the Bill considerably. I beg to move.
My Lords, I shall speak to my Amendment 275A in this group. It would place a duty on Ofcom to report annually on areas where our legal codes need clarification and revision to remain up to date as new technologies emerge—and that is to cover technologies, some of which we have not even thought of yet.
Government Amendments 206 and 209 revealed the need for such an amendment and how it would operate, as they clarify that references to pornographic content in the Bill include content created by a bot. However, emerging technologies will need constant scrutiny.
As the noble Lord, Lord Clement-Jones, asked, what about provider content, which forms the background to the user interaction and may include many harms? For example, would a game backdrop that includes anti-Semitic slurs, a concentration camp, a sex shop or a Ku Klux Klan rally be caught by the Bill?
The Minister confirmed that “content” refers to anything communicated by means of an internet service and that an encounter includes any content that individuals read, view, hear or otherwise experience, making providers liable for the content that they publish. Is this liability under civil, regulatory or criminal law?
As Schedule 1 goes to some lengths to exempt some service-to-provider content, can the Minister provide, for the record, chapter and verse, as requested by the noble Lord, Lord Clement-Jones, on provider liability? In particular, will he confirm whether such content would be dealt with under the Part 3 duties of the online safety regime, or whether users would have to rely on existing law, bringing claims at their own expense through the courts, or the police would have to carry the burden of further enforcement?
Last week, the Minister confirmed that “functionality” captures any feature enabling interactions of any description between service users, but are avatars or objects created by the provider of a service, not by an individual user, in scope and therefore subject to risk assessments and their mitigation requirements? If so, will these functionalities also be added to user empowerment tools, enabling users to opt out of exposure to them, or will they be caught only by child safety duties? Are environments provided by a service provider, such as a backdrop to immersive environments, in scope through the definition of “functionality”, “content” or both? When this is provider content and not user-generated content, will this still hold true?
All this points to a deeper issue. Internet services have become more complex and vivid, with extremely realistic avatars and objects indistinguishable from people and objects in the real world. This amendment avoids focusing on negatives associated with AI and new technologies but tries to ensure that the online world is as safe as the offline world should be. It is worth noting that Interpol is already investigating how to deal with criminals in the metaverse and anticipating crimes against children, data theft, money laundering, fraud and counterfeiting, ransomware, phishing, sexual assault and harassment, among other things. Many of these behaviours operate in grey areas of the law where it is not clear whether legal definitions extend to the metaverse.
Ofcom has an enormous task ahead, but it is best placed to consider the law’s relationship to new technological developments and to inform Parliament. Updating our laws through the mechanisms proposed in Amendment 275A will provide clarity to the courts, judges, police and prosecution service. I urge the Minister to provide as full an answer as possible to the many questions I have posed. I am grateful to him for all the work he has been doing. If he cannot accept my amendment as worded, will he provide an assurance that he will return to this with a government amendment at Third Reading?
My Lords, I will speak to Amendment 191A in my name. I also support Amendment 186A in the name of the noble Lord, Lord Moylan, Amendment 253 in the name of the noble Lord, Lord Clement-Jones, and Amendment 275A in the name of my noble friend Lady Finlay. I hope that my words will provide a certain level of reassurance to the noble Lord, Lord Moylan.
In Committee and on Report, the question was raised as to how to support the coronial system with information, education and professional development to keep pace with the impact of the fast-changing digital world. I very much welcome the Chief Coroner’s commitment to professional development for coroners but, as the Minister said, this is subject to funding. While it is right that the duty falls to the Chief Coroner to honour the independence and expert knowledge associated with his roles, this amendment seeks to support his duties with written guidance from Ofcom, which has no such funding issue since its work will be supported by a levy on regulated companies—a levy that I argue could usefully and desirably contribute to the new duties that benefit coroners and bereaved parents.
The role of a coroner is fundamental. They must know what preliminary questions to ask and how to triage the possibility that a child’s digital life is relevant. They must know that Ofcom is there as a resource and ally and how to activate its powers and support. They must know what to ask Ofcom for, how to analyse information they receive and what follow-up questions might be needed. Importantly, they must feel confident in making a determination and describing the way in which the use of a regulated service has contributed to a child’s death, in the case that that is indeed their finding. They must be able to identify learnings that might prevent similar tragedies happening in the future. Moreover, much of the research and information that Ofcom will gather in the course of its other duties could be usefully directed at coroners. All Amendment 191A would do is add to the list of reports that Ofcom has to produce with these issues in mind. In doing so, it would do the Chief Coroner the service of contributing to his own needs and plans for professional development.
I turn to Amendment 186A in the name of the noble Lord, Lord Moylan, who makes a very significant point in bringing it forward. Enormous effort goes into creating an aura of exceptionality for the tech sector, allowing it to avoid laws and regulations that routinely apply to other sectors. These are businesses that benefit from our laws, such as intellectual property law or international tax law. However, they have negotiated a privileged position in which they have privatised the benefits of our attention and data while outsourcing most of the costs of their service to the public purse or, indeed, their users.
Terms and conditions are a way in which a company enters into a clear agreement with its users, who then “pay” for access with their attention and their data: two of the most valuable commodities in today’s digital society. I am very sympathetic to the noble Lord’s wish to move people, both adults and children, away from the series of euphemisms that the sector employs—such as “users”, “community members”, “creators” or “participants”—and to acknowledge their status as consumers, who have rights and, in particular, the right to expect the product they use to be safe and to hold providers accountable if it is not. I join the noble Lord in asserting that there are now six weeks before Third Reading. This is a very valuable suggestion that is worthy of government attention.
Amendment 253 in the name of the noble Lord, Lord Clement-Jones, puts forward a very strong recommendation of the pre-legislative committee. We were a bit bewildered and surprised that it was not taken up at the time, so I will be interested to hear what argument the Minister makes to exclude it, if indeed he does so. I say to him that I have already experienced the frustration of being bumped from one regulator to another. Although my time as an individual, or the organisational time of a charity, is minor in the picture we are discussing, it is costly in time and resources; the same cost falls on the time, resources and potential effectiveness of the regulatory regime. However well oiled and well funded the regulatory regime of the Online Safety Bill is, I do not think it will be as well oiled and well funded as those that it seeks to regulate.
I make it clear that I accept the arguments of not wanting to create a super-regulator or slow down or confuse existing regulators which each have their own responsibilities, but I feel that the noble Lord, Lord Clement-Jones, has approached this with more of a belt-and-braces approach rather than a whole realignment of regulators. He simply seeks to make it explicit that regulators can, should and do have a legal basis on which to work singularly or together when it suits them. As I indicated earlier, I cannot quite understand why that would not be desirable.
Finally, in what is truly a miscellaneous group, I will refer to the amendment in the name of my noble friend Lady Finlay. I support the intent of this amendment and sincerely hope that the Minister will be able to reassure us that this is already in the Bill and will be done by Ofcom under one duty or another. I hope that he will be able to point to something that includes this. I thank my noble friend for raising it, as it harks back to an amendment in Committee in my name that sought to establish that content deemed harmful in one format would be deemed harmful in all formats—whether synthetic, such as AI, the metaverse or augmented reality. As my noble friend alluded to, it also speaks to the debate we had last week in relation to the amendment from the noble Lord, Lord Clement-Jones, about provider content in the metaverse.
I do not think we have time to wait for the report that my noble friend seeks. This is the long-awaited Online Safety Bill. We have been warned by the inventors of neural networks and leaders in AI and alternate realities that we are at a crossroads between human and machine. It is incumbent on the Government to ensure that the Bill is fit not only for the past but for the future. In order to do that, they need to look at the definitions—as they did so admirably in Part 5—but also at some of the exceptions they have carved out so that they can say that the Bill truly ends the era of exceptionality in which harms online are treated differently from those offline. My view is that the amendment in the name of my noble friend Lady Finlay should not be necessary at this stage. But, if the Minister cannot confirm that it is already covered, perhaps he will indicate his willingness to accept the amendment.
My Lords, I will make some arguments in favour of Amendment 191A, in the name of the noble Baroness, Lady Kidron, and inject some notes of caution around Amendment 186A.
On Amendment 191A, it has been my experience that people who frequently investigate things that have happened on online services do it well, and well-formed requests are critical to making this work effectively. This was the case with law enforcement: when an individual police officer is investigating something online for the first time, they often ask the wrong questions. They do not understand what they can get and what they cannot get. It is like everything in life: the more you do it, the better you get at it.
Fortunately, in a sense, most coroners will only very occasionally have to deal with these awful circumstances where they need data related to the death of a child. At that point, they are going to be very dependent on Ofcom—which will be dealing with the companies day in and day out across a range of issues—for its expertise. Therefore, it makes absolute sense that Ofcom’s expertise should be distributed widely and that coroners—at the point where they need to access this information—should be able to rely on that. So Amendment 191A is very well intended and, from a practical point of view, very necessary if we are going to make this new system work as I know the noble Baroness, Lady Kidron, and I would like to see it work.
On Amendment 186A around consumer law, I can see the attraction of this, as well as some of the read-across from the United States. A lot of the enforcement against online platforms in the US takes place through the Federal Trade Commission precisely in this area of consumer law and looking at unfair and deceptive practices. I can see the attraction of seeking to align with European Union law, as the noble Lord, Lord Moylan, argued we should be doing with respect to consumer law. However, I think this would be much better dealt with in the context of the digital markets Bill and it would be a mistake to squeeze it in here. My reasons for this are about both process and substance.
In terms of process, we have not done an impact assessment on this. It is quite a major change, for two reasons. First, it could potentially have a huge impact in terms of legal costs and the way businesses will have to deal with them—although I know nobody is going to get too upset if the impact assessment says there will be a significant increase in legal costs for category 1 companies. However, we should at least flesh these things out in an impact assessment before going ahead and doing something that would have a material impact.
Secondly, in process terms, there are some really interesting questions about the way this might affect the market. The consumer law we have excludes services that are offered for free, because so much of consumer law is about saying, “If the goods are not delivered correctly, you get your money back”. With free services, we are clearly dealing with a different model, so a law geared towards making sure you either get the goods or get your money back may not be the best fit. Trying to shoehorn in these free-at-the-point-of-use services may not be the best way to do it, even from a markets and consumer point of view. Taking our time to think about how to get this right would make sense.
More fundamentally, in terms of the substance, we need to recognise that, as a result of the Online Safety Bill, Ofcom will be requiring regulated services to rewrite their terms of service in quite a lot of detail. We see this throughout the Bill. We are going to have to do all sorts of things—we will debate other amendments in this area today—to make sure that their terms of service are conformant with what we want from them in this Bill. They are going to have to redo their complaints and redress mechanisms. All of this is going to have to change and Ofcom is going to be the regulator that tells them how to do it; that is what we are asking Ofcom to tell them to do.
My fundamental concern here, if we introduce another element, is that there is a whole different structure under consumer law where you might go to local trading standards or the CMA, or you might launch a private action. In many cases, this may overlap. The overlap is where consumer law states that services must be provided with reasonable care and skill and in a reasonable time. That sounds great, but it is also what the Online Safety Bill is going to be doing. We do not want consumer law saying, “You need to write your terms of service this way and handle complaints this way”, and then Ofcom coming along and saying, “No, you must write your terms of service that way and handle complaints that way”. We will end up in a mess. So I just think that, from a practical point of view, we should be very focused in this Bill on getting all of this right from an Online Safety Bill point of view, and very cautious about introducing another element.
Perhaps one of the attractions of the consumer law point for those who support the amendment is that it says, “Your terms must be fair”. It is the US model; you cannot have unfair terms. Again, I can imagine a scenario in which somebody goes to court and tries to get the terms struck down because they are unfair but the platform says, “They’re the terms Ofcom told me to write. Sort this out, please, because Ofcom is saying I need to do this but the courts are now saying the thing I did was unfair because somebody feels that they were badly treated”.
Does the noble Lord accept that that is already a possibility? You can bring an action in contract law against them on the grounds that it is an unfair contract. This could happen already. It is as if the noble Lord is not aware that the possibility of individual action for breach of contract is already built into Clause 65. This measure simply supplements it.
I am certainly aware that it is there but, again, the noble Lord has just made the point himself: this supplements it. The intent of the amendment is to give consumers more rights under this additional piece of legislation; otherwise, why bother with the amendment at all? The noble Lord may be arguing against himself in saying that this is unnecessary and, at the same time, that we need to make the change. If we make the change, it is, in a sense, a material change to open the door to more claims being made under consumer law that terms are unfair. As I say, we may want this outcome to happen eventually, but I find it potentially conflicting to do it precisely at a time when we are getting Ofcom to intervene much more closely in setting those terms. I am simply arguing, “Let’s let that regime settle down”.
The net result and rational outcome—again, I am speaking to my noble friend’s Amendment 253 here—may be that other regulators end up deferring to Ofcom. If Ofcom is the primary regulator and we have told it, under the terms of the Online Safety Bill, “You must require platforms to operate in this way, handle complaints in this way and have terms that do these things, such as excluding particular forms of language and in effect outlawing them on platforms”, the other regulators will eventually end up deferring to it. All I am arguing is that, at this stage, it is premature to try to introduce a second, parallel route for people to seek changes to terms or different forms of redress, however tempting that may be. So I am suggesting a note of caution. It is not that we are starting from Ground Zero—people have routes to go forward today—but I worry about introducing something that I think people will see as material at this late stage, having not looked at the full impact of it and potentially running in conflict with everything else that we are trying to do in this legislation.
My Lords, I will speak briefly on a couple of amendments and pick up from where the noble Lord, Lord Allan, just finished on Amendment 186A. I associate myself with all the comments that the noble Baroness, Lady Kidron, made on her Amendment 191A. As ever, she introduced the amendment so brilliantly that there is no need for me to add anything other than my wholehearted support.
I will briefly reference Amendment 253 from the noble Lord, Lord Clement-Jones. Both his amendment and my noble friend Lord Moylan’s point to one of the challenges about regulating the digital world, which is that it touches everything. We oscillate between wanting to compartmentalise the digital and recognising that it is interconnected to everything. That is the same challenge faced by every organisation that is trying to digitise: do you ring-fence or recognise that it touches everything? I am very supportive of the principles behind Amendment 253 precisely because, in the end, it does touch everything. It is hugely important that, even though this Bill and others still to come are creating an extraordinarily powerful single regulator in the form of Ofcom, we also recognise the interconnectivity of the regulatory landscape. The amendment is very well placed, and I hope my noble friend the Minister looks favourably on it and its heritage from the pre-legislative scrutiny committee.
I will briefly add my thoughts on Amendment 186A in this miscellaneous group. It feels very much as if we are having a Committee debate on this amendment, and I thank my noble friend Lord Moylan for introducing it. He raises a hugely important point, and I am incredibly sympathetic to the logic he set out.
In this area the digital world operates differently from the physical world, and we do not have the right balance at all between the powers of the big companies and consumer rights. I am completely with my noble friend in the spirit in which he introduced the amendment but, together with the noble Lord, Lord Allan, I think it would be better tackled in the Digital Markets, Competition and Consumers Bill, precisely because it is much broader than online safety. This fundamentally touches the issue of consumer rights in the digital world and I am worried that, if we are not careful, we will do something with the very best intentions that actually makes things slightly worse.
I worry that the terms and conditions of user-to-user services are incomprehensible to consumers today. Enshrining them as a contract in law might, in some cases, make things worse. Today, when user-to-user services have used our data for something, they are keen to tell us that we agreed to it because it was in their terms of service. My noble friend opens up a really important issue to which we should give proper attention when the Digital Markets, Competition and Consumers Bill arrives in the House. It is genuinely not too late to address it, as that Bill is working its way through the Commons now. I thank my noble friend for introducing the amendment, because we should all have thought of the issue earlier, but it is much broader than online safety.
My Lords, even by previous standards, this is the most miscellaneous of miscellaneous groups. We have ranged very broadly. I will speak first to Amendment 191A from the noble Baroness, Lady Kidron, which was so well spoken to by her and by the noble Baroness, Lady Harding. It is common sense, and my noble friend Lord Allan, as ever, put his finger on it: it is not as if coroners are going to come across this every day of the week; they need this kind of guidance. The Minister has introduced his amendments on this, and we need to reduce those to an understandable code for coroners and bereaved parents. I defy anybody, apart from about three Members of this House, to describe in any detail how the information notices will interlock and operate. I could probably name those Members off the top of my head. That demonstrates why we need such a code of practice. It speaks for itself.
I am hugely sympathetic to Amendment 275A in the name of the noble Baroness, Lady Finlay, who asked a series of important questions. The Minister said at col. 1773 that he would follow up with further information on the responsibility of private providers for their content. This is a real, live issue. The noble Baroness, Lady Kidron, put it right: we hope fervently that the Bill covers the issue. I do not know how many debates about future-proofing we have had on the Bill but each time, including in that last debate, we have not quite been reassured enough that we are covering the metaverse and provider content in the way we should be. I hope that this time the Minister can give us definitive chapter and verse that will help to settle the horses, so to speak, because that is exactly what the very good amendment in the name of the noble Baroness, Lady Finlay, was about.
On the amendment in the name of the noble Lord, Lord Moylan, I think I am rather more in favour of it in principle than is my noble friend, although I do not think he was against it in principle, and I was probably on exactly the same page as far as process is concerned. The noble Baroness, Lady Harding, put her finger on it, because a right of action is highly desirable; in fact, the Joint Committee was extremely keen on having a right of action for those affected by social media, and it made an important recommendation. In a sense, that was not seen through to a sufficient extent. We believe that the Bill is an opportunity to reset the relationship between service providers and users. While we recognise the resource challenges both for individuals in accessing the courts and for the courts themselves, we think that the importance of issues in the Bill requires that users have a right of redress in the court. I am absolutely with the noble Lord but am not entirely sure about shoehorning that into the Bill at this stage, given that other digital services acquire data as well. This should not be covered just by the Online Safety Bill; I believe the DMCC Bill should cover it, and I very much look forward to supporting an amendment from the noble Lord, Lord Moylan, if he chooses to table it when the time comes.
On my Amendment 253, we have heard throughout debates on the Bill that the range of human and business activity covered online presents a complex map of potential harms. Some of them will fall into or be adjacent to the oversight of other regulators with domain-specific expertise. The relationship has to some extent been formalised through the Digital Regulation Cooperation Forum, which comprises Ofcom, the CMA, the ICO and the FCA, and I think we all support the creation of the DRCF. Ofcom already has a working relationship with the ASA, and of course with the Internet Watch Foundation—I was very pleased to hear from it that progress is being made on a memorandum of understanding with Ofcom, and I very much hope that continues to progress, as it has since Committee, because that kind of relationship that Ofcom and the IWF continue to build is really important.
Within that regulatory web, if you like, Ofcom will of course have the most relevant powers and expertise, and many regulators will look to it for help in tackling online safety issues. Effective public protection will be achieved through what might be described as regulatory interlock. To protect the public, Ofcom should be explicitly empowered to co-operate with others and to share information, and the Bill should, as much as it can, enable Ofcom to work with other regulators and share online safety information with them. Ofcom should also be able to bring the immense skills of other regulators into its own work. The Bill gives Ofcom the general ability to co-operate with overseas regulators, but it is largely silent on co-operation with UK ones. The Communications Act 2003 limits the UK regulators with which Ofcom can share information—it excludes the ICO, for instance—yet the Bill has a permissive approach to overseas regulators.
The Bill should extend co-operation and information sharing in respect of online safety to include regulators overseeing the offences in Schedule 7, the primary priority harms for children and the priority harms to adults. Elsewhere in regulation, it is noted that the Financial Conduct Authority has a general duty to co-operate, so there are precedents. As the noble Baroness, Lady Kidron, pointed out, the Joint Committee was extremely keen on this. It said:
“In taking on its responsibilities under the Bill, Ofcom will be working with a network of other regulators and third parties already working in the digital world. We recommend that the Bill provide a framework for how these bodies will work together including when and how they will share powers, take joint action, and conduct joint investigations”.
The logic is absolutely in favour of this. I very much hope that the Minister will be able to take it on board because it has been seen to be logical right from the word go, particularly by the Joint Committee. I think the Communications and Digital Committee also recommended it, so what is not to like?
My Lords, as others have said, this has been a very interesting tour d’horizon of some of the points in the Bill that we still need to resolve. I will not go over too much of the detail that has been raised because those points need a response from the Minister when he responds.
I will start with the use of “chairman” in several places throughout the Bill. We do not understand what is going on here. My noble friend Lady Merron wanted to deal with this but she unfortunately is not here, so I have been left holding the issue, and I wish to pursue it vigorously.
It is probably not well known but, in 2007, the Government decided that there ought to be changes in the drafting of our laws to make them gender-neutral as much as possible. Since 2007, it has been customary practice to replace words that could be gender-specific with those which are not. The Drafting Guidance, which is issued and should be followed by the Office of the Parliamentary Counsel, says that gender-neutral drafting requires
“avoiding gender-specific pronouns (such as ‘he’) for a person who is not necessarily of that gender”,
and avoiding gender-specific nouns
“that might appear to assume that a person of a particular gender will do a particular job or perform a particular role (eg ‘chairman’)”.
The guidance provides another bit of extra information:
“The gender-specific noun most likely to be encountered is ‘chairman’. ‘Chair’ is now widely used in primary legislation as a substitute”,
and we should expect to see it. Why do we not see it in this Bill?
My wife, who is chairman of a number of things, objects to “chair” as “furniturism”. She likes to be referred to as a person and not a thing.
I respect the noble Lord’s point. I did not make a specific proposal; I simply asked why the Bill was framed in circumstances that are not those required by the Office of the Parliamentary Counsel.
Moving on, Amendment 288A, which addresses the issue of multiple characteristics, is welcome. I am grateful to the Minister for it. However, it is a rather odd solution to what should be a very straightforward question. We have the amendment—which, as I said, we welcome—because it was pointed out that the new overarching objective for this Bill that has been brought forward by government amendment refers to issues affecting those who have a characteristic. It uses the word “characteristic” without qualification, although I think most of us who have been involved in these debates and discussions realise that this is an oblique reference to the Equality Act 2010 and that, although they are not set out in the Bill, the characteristics concerned are probably those that are protected under the Equality Act. I think the Minister has some problem with the Equality Act, because we have crossed swords on it before, but we will not go back into that.
In referencing “a characteristic”, which is perfectly proper, we did not realise—but it has been pointed out—that under the Interpretation Act it is necessary to recall that, when legislation mentions the singular, it includes the plural unless the plural is specifically excluded. So we can assume that references to “a characteristic” do in fact mean “characteristics”. By the same logic, when the Bill refers to a person having “a characteristic”, we can assume that the reference also applies to a person having more than one characteristic.
However, grateful as I am to the Minister for bringing forward these amendments, which we accept, this is not quite the point that we were trying to get across. I invite the Minister, when he comes to respond, to explain a little more about the logic behind what I will propose. We are fairly convinced—as I think are most people who have been involved in these discussions—that the way social media companies operate, in providing the materials and services that we want, is gendered. I do not think there is any doubt about that; everybody who has spoken in this debate has at some stage pointed out that, in many cases, those with protected characteristics, and women and girls in particular, are often picked on and singled out. A pile-on—the phrase used to mean the amplification that comes with the way the internet works—is a very serious concern. That may change; it may just be a feature of today’s world and one day be something that does not happen. However, at the moment, it is clearly the case that if one is marked out by a protected characteristic, and particularly by more than one, one tends to get more attention, aggravation and difficulty in one’s social media engagement. The evidence is so clear that we do not need to go into it.
The question we asked in Committee, and to which we hoped we would get a response, was whether we should always try to highlight the fact that, where we are talking about people with more than one characteristic, it is the combination, not the plurality, that matters. Being female and Jewish, which has been discussed several times from the Dispatch Box by my noble friend Lady Merron and others, seems to be the sort of combination of characteristics that causes real difficulties on the internet for the people who have them. I use that only as one example; there are others.
If that is the case then it would have been nice to have seen that specifically picked up, and my original drafting of the amendment did that. However, we have accepted the Government’s amendment to create the new overarching objective, and I do not want to change it at this stage—we are past that debate. But I wonder whether the Minister, when he comes to respond, could perhaps as a grace note explain that he accepts the point that it is the doubling or tripling of the characteristics, not the plurality, that matters.
Moving back to the clauses that have been raised by others speaking in this debate, and who have made points that need to be responded to, I want to pick up on the point made by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, about the need for some form of engagement between domestic regulators if we are going to get the best possible solution to how the internet is regulated as we go forward. We have never said that there had to be a new super-regulator and we never intended that there should be powers taken to change the way in which we do this. However, some form of co-operation, other than informal co-operation, is almost certainly going to be necessary. We do not want to subtract from where we are in relation to how our current regulators operate—they seem to be working well—but we worry that the legal powers and support that might be required in order to do that are not yet in place or, if they are in place, are based on somewhat archaic and certainly not modern-day regulatory practice.
Is this something that the committees of the Houses might consider? Perhaps when we come to other amendments around this, that is something we might pick up, because I think it probably needs further consideration away from the Bill in order to get the best possible solution. That is particularly true given, as the noble Lord, Lord Clement-Jones, says, so many of these regulators will now have experience of working together and might be prepared to share that in evidence or in appearances before such a committee.
Finally, I refer to the very good discussion we have had about Amendment 186A, which was introduced by the noble Lord, Lord Moylan. Like many people who received his initial circulation of his draft amendment, I wondered why on earth I had not thought of it myself. It is a good and obvious move that we should think a little more about. It probably needs a lot more thought about the unintended consequences that might arise before we move forward on it, and I take the points made by the noble Lord, Lord Allan, about that, but I hope that the Minister will respond positively to it and that it is perhaps something we can pick up in future Bills.
My Lords, let me add to this miscellany by speaking to the government amendments that stand in my name as part of this group. The first is Amendment 288A, which we mentioned on the first group of amendments on Report because it relates to the new introductory clause, Clause 1, and responds to the points raised by the noble Lord, Lord Stevenson of Balmacara. I am very happy to say again that the Government recognise that people with multiple and combined characteristics suffer disproportionately online and are often at greater risk of harm. This amendment therefore adds a provision in the new interpretation clause, Clause 1, to put beyond doubt that all the references to people with “a certain characteristic” throughout the Bill include people with a combination of characteristics. We had a good debate about the Interpretation Act 1978, which sets that out, but we are happy to set it out clearly here.
In his Amendment 186A, my noble friend Lord Moylan seeks to clarify a broader issue relating to consumer rights and online platforms. He got some general support—certainly gratitude—for raising this issue, although there was a bit of a Committee-style airing of it and a mixture of views on whether this is the right way or the right place. The amendment seeks to make it clear that certain protections for consumers in the Consumer Rights Act 2015 apply when people use online services and do not pay for them but rather give up their personal data in exchange. The Government are aware that the application of the law in that area is not always clear in relation to free digital services and, like many noble Lords, express our gratitude to my noble friend for highlighting the issue through his amendment.
We do not think that the Bill is the right vehicle for attempting to provide clarification on this point, however. We share some of the cautions that the noble Lord, Lord Allan of Hallam, raised and agree with my noble friend Lady Harding of Winscombe that this is part of a broader question about consumer rights online beyond the services with which the Bill is principally concerned. It could be preferable that the principle that my noble friend Lord Moylan seeks to establish through his amendment should apply more widely than merely to category 1 services regulated under the Bill. I assure him that the Bill will create a number of duties on providers which will benefit users and clarify that they have existing rights of action in the courts. We discussed these new protections in depth in Committee and earlier on Report. He drew attention to Clause 65(1), which puts a requirement on all services, not just category 1 services, to include clear and accessible provisions in their terms of service informing users about their right to bring a claim for breach of contract. Therefore, while we are grateful, we agree with noble Lords who suggested that this is a debate for another day and another Bill.
Amendment 191A from the noble Baroness, Lady Kidron, would require Ofcom to issue guidance for coroners and procurators fiscal to aid them in submitting requests to Ofcom to exercise its power to obtain information from providers about the use of a service by a deceased child. While I am sympathetic to her intention, I do not think that her amendment is the right answer. It would be inappropriate for an agency of the Executive to issue guidance to a branch of the judiciary. As I explained in Committee, it is for the Chief Coroner to provide detailed guidance to coroners. This is written to assist coroners with the law and their legal duties and to provide commentary and advice on policy and practice.
The amendment tabled by the noble Baroness cuts across the role of the Chief Coroner and risks compromising the judicial independence of the coroner, as set out in the Constitutional Reform Act 2005. As she is aware, the Chief Coroner has agreed to consider issuing guidance to coroners on social media and to consider the issues covered in the Bill. He has also agreed to explore whether coroners would benefit from additional training, with the offer of consultation with experts including Ofcom and the Information Commissioner’s Office. I suggest that the better approach would be for Ofcom and the Information Commissioner’s Office to support the Chief Coroner in his consideration of these issues where he would find that helpful.
I agree with the noble Lord, Lord Allan, that coroners must have access to online safety expertise given the technical and fast-moving nature of this sector. As we have discussed previously, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest following a request from a coroner which will provide that expertise. I hope that this reassures the noble Baroness.
I understand the report on a specific death, which is very welcome and part of the regime as we all see it. The very long list of things that the coroner may not know that they do not know, as I set out in the amendment, is the issue which I and other noble Lords are concerned about. If the Government could find a way to make that possible, I would be very grateful.
We are keen to ensure that coroners have access to the information and expertise that they need, while respecting the independence of the judicial process—it is for coroners to decide what they do not know and would like to know more about—and the role of the Chief Coroner. It is a point that I have discussed a lot with the noble Baroness and with my noble friend Lady Newlove in her former role as Victims’ Commissioner. I am very happy to continue doing so, because it is important that there is access to that expertise.
The noble Lord, Lord Stevenson, spoke to the amendments tabled by the noble Baroness, Lady Merron, about supposedly gendered language in relation to Clauses 141 and 157. As I made clear in Committee, I appreciate the intention—as does Lady Deben—of making clear that a person of either sex can perform the role of chairman, just as they can perform the role of ombudsman. We have discussed in Committee the semantic point there. The Government have used “chairman” here to be consistent with terminology in the Office of Communications Act 2002. I appreciate that this predates the Written Ministerial Statement which the noble Lord cited, but that itself made clear that the Government at the time recognised that in practice, parliamentary counsel would need to adopt a flexible approach to this change—for example, in at least some of the cases where existing legislation originally drafted in the former style is being amended.
The noble Lord may be aware of a further Written Ministerial Statement, made on 23 May last year, following our debates on gendered language on another Bill, when the then Lord President of the Council and Leader of the House of Commons said that the Office of the Parliamentary Counsel would update its drafting guidance in light of that. That guidance is still forthcoming. However, importantly, the term here will have no bearing on Ofcom’s decision-making on who should chair the advisory committees it must establish; that could indeed be a person of either sex.
Amendment 253 seeks to enable co-operation, particularly via information-sharing, between Ofcom and other regulators within the UK. I reassure noble Lords that Section 393 of the Communications Act 2003 already includes provisions for sharing information between Ofcom and other regulators in the UK.
As has been noted, Ofcom already co-operates effectively with other domestic regulators. That has been strengthened by the establishment of the Digital Regulation Co-operation Forum. By promoting greater coherence, the forum helps to resolve potential tensions, offering clarity for people and the industry. It ensures collaborative work across areas of common interest to address complex problems. Its outputs have already delivered real and wide-ranging impacts, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues, and horizon-scanning activities on new regulatory challenges. We will continue to assess how best to support collaboration between digital regulators and to ensure that their approaches are joined up. We therefore do not think that Amendment 253 is necessary.
My Lords, the Minister has not stated that there is a duty to collaborate. Is he saying that that is, in fact, the case in practice?
Yes, there is a duty, and the law should be followed. I am not sure whether the noble Lord is suggesting that it is not—
Is there a duty to collaborate between regulators?
I am not sure that I follow the noble Lord’s question, but perhaps—
My Lords, the Minister is saying that, in practice, there is a kind of collaboration between regulators and that there is a power under the Communications Act, but is he saying that there is any kind of duty on regulators to collaborate?
If I may, I will write to the noble Lord setting that out; he has lost me with his question. We believe, as I think he said, that the forum has added to the collaboration in this important area.
The noble Baroness, Lady Finlay, raised important questions about avatars and virtual characters. The Bill broadly defines “content” as
“anything communicated by means of an internet service”,
meaning that it already captures the various ways through which users may encounter content. In the metaverse, this could therefore include things such as avatars or characters created by users. As part of the user-to-user services’ risk assessments, providers will be required to consider more than the risk in relation to user-generated content, including aspects such as how the design and operation of their services, including functionality and how the service is used, might increase the risk of harm to children and the presence of illegal content. A user-to-user service will need to consider any feature which enables interaction of any description between users of the service when carrying out its risk assessments.
The Bill is focused on user-to-user and search services, as there is significant evidence to support the case for regulation based on the risk of harm to users and the current lack of regulatory and other accountability in this area. Hosting, sharing and the discovery of user-generated content and activity give rise to a range of online harms, which is why we have focused on those services. The Bill does not regulate content published by user-to-user service providers themselves; instead, providers are already liable for the content that they publish on their services themselves, and the criminal law is the most appropriate mechanism for dealing with services which publish illegal provider content.
The noble Baroness’s Amendment 275A seeks to require Ofcom to produce a wide-ranging report of behaviour facilitated by emerging technologies. As we discussed in Committee, the Government of course agree that Ofcom needs continually to assess future risks and the capacity of emerging technologies to cause harm. That is why the Bill already contains provisions which allow it to carry out broad horizon scanning, such as its extensive powers to gather information, to commission skilled persons’ reports and to require providers to produce transparency reports. Ofcom has already indicated that it plans to research emerging technologies, and the Bill will require it to update its risk assessments, risk profiles and codes of practice with the outcomes of this research where relevant.
As we touched on in Committee, Clause 56 requires regular reviews by Ofcom into the incidence of content that is harmful to children, and whether there should be changes to regulations setting out the kinds of content that are harmful to children. In addition, Clause 143 mandates that Ofcom should investigate users’ experience of regulated services, which are likely to cover user interactions in virtual spaces, such as the metaverse and those involving content generated by artificial intelligence.
I reiterate that platforms on which user-generated interactions take place are in scope of the Bill. That includes the metaverse as well as other extended reality services which have been raised by a number of noble Lords as an area of concern. In Committee, the noble Baroness, Lady Finlay, tabled a similar amendment in relation to real-world physical crimes, such as sexual assault. Although her present amendment pertains to the regulatory framework, I reassure her and all noble Lords that criminal offences are drafted so as to avoid, where possible, specifying any medium or technology through which they might be committed so that they too are future-proofed. Many current criminal offences can therefore be committed and prosecuted regardless of whether the behaviour is conducted online or offline.
The report that the noble Baroness seeks through this amendment would be a broad expansion of Ofcom’s oversight responsibilities to services that are not in scope of the Bill. As a result, I am afraid I cannot commit to taking that forward in relation to this Bill but I am very happy to keep discussing the issue with her more broadly, as is my noble friend Lord Camrose, as a Minister at the Department for Science, Innovation and Technology. I hope that provides her with sufficient reassurance to not press her amendment today.
I am most grateful to the Minister; perhaps I could just check something he said. There was a great deal of detail and I was trying to capture it. On the question of harms to children, we all understand that harms to children are viewed more extensively than harms to others, but I wondered: what counts as an unregulated service? The Minister was talking about regulated services. What happens if there is machine-generated content which is not generated by any user but by code that has been developed and then randomly incites problematic behaviours?
I am happy to provide further detail in writing and to reiterate the points I have made as it is rather technical. Content that is published by providers of user-to-user services themselves is not regulated by the Bill because providers are liable for the content they publish on the services themselves. Of course, that does not apply to pornography, which we know poses a particular risk to children online and is regulated through Part 5 of the Bill. I will set out in writing, I hope more clearly, for the noble Baroness what is in scope to reassure her about the way the Bill addresses the harms that she has rightly raised.
Will the Minister copy other Members in?
My Lords, this has indeed been a wide-ranging and miscellaneous debate. I hope that since we are considering the Bill on Report noble Lords will forgive me if I do not endeavour to summarise all the different speeches and confine myself to one or two points.
The first is to thank the noble Baroness, Lady Kidron, for her support for my amendment but also to say that having heard her argument in favour of her Amendment 191A, I think the difference between us is entirely semantic. Had she worded it so as to say that Ofcom should be under a duty to offer advice to the Chief Coroner, as opposed to guidance to coroners, I would have been very much happier with it. Guidance issued under statute has to carry very considerable weight and, as my noble friend the Minister said, there is a real danger in that case of an arm of the Executive, if you like, or a creature of Parliament—however one wants to regard Ofcom—interfering in the independence of the judiciary. Had she said “advice to the Chief Coroner and whoever is the appropriate officer in Scotland”, that would have been something I could have given wholehearted support to. I hope she will forgive me for raising that quibble at the outset, but I think it is a quibble rather than a substantial disagreement.
On my own amendment, I simply say that I am grateful to my noble friend for the brevity and economy with which he disposed of it. He was of course assisted in that by the remarks and arguments made by many other noble Lords in the House as they expressed their support for it in principle.
I think there is a degree of confusion about what the Bill is doing. There seemed to be a sense that somehow the amendment was giving individuals the right to bring actions in the courts against providers, but of course that already happens because that right exists and is enshrined in Clause 65. All the amendment would do is give some balance so that consumers actually had some protections in what is normally, in essence, an unequal contest: trying to ensure that a large company enforces the terms and contracts that it has written.
In particular, my amendment would give, as I think noble Lords know, the right to demand repeat performance—that is, in essence, the right to have things put right, not monetary compensation—and it would frustrate any attempts by providers, in drafting their own terms and conditions, to limit their own liability. That is of course what they seek to do, but the Consumer Rights Act prevents them from doing so.
We will say no more about that for now. With that, I beg leave to withdraw my amendment.
Amendment 186A withdrawn.
Clause 65: Further duties about terms of service
Amendment 187
Moved by
187: Clause 65, page 62, line 18, leave out from “service” to “down” in line 20 and insert “indicate (in whatever words) that the presence of a particular kind of regulated user-generated content is prohibited on the service, the provider takes”
Member’s explanatory statement
This amendment makes a change to a provision about what the terms of service of a Category 1 service say. The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
My Lords, transparency and accountability are at the heart of the regulatory framework that the Bill seeks to establish. It is vital that Ofcom has the powers it needs to require companies to publish online safety information and to scrutinise their systems and processes, particularly their algorithms. The Government agree about the importance of improving data sharing with independent researchers while recognising the nascent evidence base and the complexities of this issue, which we explored in Committee. We are pleased to be bringing forward a number of amendments to strengthen platforms’ transparency, which confer on Ofcom new powers to assess how providers’ algorithms work, which accelerate the development of the evidence base regarding researchers’ access to information and which require Ofcom to produce guidance on this issue.
Amendment 187 in my name makes changes to Clause 65 on category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. The amendment tightens the clause to ensure that all the terms through which a provider might indicate that a certain kind of content is not allowed on its service are captured by these duties.
Amendment 252G is a drafting change, removing a redundant paragraph from the Bill in relation to exceptions to the legislative definition of an enforceable requirement in Schedule 12.
In relation to transparency, government Amendments 195, 196, 198 and 199 expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports. With thanks to the noble Lord, Lord Stevenson of Balmacara, for his engagement on this issue, we are pleased to table these amendments, which will allow Ofcom to require providers to publish information relating to the formulation, development and scope of user-to-user service providers’ terms of service and search service providers’ public statements of policies and procedures. This is in addition to the existing transparency provision regarding their application.
Amendments 196 and 199 would enable Ofcom to require providers to publish more information in relation to algorithms, specifically information about the design and operation of algorithms that affect the display, promotion, restriction, discovery or recommendation of content subject to the duties in the Bill. These changes will enable greater public scrutiny of providers’ terms of service and their algorithms, providing valuable information to users about the platforms that they are using.
As well as publicly holding platforms to account, the regulator must be able to get under the bonnet and scrutinise the algorithms’ functionalities and the other systems and processes that they use. Empirical tests are a standard method for understanding the performance of an algorithmic system. They involve taking a test data set, running it through an algorithmic system and observing the output. These tests may be relevant for assessing the efficacy and wider impacts of content moderation technology, age-verification systems and recommender systems.
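Purely by way of illustration—a minimal sketch, in which every name (classify_content, run_empirical_test, TEST_POSTS) is invented for the example rather than drawn from the Bill or from Ofcom’s actual tooling—such an empirical test might look like this in outline: a fixed test data set is run through an algorithmic system and the outputs are observed.

```python
# Hypothetical sketch of an empirical test of an algorithmic system:
# run a known test data set through the system and observe the output.

# A small, fixed test data set (invented for illustration).
TEST_POSTS = [
    {"id": 1, "text": "A harmless holiday photo caption"},
    {"id": 2, "text": "Content of a kind the terms of service say must be taken down"},
]

def classify_content(text: str) -> str:
    """Stand-in for a provider's real content moderation algorithm."""
    return "remove" if "taken down" in text else "allow"

def run_empirical_test(posts: list) -> dict:
    """Run every test item through the system and record the observed outputs."""
    outcomes = {}
    for post in posts:
        outcomes[post["id"]] = classify_content(post["text"])
    return outcomes

if __name__ == "__main__":
    results = run_empirical_test(TEST_POSTS)
    # A regulator could compare these observed outputs against the provider's
    # stated terms of service to assess how the system performs in practice.
    print(results)
```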
Government Amendments 247A, 250A, 252A, 252B, 252C, 252D, 252E and 252F will ensure that Ofcom has the powers to enable it to direct and observe such tests remotely. This will significantly bolster Ofcom’s ability to assess how a provider’s algorithms work, and therefore to assess its compliance with the duties in the Bill. I understand that certain technology companies have voiced some concerns about these powers, but I reassure your Lordships that they are necessary and proportionate.
The powers will be subject to a number of safeguards. First, they are limited to viewing information. Ofcom will be unable to remotely access or interfere with the service for any other purpose when exercising the power. These tests would be performed offline, meaning that they would not affect the services’ provision or the experience of users. Assessing systems, processes, features and functionalities is the focus of the powers. As such, individual user data and content are unlikely to be the focus of any remote access to view information.
Additionally, the power can be used only where it is proportionate to use in the exercise of Ofcom’s functions—for example, when investigating whether a regulated service has complied with relevant safety duties. A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
The Bill contains no restriction on services making the existence and detail of the information notice public. Should a regulated service wish to challenge an information notice served on it by Ofcom, it would be able to do so through judicial review. In addition, the amendments place no restriction on the use of this power being made known to members of the public through a request, such as one under the Freedom of Information Act—noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information it has obtained through its exercise of these powers without the provider’s consent, unless permitted for specific, defined purposes. These powers are necessary and proportionate and will ensure that Ofcom has the tools to understand features and functionalities and the risks associated with them, and therefore the tools to assess companies’ compliance with the Bill.
Finally, I turn to researchers’ access to data. We recognise the valuable work of researchers in improving our collective understanding of the issues we have debated throughout our scrutiny of the Bill. However, we are also aware that we need to develop the evidence base to ensure that any sharing of sensitive information between companies and researchers can be done safely and securely. To this end, we are pleased to table government Amendments 272B, 272C and 272D.
Government Amendment 272B would require Ofcom to publish its report into researcher access to information within 18 months, rather than two years. This report will provide the evidence base for government Amendments 272C and 272D, which would require Ofcom to publish guidance on this issue. This will provide valuable, evidence-based guidance on how to improve access for researchers safely and securely.
That said, we understand the calls for further action in this area. The Government will explore this issue further and report back to your Lordships’ House on whether further measures to support researchers’ access to data are required—and if so, whether they could be implemented through other legislation, such as the Data Protection and Digital Information Bill. I beg to move.
My Lords, Amendment 247B in my name was triggered by government Amendment 247A, which the Minister just introduced. I want to explain it, because the government amendment is quite late—it has arrived on Report—so we need to look in some detail at what the Government have proposed. The phrasing that has caused so much concern, which the Minister has acknowledged, is that Ofcom will be able to
“remotely access the service provided by the person”.
It is those words—“remotely access”—which are trigger words for anyone who lived through the Snowden disclosures, where everyone was so concerned about remote access by government agencies to precisely the same services we are talking about today: social media services.
So I hope the House will forgive me for teasing out why this is important and why we need extensive safeguards. This is non-trivial. If you are out there running a service, the idea of a government agency having remote access to your systems is a big deal, and the fact that this has come so late compounds that fear because it feels like it is being snuck in.
Ofcom, as I am sure it will remind us, is independent, but in the last debate it was referred to as part of the UK Executive. We cannot have it both ways. I see the chairman of Ofcom shaking his head at that, but we have just had a debate at the heart of which was the notion that this part of the UK Executive would give guidance to part of the UK judiciary. What is sauce for the goose is sauce for the gander. Here, the concern is that Ofcom would be seen as part of the UK Executive having remote access to social media services run by independent companies; I think your Lordships can see why that triggers things.
We need to establish in this debate that what the Government have in mind for Ofcom—this independent regulator—is utterly different from what might be the case with, for example, a security service under the terms of investigatory powers legislation. On the face of it, it looks similar, so it is important that, if it is different—as the Minister has started, and I think will continue, to argue—we establish exactly why it is different and how we can be confident that it is different. Amendment 247B in my name and that of my noble friend was intended to be helpful to the Government as one way of trying to establish why this is different, by placing some limitations on the Bill.
There are two risks inherent in this notion of remote access. The first is that Ofcom is overintrusive. It has an oversight role, but we are not expecting Ofcom to run our social media services; we expect it to oversee social media services that run themselves. Clearly, remote access, if used in an overbearing way, could be excessively intrusive in relation to those services being able to do what they do. In one version of remote access, which I think the Minister has tried to tease out, it is an offline exercise done occasionally to check something in a quasi-academic way; in another form, Ofcom sits there with a dashboard of what is going on in these systems, just as the Government like to do in their own public services with health and other things. Ofcom with a dashboard looking at what is happening in real time is quite different, and I think would be seen as overbearing and excessively intrusive. I hope the Minister will be able to provide further assurances that that is not what they have in mind.
The second risk is that the access is used for purposes other than simply Ofcom’s purposes. I am certainly not a conspiracy theorist and, perhaps unusually in my community of tech people, I quite admire the people who have only first names and live in Cheltenham, because I think what they do does keep us safe. That is what we pay them to do: to be creative and find creative ways to access data under lawful authority, et cetera, fully respecting human rights. I have confidence that those I have met do that and they do a great job; but we pay them to be creative and find access to data, not to put up with barriers. A spy is gonna spy. If they know that there is a form of access to data, of course, their job is to look at whether that would be useful to them.
My understanding—not as a conspiracy theory but as a matter of fact as to how the law works—is that, under investigatory powers rules, they can issue secret warrants, appropriately signed off, to pretty much anyone to access data. The recipients of those warrants have to execute them and, under penalty of prosecution themselves, are not able to tell anyone they received the warrant. Ofcom is not exempt from that. That is a fact, and we should recognise it; so, were Ofcom to receive an appropriate warrant for data, my understanding is that it would not have a way to say no and would not even be able to tell us about it. The best way to protect against that—to protect against temptation for James from Cheltenham, who is doing his job—is to make sure that remote access does not include anything that would be remotely useful to the security services. The way that we will be able to understand that is through transparency.
The Minister began his comments by saying that transparency and accountability were critical, and that maxim also applies here. We also want to protect against Ofcom’s own overreach and against any downstream use of that data. It is essential, therefore, that we understand in quite a lot of detail exactly what this remote access does and does not entail, so we can make our minds up about whether this piece is being used in an appropriate way.
I hope that the Minister can build on the assurances which he very helpfully started to give at the beginning of the debate about this information notice process. He said that there was nothing secret about the information notices. Again, I hope that we can reinforce that any platform which is concerned that the remote access it is being asked to provide is inappropriate can tell us all about it and, as the Minister said, challenge it. I hope also that individual complainants and the harshest critics of the Government and of the security services—and a lot of people in this world worry about these things who, when they read this debate or look at the amendment, will assume the worst—can see exactly what remote access has and has not been made available. Finally, I hope that, as individual users—because it is all about our privacy, as social media users of one form or another—we will know that, when we hand our data over to a social media service, which under the terms of this legislation is rightly required to give Ofcom access and keep us safe, that access entails exactly what we set out in the legislation and goes no further.
The transparency piece is critical. Can the Minister say that the information notices in relation to remote access will never be withheld? It is an utterly different world from the investigatory powers world, where there are good reasons why things have to be kept secret. If the Minister can say that in this world nothing is secret about the fact of remote access, and that anybody who has concerns can get the information they need to understand whether those concerns are genuine, or whether something much more benign is happening, that would be extremely helpful. The Minister mentioned judicial review by platforms. I get that but, if a platform feels that Ofcom, the regulator, is behaving in an overbearing way, I remain a little concerned that judicial review is quite a slow and painful process. As I understand it, it is more about whether it was legally correct to give the order than about whether the substance of the order was appropriate.
We still need to know that the checks and balances are in place. If Ofcom, under a future leadership—I am sure not under its current leadership—were to take it upon itself to want to set up a dashboard to look at what was happening in every social media company in real time, and were taken by that spirit of madness at some future date, I hope that the companies would be able to raise concerns about that, because it is not what we intend to happen in this Bill. I hope that they would be able to do that in a more straightforward process than in a lengthy judicial review.
The Minister has a clear idea of the kind of reassurances that we are looking for. He teased out some of them in his opening comments, and I hope that he can make them even more strongly in his closing remarks.
My Lords, the noble Lord, Lord Allan of Hallam, hinted at the fact that there has been a plethora of government amendments on Report and, to be honest, it has been quite hard fully to digest most of them, let alone scrutinise them. I appreciate that the vast majority have been drawn up with opposition Lords, who might have found it a bit easier. But some have snuck in and, in that context, I want to raise some problems with the amendments in this group, which are important. I, too, am especially worried about the government amendment on facilitating remote access to services and to the equipment used by services. I am really grateful to the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, for tabling Amendment 247B, because I did not know what to do—and they did it. At least it raises the issue to the level of it needing to be taken seriously.
The biggest problem that I had when I originally read this provision was that facilitating remote access to services, and to as yet undefined equipment used by a service, seems like a very big decision, and potentially disproportionate. It certainly has the potential for regulatory overreach, and it creates real risks around privacy. It feels as though the Government have not even flagged up strongly enough what it could mean.
I listened to what the Minister said, but I still do not fully understand why this is necessary. Have the Government considered the privacy and security implications that have already been discussed? Through Amendment 252A, the Government now have the power to enter premises for inspection—it rather feels as if there is the potential for raids, but I will put that to one side. They can go in, order an audit and so on. Remote access as a preliminary way to gather information seems heavy-handed. Why not leave it as the very last thing to do in a dialogue between Ofcom and a given platform? We have yet to hear a proper justification of why Ofcom would need this as a first-order thing to do.
The Bill does not define exactly what
“equipment used by the service”
means. Does it extend to employees’ laptops and phones? If it extends to external data centres, have the Government assessed the practicalities of that, and the substantial security implications that have been explained, for the services, the data centre providers and those of us whose data they hold?
I am also concerned that this will force companies to think very hard about internal privacy and security controls to deal with this possibility, and that this will place a disproportionate burden on smaller and mid-sized businesses that do not have the resources available to the biggest technology companies. I keep raising this because in other parts of government there is a constant attempt to say that the UK will be the centre of technological innovation and that we will be a powerhouse in new technologies, yet I am concerned that so much of the Bill could damage that innovation. That is worth considering.
It seems to me that Amendment 252A on the power to observe at the premises ignores decentralised projects and services—the very kind of services that can revolutionise social media in a positive way. Not every service is like Facebook, but this amendment misses that point. For example, you will not be able to walk into the premises of the UK-based Element, provider of an encrypted chat service built on the Matrix protocol, which allows users to host their own service. Similarly, the non-profit Mastodon claims to be the largest decentralised social network on the internet and to be built on open-web standards precisely because it does not want to be bought or owned by a billionaire. So many of these amendments seem not to take those issues into account.
I also have a slight concern on researcher access to data. When we discussed this in Committee, the tone was very much—as it is in these amendments now—that these special researchers need to be able to find out what is going on in these big, bad tech companies that are trying to hide away dangerous information from us. Although we are trying to ensure that there is protection from harms, we do not want to demonise the companies so much that, every time they raise privacy issues or say, “We will provide data but you can’t access it remotely” or “We want to be the ones deciding which researchers are allowed to look at our data”, we assume that they are always up to no good. That sends the wrong message if we are to be a tech-innovative country or if there is to be any working together.
My final point is to be a bit more positive. I am very keen on the points made by the Minister on the importance of transparency in algorithms, particularly in Amendments 196 and 199. This raises an important point. These amendments are intended to mean that providers of user-to-user services and search services would have to include in their transparency reports details about algorithms, so that we can see how they work; they particularly relate to illegal content and content that is harmful to children. I should like that to be understood more broadly, because for me there is constant tension where people do not know what the algorithms are doing. When content is removed, deboosted, or whatever, they do not know why. More transparency there would be positive.
The Minister knows this, because I have written to him on the subject, but many women, for example, are regularly being banned from social media for speaking out on sex-based rights, and gender-critical accounts are constantly telling me and are discussing among themselves that they have been shadow banned: that the algorithms are not allowing them to get their points over. This is, it is alleged, because of the legacy of trans activists controlling the algorithms.
Following on from the point of the noble Lord, Lord Allan of Hallam, there is always a danger here of people being conspiratorial and paranoid and thinking it is the algorithms. I made the point in an earlier discussion that sometimes you might just put up a boring post and no one is interested, but you imagine someone behind the scenes. But we know that Facebook continues to delete posts that state that men cannot be women, for example.
I would like this to be demystified, so the more Ofcom can ask the companies to demystify their algorithmic decisions and the more users can be empowered to know about it, the better for all of us. That is the positive bit of the amendments that I like.
My Lords, the business of the internet is data. Whether it is a retail business, a media business or any other kind of business, the internet is all about data. The chiefs of our internet companies know more about noble Lords than anyone else—more than any government agency, your doctor and almost anyone—because the number of data points that big internet companies have on people is absolutely enormous, and they use them to very great effect.
Some of those effects are entirely benign. I completely endorse what the noble Baroness, Lady Fox, said. As a champion of innovation and business, I totally recognise the good that is done by the world’s internet companies to make our lives richer, create jobs and improve the world, but some of what they do is not good. Either inadvertently or by being passive enablers of harm, internet companies have been responsible for huge societal harms. I do not want to go through the full list, but when I think about the mental health of our teenagers, the extremism in our politics, the availability of harmful information to terrorists and what have you, there is a long catalogue of harms to which internet companies have contributed. We would be naive if we did not recognise that.
However, almost uniquely among commercial businesses, internet companies guard access to that data incredibly jealously. They will not let you peek in and share their insights. I know from my experience in the health field, where we work very closely with the pharmaceutical industry, that there is a whole programme of pharmacovigilance in which any pharma company has to participate in order to explain, measure and justify the benefits and disbenefits of its medicines. We have no programme similar to pharmacovigilance for the tech industry. Instead, we are completely blind. Policy makers, the police and citizens are flying blind when it comes to the data that is held on us on both an individual and a demographic basis. That is extremely unusual.
That is why I really welcome my noble friend’s amendments that give Ofcom what seems to me to be extremely proportionate and thoughtful powers in order to look into this data, because without it, we do not know what is going on in this incredibly important part of our lives.
The role that researchers, including academic, civil society and campaigning researchers, play in helping Ofcom, policymakers and politicians to arrive at sensible, thoughtful and proportionate policy is absolutely critical. I pay enormous tribute to them; I am grateful to those noble Lords who have also done so. I am extremely grateful to my noble friend the Minister for his amendments on this subject, Amendments 272B and 272C, which address the question of giving researchers better access to some of this data. They would reduce the timeline for the review on data from 24 months to 18 months, which would be extremely helpful, and would change “may” to “must”, which represents an emphatic commitment to the outcome of this review.
However, issues remain around the question of granting access to data for researchers. What happens to the insights from the promised review once it is delivered? Where are the powers to deliver the review’s recommendations? That gap is not currently served by the government amendments, which is why I and the noble Lord, Lord Clement-Jones, have tabled Amendments 237ZA, 237DB, 262AA and 272AB. Their purpose is to put in the Bill reasonable, proportionate powers to bring access to data for researchers along the lines that the research review will recommend.
The feelings on this matter are extremely strong because we all recognise the value here. We are concerned that any delay may completely undermine this sector. As we debated in Committee, there is a substantial and valuable UK sector in this research area that is likely to move lock, stock and barrel to other countries where these kinds of powers may be in place; for instance, in EU or US legislation. The absence of these powers will, I think, leave Britain in the dark and competitively behind other countries, which is why I want to push the Minister hard on these amendments. I am grateful for his insight that this matter is something that the Government may look to in future Bills, but those Bills are far off. I would like to hear from him what more he could do to try to smooth the journey from this Bill and this review to any future legislation that comes through this House in order to ensure that this important gap is closed.
My Lords, Amendments 270 and 272 are in my name; I thank the noble Lord, Lord Stevenson of Balmacara, for adding his name to them. They are the least controversial amendments in this group, I think. They are really simple. Amendment 270 would require Ofcom’s research about online interests and users’ experiences of regulated services under Clause 143 to be broken down by nation, while Amendment 272 relates to Clause 147 and would require Ofcom’s transparency reports also to be broken down in a nation-specific way.
These amendments follow on from our debates on devolution in Committee. Both seek to ensure that there is analysis of users’ online experiences in the different nations of the UK, which I continue to believe is essential to ensuring that the Bill works for the whole of the UK and is both future-proofed—a word we have all used lots—and able to adapt to different developments across each of the four nations. I have three reasons why I think these things are important. The first concerns the interplay between reserved and devolved matters. The second concerns the legal differences that already exist across the UK. The third concerns the role of Ofcom.
In his much-appreciated email to me last week, the Minister rightly highlighted that internet services are a reserved matter and I absolutely do not wish to impose different standards of regulation across the UK. Regarding priority offences, I completely support the Government’s stance that service providers must treat any content as priority illegal content where it amounts to a criminal offence anywhere in the UK regardless of where that act may have taken place or where the user is. However, my amendments are not about regulation; they are about research and transparency reporting, enabling us to understand the experience across the UK and to collect data—which we have just heard, so powerfully, will be more important as we continue.
I am afraid that leaving it to Ofcom’s discretion to understand the differences in the online experiences across the four nations over time is not quite good enough. Many of the matters we are dealing with in the online safety space—such as children, justice, police and education—are devolved. Government policy-making in devolved areas will increasingly rely on data about online behaviours, harms and outcomes. These days, I cannot imagine creating any kind of public policy without understanding the online dimension. There are areas where either the community experience and/or the policy approach is markedly different across the nations—take drug abuse, for example. No data means uninformed policy-making or flying blind, as my noble friend Lord Bethell has just said. But how easy will it be for the devolved nations to get this information if we do not specify it in the Bill?
In many of the debates, we have already heard of the legal differences across the four nations, and I am extremely grateful to the noble and learned Lord, Lord Hope of Craighead, who is not in his place, the noble Lord, Lord Stevenson of Balmacara, and the Minister for supporting my amendment last week when I could not be here. I am terribly sorry. I was sitting next to the noble Viscount, Lord Camrose, at the time. The amendment was to ensure that there is a legal definition of “freedom of expression” in the Bill that can be understood by devolved Administrations across the UK.
The more I look at this landscape, the more challenges arise. The creation of legislation around intimate abuse images is a good example. The original English legislation was focused on addressing the abusive sharing of intimate images after a relationship breakdown. It required the sharing to have been committed with the intent to cause harm, which has a very easy defence: “I did not mean to cause any harm”. The Scottish legislation, drafted slightly later, softened this to an intent to cause harm or being reckless as to whether harm was caused, which is a bit better because you do not need to prove intent. Now the English version is going to be updated in the Bill to create an offence simply by sharing, which is even better.
Other differences in legislation have been highlighted, such as on deepfakes and upskirting. On the first day of Report, the noble Baroness, Lady Kennedy of The Shaws, highlighted a difference in the way cyberflashing offences are understood in Northern Ireland. So the issue is nuanced, and the Government’s responses change as we learn about harmful behaviours in practice. Over time, we gradually see these offences refined as we learn more about how technology is used to abuse in practice. The question really is: what will such offences look like online in five years’ time? Will the user experience and government policy across the four nations be the same? I will not pretend to try to answer that, but to answer it we will need the data.
I am concerned that the unintended consequences of the Bill in the devolved Administrations have not been fully appreciated or explored. Therefore, I am proposing a belt and braces approach in the reporting regime. When we come to post-legislative scrutiny, with reports being laid before this Parliament and the devolved Administrations in Edinburgh, Cardiff and Belfast—if there is one—we will want to have the data to understand the online experiences of each nation. That is why my very small amendments seek to ensure that we capture this experience, and that is why it is so important.
On Ofcom—my final point—I know the Minister has every confidence in Ofcom and rightly points out that it has a strong track record of producing data that is representative of people across the UK. I agree. Ofcom already does a great deal of research which is broken down into nation-specific reporting, particularly in broadcasting, but most of this is directly in relation to its obligations under the Communications Act and the BBC charter, which contains a specific purpose:
“To reflect, represent and serve the diverse communities of all of the United Kingdom’s nations and regions and, in doing so, support the creative economy across the United Kingdom”.
From my Scottish point of view, I am arguing—and I know that the Scottish advisory committee of Ofcom would agree with me—that Ofcom’s research and reports, such as the annual Media Nations report, are linked to the way legislation is set up and then implemented by Ofcom.
Having ensured this for broadcasting and communications, why would we not want to do this for online safety? At last Tuesday’s meeting of the Communications and Digital Committee, on which I serve, we took evidence on a huge range of subjects from my noble friend the chairman of Ofcom and Dame Melanie Dawes. The noble Lord, Lord Grade, used all the usual words to describe this Bill—“complex”, “challenging”—and pointed out that it is a new law in a novel area, but he stressed that Ofcom comes to decisions outside the political arena based on research and evidence.
My amendments just remind Ofcom that we need this research and evidence by nation. This Bill is so large, so wide-ranging, that Ofcom’s remit and functions are having to expand hugely to deliver this new regime. The noble Baroness, Lady Fox, reminded us last week that what Ofcom does comes from the legislation. It does not do things off its own bat. Ofcom already has a huge challenge on its hands with this Bill, and experience tells us that it is likely to deliver only what is specified—the “must do” bits, not the “nice to do” extras. There may be no differences in the online experiences across the nations of the UK, but the only way we can be sure is if we have the data for each nation, the transparency and all the research reporting. I urge the Minister to take my amendment seriously.
My Lords, I think that was a very good speech from the noble Baroness, partly because I signed her amendment and support it and also because I want to refer back to the points made earlier by the noble Lord, Lord Bethell, about research. I am speaking from the Back Benches here because some of what I say may not have been cleared fully with my colleagues, but I am hoping that they will indulge me slightly. If I speak from this elevated position, perhaps they will not hear me so well.
To deal with noble Lords in the order in which they spoke, I support the amendments tabled by the noble Lord, Lord Bethell, which seek a bit more activity in an area where we have seen a very welcome change of government policy—access by researchers to data—and I am very grateful to the Minister for that. The noble Lord, Lord Bethell, made the point that there is perhaps a bigger question and a bigger story than can be told simply by bringing forward the time of the report and changing “may” to “must”, although I always think “may” to “must” changes are important because they reflect a complete change of approach, and I hope action will follow. The question of access by those who need data in order to complete their research is crucial to the future success of these regimes. That plays back to what the noble Baroness, Lady Fraser, was saying: we need to have this not just in aggregate form but broken down and stratified, so that we can really interrogate where this information is showing the gaps, the opportunities, the changes that are needed and the successes, if there are any, in the way in which we are working.
I support the amendments tabled by the noble Lord, Lord Bethell, because I think this is not so much a question of regulation or lawmaking in this Bill but of trying to engender a change of culture about the way in which social media companies operate. It will need all of us, not just the Government or the regulatory bodies, to continue to press this because this is a major sea change in what they have been doing until now. They are quite rightly protective of their business interests and business secrets, but that is not the same when the currency is data and our data is being used to create change and opportunity and their profits are based on exploiting our resources.
I go back to the points made by the noble Lord, Lord Moylan, in his opening amendment today about why consumer rights do not apply when monetary considerations are not being taken into account. Bartering our data in order to obtain benefits from social media companies is not the same as purchasing over the counter at the local shop—we accept that—but times have changed and we are living in a different world. Everything is being bought and sold electronically. Why is consumer law not being moved forward to take account of that, so that the rights that matter here—because they are the same rights—are protected? I leave that for the Minister to come back to if he wishes to do so from the Dispatch Box.
Moving on to the Scottish issues, the amendment, as introduced by the noble Baroness, is about transparency and data, but I think it hides a bigger question which I am afraid affects much of the legislation that comes through this House: the devolution impact of changes in the law and of new laws that are brought forward is very often the last thing to be thought about and is tacked on at the end in ways that are often very obscure.
I have one particularly obscure question which I want to leave with the Minister, completely unreasonably, but I think it just about follows on from the amendment we are discussing. It is that, towards the end of the Bill, Clause 53(5)(c) refers to the consent of the Secretary of State or other Minister of the Crown to crimes in Scottish or Northern Irish legislation when they enter the Online Safety Bill regime. This is because, as has been made clear, laws are changing and are already different in Scotland, Wales and Northern Ireland from some of the criminal laws in England and Wales. While that is to be welcomed, as the noble Baroness said, the devolved Administrations should have the right to make sure, in the areas of their control, that they have the laws that are appropriate for the time, but if they are different, we are going to have to live with those across the country in a way that is a bit patchwork. There need to be rules about how they will apply. I think the noble Baroness said that it would be right and proper that a crime committed in one territory is treated within the rules that apply in that territory, but if they are significantly different, we ought at least to understand why that is the case and how that has come about.
As I understand it—I have a note provided by Carnegie UK and it is always pretty accurate about these matters—the Secretary of State can consent to a devolved authority which wants to bring forward a devolved offence and include it in the online safety regime. However, it is not quite clear how that happens. What is a consent? Is it an Order in Council, a regulation, affirmative or negative procedure or primary legislation? We are not told that; we are just told that consent arrangements apply and consent can be given. Normally consents involve legislative authority—in its words, one Parliament speaking to another—and we are all becoming quite aware of the fact that the legislative consent required from Scotland, Northern Ireland or Wales is often not given, yet the UK Parliament continues to make legislation and it applies, so the process works, but obviously it would be much better if the devolved structures were involved and agreed to what was being done. This is different from the normal top-down approach. Where we already have a change in the law or the law is about to be changed in one of the devolved Administrations, how does that become part of the Online Safety Bill regime? I look forward to the Minister’s response. I did not warn him that I was giving him a very difficult question, and he can write if he cannot give the detail today, but we would like to see on the record how this happens.
If we are getting Statements to Parliament from the Secretary of State about provisional changes to the way in which the law applies in the devolved Administrations, are they going to be subject to due process? Will there be engagement with committees? What will happen if a new code is required or a variation in the code is required? Does that require secondary legislation and, if so, will that be done with the consent of the devolved Administration or by this Parliament after a process we are yet to see?
There is a lot here that has not been fleshed out. There are few very easy answers, but it would be useful if we could get that going. I will not go into more detail on the noble Baroness’s point that laws change, but I know that the Law Society of Scotland has briefed that at least one major piece of legislation, the Hate Crime and Public Order (Scotland) Act 2021, does not appear in Schedule 7 as expected. Again, I ask the Minister if he would write to us explaining the background to that.
These are very important issues and they do not often get discussed in the full process of our Bills, so I am glad that the noble Baroness raised them. She cloaked them in what sounded like a very general and modest request, but they reveal quite considerable difficulties behind them.
My Lords, before I talk to the amendments I had intended to address, I will make a very narrow point in support of the noble Baroness, Lady Fraser. About 10 years ago, when I started doing work on children, I approached Ofcom and asked why all its research goes up to age 24, when childhood finishes at 18 and the UNCRC says that a child needs special consideration. Ofcom said, “Terribly sorry, but this is our inheritance from a marketing background”. The Communications and Digital Committee later wrote formally to Ofcom and asked if it could do its research up to 18 and then from 18 to 24, but it appeared to be absolutely impossible. I regret that I do not know what the current situation is and I hope that, with the noble Lord, Lord Grade, in place, it may rapidly change overnight. My point is that the detailed description that the noble Baroness gave the House about why it is important to stipulate this is proven by that tale.
I also associate myself with the remarks of the noble Lord, Lord Allan, who terrified me some 50 minutes ago. I look forward to hearing what will be said.
I in fact rose to speak to government Amendments 196 and 199, and the group of amendments on access to data for researchers. I welcome the government amendments to which I added my name. I really am delighted every time the Government inch forward into the area of the transparency of systemic and design matters. The focus of the Bill should always be on the key factor that separates digital media from other forms of media, which is the power to determine, manipulate and orchestrate what a user does next, and to see how they behave and what they think. That is very different and is unique to the technology we are talking about.
It will not surprise the Minister to hear that I would have liked this amendment to cover the design of systems and processes, and features and functionalities that are not related to content. Rather than labouring this point, on this occasion I will just draw the Minister’s attention to an article published over the weekend by Professor Henrietta Bowden-Jones, the UK’s foremost expert on gambling and gaming addiction. She equates the systems and processes involved in priming behaviours on social media with the more extreme behaviours that she sees in her addiction clinics, with ever younger children. Professor Bowden-Jones is the spokesperson on behavioural addictions for the Royal College of Psychiatrists, and the House ignores her experience of the loops of reward and compulsion that manipulate behaviour, particularly the behaviour of children, at our peril.
I commend the noble Lord, Lord Bethell, for continuing to press the research issue and coming back, even in the light of the government amendment, with a little more. Access to good data about the operation of social media is vital in holding regulated companies to account, tracking the extent of harms, building an understanding of them and, importantly, building knowledge about how they might be sensibly and effectively addressed.
My concern here is that, when making a concession, the Government most often reach for a review at some time in the future. In the case of research, the future is too late. We are at an inflection point right now, at which digital tech may or may not overwhelm our job market and our understanding of what is real and what is not. It has the potential for societal and technological change that is both beneficial and harmful, but at such a scale that it will certainly transform society as we understand it before 18 months or two years—the point at which the review is triggered and then takes place.
I feel passionately that, in the context of where we are now and the game of catch-up we have been playing for the last couple of decades, it should not be left to the companies to decide what is or is not in the public arena. As a minimum, independent research would allow the regulator to better understand the operation of social media platforms. More broadly, it would keep our universities on a level playing field—as a number of noble Lords have commented—and, maybe most importantly, ensure that the regulator, academia and civil society have a seat at the table of the future of tech.
For that reason, I again ask the Government, as a minimum, to accept the shorter date that was proposed or perhaps to think again before Third Reading.
My Lords, I associate myself with my noble friend Lady Fraser of Craigmaddie’s incredibly well-made points. I learned a long time ago that, when people speak very softly and say they have a very small point to make, they are often about to deliver a zinger. She really did; it was hugely powerful. I will say no more than that I wholeheartedly agree with her; thank you for helping us to understand the issue properly.
I will speak in more detail about access to data for researchers and in support of my noble friend Lord Bethell’s amendments. I too am extremely grateful to the Minister for bringing forward all the government amendments; the direction of travel is encouraging. I am particularly pleased to see the movement from “may” to “must”, but I am worried that it is Ofcom’s rather than the regulated services’ “may” that moves to “must”. There is no backstop for recalcitrant regulated services that refuse to abide by Ofcom’s guidance. As the noble Baroness, Lady Kidron, said, in other areas of the Bill we have quite reasonably resorted to launching a review, requiring Ofcom to publish its results, requiring the Secretary of State to review the recommendations and then giving the Secretary of State backstop powers, if necessary, to implement regulations that would then require regulated companies to change.
I have a simple question for the Minister: why are we not following the same recipe here? Why does this differ from the other issues, on which the House agrees that there is more work to be done? Why are we not putting backstop powers into the Bill for this specific issue, when it is clear to all of us that it is highly likely that there will be said recalcitrant regulated firms that are not willing to grant access to their data for researchers?
Before my noble friend the Minister leaps to the hint he gave in his opening remarks—that this should all be picked up in the Data Protection and Digital Information Bill—I note that, unlike the group we have just discussed, this issue was discussed at Second Reading and given a really detailed airing in Committee. This is not new news, any more than the other issues on which we have adopted the same recipe, including a backstop, in this Bill. I urge my noble friend the Minister to follow the good progress so far and to complete the package, as we have in other areas.
My Lords, it is valuable to be able to speak immediately after my noble friend Lady Harding of Winscombe, because it gives me an opportunity to address some remarks she made last Wednesday when we were considering the Bill on Report. She suggested that there was a fundamental disagreement between us about our view of how serious online safety is—the suggestion being that somehow I did not think it was terribly important. I take this opportunity to rebut that and to add to it by saying that other things are also important. One of those things is privacy. We have not discussed privacy in relation to the Bill quite as much as we have freedom of expression, but it is tremendously important too.
Government Amendment 247A represents the most astonishing level of intrusion. In fact, I find it very hard to see how the Government think they can get away with saying that it is compatible with the provisions of the European Convention on Human Rights, which we incorporated into law some 20 years ago, thus creating a whole law of privacy that is now vindicated in the courts. It is not enough just to go around saying that it is “proportionate and necessary” as a mantra; it has to be true.
This provision says that an agency has the right to go into a private business with no warrant, and with no let or hindrance, and is able to look at its processes, data and equipment at will. I know of no other business that can be subjected to that without a warrant or some legal process in advance pertinent to that instance, that case or that business.
My noble friend Lord Bethell said that the internet has been abused by people who carry out evil things; he mentioned terrorism, for example, and he could have mentioned others. However, take mobile telephones and Royal Mail—these are also abused by people conducting terrorism, but we do not allow those communications to be intruded into without some sort of warrant or process. It does not seem to me that the fact that the systems can be abused is sufficient to justify what is being proposed.
My noble friend the Minister says that this can happen only offline. Frankly, I did not understand what he meant by that. In fact, I was going to say that I disagreed with him, but I am moving to the point of saying that I think it is almost meaningless to say that it is going to happen offline. He might be able to explain that. He also said that Ofcom will not see individual traffic. However, neither the point about being offline nor the point about not seeing individual traffic is on the face of the Bill.
When we ask ourselves what the purpose of this astonishing power is—this was referred to obliquely to some extent by the noble Baroness, Lady Fox of Buckley—we can find it in Clause 91(1), to which proposed new subsection (2A) is being added, squeezed in subordinate to it. Clause 91(1) talks about
“any information that they”—
that is, Ofcom—
“require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions”.
The power could be used entirely as a fishing expedition. It could be entirely for the purpose of educating Ofcom as to what it should be doing. There is nothing here to say that it can have these powers of intrusion only if it suspects that there is criminality, a breach of the codes of conduct or any other offence. It is a fishing expedition, entirely for the purpose of
“exercising, or deciding whether to exercise”.
Those are the intrusions imposed upon companies. In some ways, I am less concerned about the companies than I am about what I am going to come to next: the intrusion on the privacy of individuals and users. If we sat back and listened to ourselves and what we are saying, could we explain to ordinary people—we are going to come to this when we discuss end-to-end encryption—what exactly can happen?
Two very significant breaches of the protections in place for privacy on the internet arise from what is proposed. First, if you allow someone into a system and into equipment, especially from outside, you increase the risk and the possibility that a further, probably more hostile party that is sufficiently well-equipped with resources—we know state actors with evil intent which are so equipped—can get in through that or similar holes. The privacy of the system itself would be structurally weakened as a result of doing this. Secondly, if Ofcom is able to see what is going on, the system becomes leaky in the direction of Ofcom. It can come into possession of information, some of which could be of an individual character. My noble friend says that it will not be allowed to release any data and that all sorts of protections are in place. We know that, and I fully accept the honesty and integrity of Ofcom as an institution and of its staff. However, we also know that things get leaked and escape. As a result of this provision, very large holes are being built into the protections of privacy that exist, yet there has been no reference at all to privacy in the remarks made so far by my noble friend.
I finish by saying that we are racing ahead and not thinking. Good Lord, my modest amendment in the last group to bring a well-established piece of legislation—the Consumer Rights Act—to bear upon this Bill was challenged on the grounds that there had not been an impact assessment. Where is the impact assessment for this? Where is even the smell test for this in relation to explaining it to the public? If my noble friend is able to expatiate at the end on the implications for privacy and attempt to give us some assurance, that would be some consolation. I doubt that he is going to give way and do the right thing and withdraw this amendment.
My Lords, the debate so far has been—in the words of the noble Baroness, Lady Fox—a Committee debate. That is partly because this set of amendments from the Government has come quite late. If they had been tabled in Committee, I think we would have had a more expansive debate on this issue and could have knocked it about a bit and come back to it on Report. The timing is regrettable in all of this.
That said, the Government have tabled some extremely important amendments, particularly Amendments 196 and 198, which deal with things such as algorithms and functionalities. I very much welcome those important amendments, as I know the noble Baroness, Lady Kidron, did.
I also very much support Amendments 270 and 272 in the name of the noble Baroness, Lady Fraser. I hope the Minister, having been pre-primed, has all the answers to them. It is astonishing that, after all these years, we are so unattuned to the issues of the devolved Administrations and that we are still not in the mindset on things such as research. We are not sufficiently granular, as has been explained—let alone all the other questions that the noble Lord, Lord Stevenson, asked. I hope the Minister can unpack some of that as well.
I want to express some gratitude, too, because the Minister and his officials took the trouble to give us a briefing about remote access issues, alongside Ofcom. Ofcom also sent through its note on algorithmic assessment powers, so an effort has been made to explain some of these powers. Indeed, I can see the practical importance, as explained to us. It is partly the lateness, however, that sets off what my noble friend Lord Allan called “trigger words” and concerns about the remote access provisions. Indeed, I think we have a living and breathing demonstration of the impact of triggers on the noble Lord, Lord Moylan, because these are indeed issues that concern those outside the House to quite a large degree.
I really think the Minister will have to take us through the safeguards again, and there are some he has already mentioned: the limits to the information that can be viewed, and the fact that this is done offline in operation but is limited to functionalities rather than content. He did mention that it was privacy-protecting, that it was justifiable only where proportionate, that the fact that Ofcom is accessing remotely can be made public, that it is challengeable, and that Ofcom cannot disclose information obtained. These are non-trivial powers which require a non-trivial response and a great deal more explanation, particularly, as my noble friend said, on how they differ from those in the Investigatory Powers Act; otherwise, I think concerns will continue. That is the reason for my noble friend’s Amendment 247B, which attempts to put further safeguards in place.
Amendments 237ZA, 272AB and 262AA, tabled by the noble Lord, Lord Bethell, and spoken to by the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Stevenson, are really important. Of course, we welcome the tweak to Clause 148 that the Minister has introduced today but, as all noble Lords have said, this does not take us far enough. At the risk of boring the House to death, as usual, I will refer to the original Joint Committee report again. We keep returning to this because it does set out a great deal of sense.
There are three areas I want to mention. The Joint Committee said at the time:
“We heard from Dr Amy Orben, College Research Fellow at Emmanuel College, University of Cambridge, that lack of access to data is ‘making it impossible for good quality and independent scientific studies to be completed on topics such as online harms, mental health, or misinformation’”.
In another paragraph, the report says:
“We heard there is evidence that social media usage can cause psychological harm to children, but that platforms prevent research in this area from being conducted or circulated”.
Then we actually heard from Meta. It said:
“One of the things that is a particular challenge in the area of research is how we can provide academics who are doing independent research with access to data really to study these things more deeply”.
On all sides, there is a need for teeth when it comes to access for independent researchers.
Our recommendation was very clear; it was not just a recommendation that this should be reviewed but that there should be these powers. Obviously, the Minister has agreed to a compromise, in terms of a report on access, but he really should go further and take default powers, so that the Government can institute the access that is required for researchers as a result of that report. So far, with the amendments today, the Minister is not strengthening the Bill, because there is no way for Ofcom to compel compliance with the guidance and no incentive for companies to comply with any guidance that is produced.
I very much hope that the Minister will take on board what the noble Lord, Lord Bethell, had to say, in a very eloquent way. If he cannot do it here and now on Report, I very much hope that he will come back with a proposal at Third Reading. As the noble Baroness, Lady Harding, said, we have done this in virtually every other case where there is a report. As we have seen, the Minister has agreed to have a review or a report, and then the backstop powers are in place. That is not the case with this, and it should be.
My Lords, I just want to reinforce what my noble friend Lord Bethell said about the amendments to which I have also put my name: Amendments 237ZA, 266AA and 272E. I was originally of the view that it was enough to give Ofcom the powers to enforce its own rulings. I have been persuaded that, pace my noble friend Lord Grade, the powers that have been given to Ofcom represent such a huge expansion that the likelihood of the regulator doing anything other than those things which it is obliged to do is rather remote. So I come to the conclusion that an obligation is the right way to put these things. I also agree with what has been said about the need to ensure that subsequent action is taken in relation to a regulated service if it does not follow what Ofcom has set out.
I will also say a word about researchers. They are a resource that already exists. Indeed, there has been quite a lot of pushing, not least by me, on using this resource, first to update the powers of the Computer Misuse Act and, secondly, to enlarge our understanding of, and ability to have information about, the operation of online services. It is therefore a welcome move on the part of the Government that they see the value of researchers in this context.
My noble friend Lord Moylan made a good point that the terms under which this function is exercised have to have regard to privacy as well as to transparency of operations. This is probably one of the reasons why we have not seen movement on updating the Computer Misuse Act: it is intrinsically quite a difficult issue. But I believe that it has to be tackled, and I hope very much that the Government will not delay in bringing forward the necessary legislation that will ensure both that researchers are protected in the exercise of this function, which has been one of the issues, and that they are enabled to do something worth while. So I believe the Minister when he says that the Government may need to bring forward extra legislation on this; it is almost certainly the case. I hope very much that there will not be a great gap, so that this part of the proposals does not fail to come into effect.
My Lords, we have had an important debate on a range of amendments to the Bill. There are some very important and good ones, to which I would say: “Better late than never”. I probably would not say that to Amendment 247A; I would maybe say “better never”, but we will come on to that. It is interesting that some of this has come to light following the debate on and scrutiny of the Digital Markets, Competition and Consumers Bill in another place. That might reinforce the need for post-legislative review of how this Bill, the competition Bill and the data Bill are working together in practice. Maybe we will need another Joint Committee, which will please the noble Lord, Lord Clement-Jones, no end.
There are many government amendments. The terms of service and takedown policy ones have been signed by my noble friend Lord Stevenson, and we support them. There are amendments requiring information on algorithms in transparency reports; requiring search services to put into transparency reports how policies on illegal content and content that is harmful to children were arrived at; requiring information about search algorithms; and allowing physical access in an audit to view the operation of algorithms and other systems. Like the noble Baroness, Lady Kidron, I very much welcome, in this section anyway, that focus on systems, algorithms and process rather than solely on content.
However, Amendment 247A is problematic in respect of the trigger words, as the noble Lord, Lord Allan, referred to them, of remote access and requiring a demonstration gathering real-time data. That raises a number of, as he said, non-trivial questions. I shall relay what some service providers have been saying to me. The Bill already provides Ofcom with equivalent powers under Schedule 12—such as rights of entry and inspection and extensive auditing powers—under which it could require providers to operate any equipment or algorithms to produce information for Ofcom and/or allow Ofcom to observe the functioning of the regulated service. Crucially, safeguards are built into the provisions in Schedule 12 to ensure that Ofcom exercises them only in circumstances where the service provider is thought to be in breach of its duties and/or under a warrant, which has to have judicial approval, yet there appear to be no equivalent safeguards in relation to this power. I wonder whether, as it has come relatively late, that is an oversight that the Minister might want to address at Third Reading.
The policy intent, as I understand it, is to give Ofcom remote access to algorithms to ensure that service providers located out of the jurisdiction are not out of scope of Ofcom’s powers. Could that have been achieved by small drafting amendments to Schedule 12? In that case, the whole set of safeguards that we are concerned about would be in place because, so to speak, they would be in the right place. As drafted, the amendment appears to be an extension of Ofcom’s information-gathering powers that can be exercised as a first step against a service provider or access facility without any evidence that the service is in breach of its obligations or that any kind of enforcement action is necessary, which would be disproportionate and oppressive.
Given the weight of industry concern about the proportionality of these powers and their late addition, I urge the Minister to look at the addition of further safeguards around the use of these powers in the Bill and further clarification on the scope of the amendment as a power of escalation, including that it should be exercised as a measure of last resort, and only in circumstances where a service provider has not complied with its duty under the Bill or where the service provider has refused to comply with a prior information notice.
Amendment 247B is welcome because it gives the Minister the opportunity to tell us now that he wants to reflect on all this before Third Reading, work with us and, if necessary, come back with a tightening of the language and a resolution of these issues. I know his motivation is not to cause a problem late on in the Bill but he has a problem, and if he could reflect on it and come back at Third Reading then that would be helpful.
I welcome the amendments tabled by the noble Lord, Lord Bethell, on researcher access. This is another area where he has gone to great efforts to engage across the House with concerned parties, and we are grateful to him for doing so. Independent research is vital for us to understand how this new regime that we are creating is working. As he says, it is a UK strength, and we should play to that strength and not let it slip away inadvertently. We will not get the regime right first time, and we should not trust the platforms to tell us. We need access to independent researchers, and the amendments strike a good balance.
We look forward to the Minister deploying his listening ear, particularly to what the noble Baroness, Lady Harding, had to say on backstop powers. When he said in his opening speech that he would reflect, is he keeping open the option of reflecting and coming back at Third Reading, or is he reflecting only on the possibility of coming back in other legislation?
The noble Baroness, Lady Fraser, raised an important issue for the UK regulator, ensuring that it is listening to potential differences in public opinion in the four nations of our union and, similarly, analysing transparency reports. As she says, this is not about reserved matters but about respecting the individual nations and listening to their different voices. It may well be written into the work of Ofcom by design but we cannot assume that. We look forward to the Minister’s response, including on the questions from my noble friend on the consent process for the devolved Administrations to add offences to the regime.
My Lords, I am grateful to noble Lords for their contributions in this group. On the point made by the noble Lord, Lord Knight of Weymouth, on why we are bringing in some of these powers now, I say that the power to direct and observe algorithms was previously implicit within Ofcom’s information powers and, where a provider has UK premises, under powers of entry, inspection and audit under Schedule 12. However, the Digital Markets, Competition and Consumers Bill, which is set to confer similar powers on the Competition and Markets Authority and its digital markets unit, makes these powers explicit. We wanted to ensure that there was no ambiguity over whether Ofcom had equivalent powers in the light of that. Furthermore, the changes we are making ensure that Ofcom can direct and observe algorithmic assessments even if a provider does not have relevant premises or equipment in the UK.
I am grateful to the noble Lord, Lord Allan of Hallam, for inviting me to re-emphasise points and allay the concerns that have been triggered, as his noble friend Lord Clement-Jones put it. I am happy to set out again a bit of what I said in opening this debate. The powers will be subject to a number of safeguards. First, they are limited to “viewing information”. They can be used only where they are proportionate in the exercise of Ofcom’s functions, and a provider would have the right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was done unlawfully. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
These are not secret powers, as the noble Lord rightly noted. The Bill contains no restriction on services making the existence and detail of the information notice public. If a regulated service wished to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. I also mentioned the recourse that people have through existing legislation, such as the Freedom of Information Act, to give them safeguards, noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information that it has obtained through its exercise of these powers without the provider’s consent unless that is permitted for specific, defined purposes.
The noble Lord’s Amendment 247B seeks to place further safeguards on Ofcom’s use of its new power to access providers’ systems remotely to observe tests. While I largely agree with the intention behind it, there are already a number of safeguards in place for the use of that power, including in relation to data protection, legally privileged material and the disclosure of information, as I have outlined. Ofcom will not be able to gain remote access simply for exploratory or fishing purposes, and indeed Ofcom expects to have conversations with services about how to provide the information requested.
Furthermore, before exercising the power, Ofcom will be required to issue an information notice specifying the information to be provided, setting out the parameters of access and why Ofcom requires the information, among other things. Following receipt of an information notice, a notice requiring an inspection, or an audit notice, if a company has identified an obvious security risk in Ofcom exercising the power as set out in the notice, it may not be proportionate for Ofcom to do so. As set out in Ofcom’s duties, Ofcom must have regard to the principles under which regulatory activities should be proportionate and targeted only at cases where action is needed.
In line with current practice, we anticipate Ofcom will issue information notice requests in draft form to identify and address any issues, including in relation to security, before the information notice is issued formally. Ofcom will have a legal duty to exercise its remote access powers in a way that is proportionate, ensuring that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information, and whether there was a less onerous method of obtaining the necessary information to ensure that the use of this power is proportionate. As I said, the remote access power is limited to “viewing information”. Under this power, Ofcom will be unable to interfere or access the service for any other purpose.
In practice, Ofcom will work with services during the process. It is required to specify, among other things, the information to be provided, which will set the parameters of its access, and why it requires the information, which will explain the link between the information it seeks and the online safety function that it is exercising or deciding whether to exercise.
As noble Lords know, Ofcom must comply with the UK’s data protection law. As we have discussed in relation to other issues, it is required to act compatibly with the European Convention on Human Rights, including Article 8 privacy rights. In addition, under Clause 91(7), Ofcom is explicitly prohibited from requiring the provision of legally privileged information. It will also be under a legal obligation to ensure that the information gathered from services is protected from disclosure unless clearly defined exemptions apply, such as those under Section 393(2) of the Communications Act 2003—for example, the carrying out of any of Ofcom’s functions. I hope that provides reassurance to the noble Lord, Lord Allan, and the noble Baroness, Lady Fox, who raised these questions.
I am grateful to the Minister. That was helpful, particularly the description of the process and the fact that drafts have to be issued early on. However, it still leaves open a couple of questions, one of which was very helpfully raised by the noble Lord, Lord Knight. We have in Schedule 12 this other set of protections that could be applied. There is a genuine question as to why this has been put in this place and not there.
The second question is to dig a little more into the question of what happens when there is a dispute. The noble Lord, Lord Moylan, pointed out that if you have created a backdoor then you have created a backdoor, and it is dangerous. If we end up in a situation where a company believes that what it is being asked to do by Ofcom is fundamentally problematic and would create a security risk, it will not be good enough to open up the backdoor and then have a judicial review. It needs to be able to say no at that stage, yet the Bill says that it could be committing a serious criminal offence by failing to comply with an information notice. We want some more assurances, in some form, about what would happen in a scenario where a company genuinely and sincerely believes that what Ofcom is asking for is inappropriate and/or dangerous and it wants not to have to offer it unless and until its challenge has been looked at, rather than having to offer it and then later judicially review a decision. The damage would already have been done by opening up an inappropriate backdoor.
A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.
And on Schedule 12?
I will write on Schedule 12 as well.
Before the Minister sits down—to echo the way the Minister has operated throughout Report—there is consensus across the House that there are some concerns. The reason why there are concerns outside and inside the House on this particular amendment is that it is not entirely clear that those protections exist. I ask the Minister whether, rather than just writing, it would be possible to take this back to the department, table a late amendment and say, “Look again”. That has been done before. It is certainly not too late: if it was not too late to have this amendment then it is certainly not too late to take it away again and to adopt another amendment that gives some safeguarding. Seriously, it is worth looking again.
I had not quite finished; the noble Baroness was quick to catch me before I sat down. I still have some way to go, but I will certainly take on board all the points that have been made on this group.
The noble Lord, Lord Knight, asked about Schedule 12. I will happily write with further information on that, but Schedule 12 is about UK premises, so it is probably not the appropriate place to deal with this, as we need to be able to access services in other countries. If there is a serious security risk then it would not necessarily be proportionate. I will write to him with further details.
I am grateful to the Minister for giving way so quickly. I think the House is asking him to indicate now that he will go away and look at this issue, perhaps with some of us, and that, if necessary, he would be willing to look at coming back with something at Third Reading. From my understanding of the Companion, I think he needs to say words to that effect to allow him to do so, if that is what he subsequently wants to do at Third Reading.
I am very happy to discuss this further with noble Lords, but I will reserve the right, pending that discussion, to decide whether we need to return to this at Third Reading.
Amendments 270 and 272, tabled by my noble friend Lady Fraser of Craigmaddie, to whom I am very grateful for her careful scrutiny of the devolved aspects of the Bill, seek to require Ofcom to include separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in the research about users’ experiences of regulated services and in Ofcom’s transparency reports. While I am sympathetic to her intention—we have corresponded on it, for which I am grateful—it is important that Ofcom has and retains the discretion to prioritise information requests that will best shed light on the experience of users across the UK.
My noble friend and other noble Lords should be reassured that Ofcom has a strong track record of using this discretion to produce data which are representative of people across the whole United Kingdom. Ofcom is committed to reflecting the online experiences of users across the UK and intends, wherever possible, to publish data at a national level. When conducting research, Ofcom seeks to gather views from a representative sample of the United Kingdom and to set quotas that ensure an analysable sample within each of the home nations.
It is also worth noting the provisions in the Communications Act 2003 that require Ofcom to operate offices in each of the nations of the UK, to maintain advisory committees for each, and to ensure their representation on its various boards and panels—and, indeed, on the point raised by the noble Baroness, Lady Kidron, to capture the experiences of children and users of all ages. While we must give Ofcom the discretion it needs to ensure that the framework is flexible and remains future-proofed, I hope that I have reassured my noble friend that her point will indeed be captured, reported on and be able to be scrutinised, not just in this House but across the UK.
I am grateful to the Minister for giving way. My premise is that the reason Ofcom reports in a nation-specific way in broadcasting and in communications is that there is a high-level reference in both the Communications Act 2003 and the BBC charter requiring it to do so, which feeds down into national quotas and so on. There is currently nothing of that equivalence in the Online Safety Bill. Therefore, we are relying on Ofcom’s discretion, whereas in the broadcasting and communications area we have a high-level reference insisting that there is a breakdown by nation.
We think we can rely on Ofcom’s discretion, and point to its current practice. I hope that will reassure my noble friend that it will set out the information she seeks.
I was about to say that I am very happy to write to the noble Lord, Lord Stevenson, about the manner by which consent is given in Clause 53(5)(c), but I think his question is on something else.
I would be grateful if the Minister could repeat that immediately afterwards, when I will listen much harder.
Just to echo what the noble Baroness was saying, may we take it as an expectation that approaches that are signalled in legislation for broadcasting and communications should apply pari passu to the work of Ofcom in relation to the devolved Administrations?
Yes, and we can point to the current actions of Ofcom to show that it is indeed doing this already, even without that legislative stick.
I turn to the amendments in the name of my noble friend Lord Bethell and the noble Lord, Lord Clement-Jones, on researchers’ access to data. Amendment 237ZA would confer on the Secretary of State a power to make provisions about access to information by researchers. As my noble friend knows, we are sympathetic to the importance of this issue, which is why we have tabled our own amendments in relation to it. However, as my noble friend also knows, this is such a complex and sensitive area that we think it is premature to endow the Secretary of State with such broad powers to introduce a new framework. As we touched on in Committee, this is a complex and still nascent area, which is why it is different from the other areas to which the noble Lord, Lord Clement-Jones, pointed in his contribution.
The noble Baroness, Lady Harding, made the point that in other areas where the Minister has agreed to reviews or reports, there are backstop powers; for instance, on app stores. Of course, that was a negotiated settlement, so to speak, but why can the Minister not accede to that in the case of access for researchers, as he has with app stores? Indeed, there is one other example that escapes me, which the Minister has also agreed to.
We touched on the complexity of defining who and what is a researcher and making sure that we do not give rise to bad actors exploiting that. This is a complex area, as we touched on in Committee. As I say, the evidence base here is nascent. It is important first to focus on developing our understanding of the issues to ensure that any power or legislation is fit to address those challenges. Ofcom’s report will not only highlight how platforms can share data with researchers safely but will provide the evidence base for considering any future policy approaches, which we have committed to doing but which I think the noble Lord will agree are worthy of further debate and reflection in Parliament.
The benefit of having a period of time between the last day of Report on Wednesday and Third Reading is that that gives the Minister, the Bill team and parliamentary counsel the time to reflect on the kind of power that could be devised. The wording could be devised, and I would have thought that six weeks would be quite adequate for that, perhaps in a general way. After all, this is not a power that is immediately going to be used; it is a general power that could be brought into effect by regulation. Surely it is not beyond the wit to devise something suitable.
Before the Minister stands up, I also wondered—
Oh!
Sit down or stand up—I cannot remember.
I wonder whether the department has looked at the DSA and other situations where this is being worked out. I recognise that it takes a period of time, but it is not without some precedent that a pathway should be described.
We do not think that six weeks is enough time for the evidence base to develop sufficiently, our assessment being that to endow the Secretary of State with that power at this point is premature.
Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.
Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.
Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.
On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.
Just on the timescale, one of the issues that we talked about in Committee was the fact that there needs to be some kind of mechanism created, with a code of practice with reference to data protection law and an approving body to approve researchers as suitable to take information; the noble Baroness, Lady Kidron, referred to the DSA process, which the European Union has been working on. I hope the Minister can confirm that Ofcom might get moving on establishing that. It is not dependent on there being a report in 18 months; in fact, you need to have it in place when you report in 18 months, which means you need to start building it now. I hope the Minister would want Ofcom, within its existing framework, to be encouraging the creation of that researcher approval body and code of practice, not waiting to start that process in 18 months’ time.
I will continue my train of thought on my noble friend’s amendments, which I hope will cover that and more.
My noble friend’s Amendment 273A would allow Ofcom to appoint approved independent researchers to access information. Again, given the nascent evidence base here, it is important to focus on understanding these issues before we commit to a researcher access framework.
Under the skilled persons provisions, Ofcom will already have the powers to appoint a skilled person to assess compliance with the regulatory framework; that includes the ability to leverage the expertise of independent researchers. My noble friend’s Amendment 273B would require Ofcom to produce a code of practice on access to data by researchers. The government amendments I spoke to earlier will require Ofcom to produce guidance on that issue, which will help to promote information sharing in a safe and secure way.
To the question asked by the noble Lord, Lord Allan: yes, Ofcom can start the process and do it quickly. The question here is really about the timeframe in which it does so. As I said in opening, we understand the calls for further action in this area.
I am happy to say to my noble friend Lord Bethell, to whom we are grateful for his work on this and the conversations we have had, that we will explore the issue further and report back on whether further measures to support researchers’ access to data are required and, if so, whether they can be implemented through other legislation, such as the Data Protection and Digital Information (No.2) Bill.
Before the Minister sits down—he has been extremely generous in taking interventions—I want to put on record my understanding of his slightly ambiguous response to Amendment 247A, so that he can correct it if I have got it wrong. My understanding is that he has agreed to go away and reflect on the amendment and that he will have discussions with us about it. Only if he then believes that it is helpful to bring forward an amendment at Third Reading will he do so.
Yes, but I do not want to raise the hopes of the noble Lord or others, with whom I look forward to discussing this matter. I must manage their expectations about whether we will bring anything forward. With that, I beg to move.
Amendment 187 agreed.
Amendment 188 not moved.
Clause 67: Interpretation of this Chapter
Amendment 189
Moved by
189: Clause 67, page 64, line 15, leave out from “65(9),” to “and” in line 16 and insert “indicates (in whatever words) that the presence of content of that kind is prohibited on the service or that users’ access to content of that kind is restricted,”
Member’s explanatory statement
This amendment makes a change to the definition of “relevant content” which applies for the purposes of Chapter 3 of Part 4 of the Bill (transparency of terms of service etc). The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
Amendment 189 agreed.
Amendment 190
Moved by
190: After Clause 67, insert the following new Clause—
“CHAPTER 3A
DECEASED CHILD USERS
Disclosure of information about use of service by deceased child users
(1) A provider of a relevant service must make it clear in the terms of service what their policy is about dealing with requests from parents of a deceased child for information about the child’s use of the service.
(2) A provider of a relevant service must have a dedicated helpline or section of the service, or some similar means, by which parents can easily find out what they need to do to obtain information and updates in those circumstances, and the terms of service must provide details.
(3) A provider of a relevant service must include clear and accessible provisions in the terms of service—
(a) specifying the procedure for parents of a deceased child to request information about the child’s use of the service,
(b) specifying what evidence (if any) the provider will require about the parent’s identity or relationship to the child, and
(c) giving sufficient detail to enable child users and their parents to be reasonably certain about what kinds of information would be disclosed and how information would be disclosed.
(4) A provider of a relevant service must respond in a timely manner to requests from parents of a deceased child for information about the child’s use of the service or for updates about the progress of such information requests.
(5) A provider of a relevant service must operate a complaints procedure in relation to the service that—
(a) allows for complaints to be made by parents of a deceased child who consider that the provider is not complying with a duty set out in any of subsections (1) to (4),
(b) provides for appropriate action to be taken by the provider of the service in response to such complaints, and
(c) is easy to access, easy to use and transparent.
(6) A provider of a relevant service must include in the terms of service provisions which are easily accessible specifying the policies and processes that govern the handling and resolution of such complaints.
(7) If a person is the provider of more than one relevant service, the duties set out in this section apply in relation to each such service.
(8) The duties set out in this section extend only to the design, operation and use of a service in the United Kingdom, and references in this section to children are to children in the United Kingdom.
(9) A “relevant service” means—
(a) a Category 1 service (see section 86(10)(a));
(b) a Category 2A service (see section 86(10)(b));
(c) a Category 2B service (see section 86(10)(c)).
(10) In this section “parent”, in relation to a child, includes any person who is not the child’s parent but who—
(a) has parental responsibility for the child within the meaning of section 3 of the Children Act 1989 or Article 6 of the Children (Northern Ireland) Order 1995 (S.I. 1995/755 (N.I. 2)), or
(b) has parental responsibilities in relation to the child within the meaning of section 1(3) of the Children (Scotland) Act 1995.
(11) In the application of this section to a Category 2A service, references to the terms of service include references to a publicly available statement.”
Member’s explanatory statement
This amendment imposes new duties on providers of Category 1, 2A and 2B services to have a policy about disclosing information to the parents of deceased child users, and providing details about it in the terms of service or a publicly available statement.
Amendment 190 agreed.
Amendment 191
Moved by
191: After Clause 67, insert the following new Clause—
“OFCOM’s guidance about duties set out in section (Disclosure of information about use of service by deceased child users)
(1) OFCOM must produce guidance for providers of relevant services to assist them in complying with their duties set out in section (Disclosure of information about use of service by deceased child users).
(2) OFCOM must publish the guidance (and any revised or replacement guidance).
(3) In this section “relevant service” has the meaning given by section (Disclosure of information about use of service by deceased child users).”
Member’s explanatory statement
This amendment requires OFCOM to give guidance to providers about the new duties imposed by the other Clause proposed after Clause 67 in my name.
Amendment 191A (to Amendment 191) not moved.
Amendment 191 agreed.
Schedule 8: Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendments 192 to 203
Moved by
192: Schedule 8, page 212, line 26, leave out “and relevant content” and insert “, relevant content and content to which section 12(2) applies”
Member’s explanatory statement
This amendment adds a reference to content to which section 12(2) applies (content to which certain user empowerment duties apply) to paragraph 1 of the transparency reporting Schedule, which allows OFCOM to require providers of user-to-user services to include information in their transparency reports about the incidence of content.
193: Schedule 8, page 212, line 28, leave out “and relevant content” and insert “, relevant content and content to which section 12(2) applies”
Member’s explanatory statement
This amendment adds a reference to content to which section 12(2) applies to paragraph 2 of the transparency reporting Schedule, which allows OFCOM to require providers of user-to-user services to include information in their transparency reports about the dissemination of content.
194: Schedule 8, page 212, line 31, leave out “or relevant content” and insert “, relevant content or content to which section 12(2) applies”
Member’s explanatory statement
This amendment adds a reference to content to which section 12(2) applies to paragraph 3 of the transparency reporting Schedule, which allows OFCOM to require providers of user-to-user services to include information in their transparency reports about the number of users encountering content.
195: Schedule 8, page 212, line 33, after “The” insert “formulation, development, scope and”
Member’s explanatory statement
This amendment allows OFCOM to require providers of user-to-user services to include information in their transparency report about the formulation, development and scope of their terms of service (as well as the application of the terms of service).
196: Schedule 8, page 213, line 5, at end insert—
“8A_ The design and operation of algorithms which affect the display, promotion, restriction or recommendation of illegal content, content that is harmful to children, relevant content or content to which section 12(2) applies.”
Member’s explanatory statement
This amendment makes it clear that OFCOM can require providers of user-to-user services to include information in their transparency report about algorithms, as mentioned in this new paragraph.
197: Schedule 8, page 213, line 16, at end insert—
“12A_ Measures taken or in use by a provider to comply with any duty set out in section (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by the Clause proposed after Clause 67 in my name.
198: Schedule 8, page 214, line 3, after “The” insert “formulation, development, scope and”
Member’s explanatory statement
This amendment allows OFCOM to require providers of search services to include information in their transparency report about the formulation, development and scope of their public statements of policies and procedures (as well as the application of those statements).
199: Schedule 8, page 214, line 15, at end insert—
“24A_ The design and operation of algorithms which affect the display, promotion, restriction or recommendation of illegal search content or search content that is harmful to children.”
Member’s explanatory statement
This amendment means that OFCOM can require providers of search services to include information in their transparency report about algorithms, as mentioned in this new paragraph.
200: Schedule 8, page 214, line 22, at end insert—
“26A_ Measures taken or in use by a provider to comply with any duty set out in section (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment means that OFCOM can require providers of search services to include information in their transparency report about measures taken to comply with the new duties imposed by the Clause proposed after Clause 67 in my name.
201: Schedule 8, page 215, line 9, leave out “to 3” and insert “to 3A”
Member’s explanatory statement
This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
202: Schedule 8, page 215, line 25, leave out from “(2),” to “and” in line 26 and insert “indicates (in whatever words) that the presence of content of that kind is prohibited on the service or that users’ access to content of that kind is restricted,”
Member’s explanatory statement
This amendment makes a change to the definition of “relevant content” which applies for the purposes of the transparency reporting Schedule. The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
203: Schedule 8, page 215, line 34, at end insert—
“(4) The reference in sub-paragraph (1) to users’ access to content being restricted is to be construed in accordance with sections 52 and 211(5).”
Member’s explanatory statement
This technical amendment makes it clear that the reference to users’ access to content being restricted in the transparency reporting Schedule has the meaning given to it in Part 3 of the Bill.
Amendments 192 to 203 agreed.
Amendment 204 not moved.
Clause 70: “Pornographic content”, “provider pornographic content”, “regulated provider pornographic content”
Amendments 205 to 209
Moved by
205: Clause 70, page 66, line 42, leave out subsection (2)
Member’s explanatory statement
This amendment is consequential on the amendment to Clause 211 in my name adding a definition of “pornographic content” to that Clause.
206: Clause 70, page 67, leave out lines 4 to 6 and insert “, including pornographic content published or displayed on the service by means of—
(a) software or an automated tool or algorithm applied by the provider or by a person acting on behalf of the provider, or
(b) an automated tool or algorithm made available on the service by the provider or by a person acting on behalf of the provider.”
Member’s explanatory statement
This amendment is about what counts as “provider pornographic content” for the purposes of Part 5 of the Bill. Words are added to expressly cover the case where an automated tool or algorithm is made available on the service by a provider, such as a generative AI bot.
207: Clause 70, page 67, line 8, leave out from “than” to end of line 10 and insert “content within subsection (4A) or (4B).”
Member’s explanatory statement
This amendment is related to the next amendment in my name which inserts new subsection (4A) into Clause 70. The change is to the scope of what it means for content to consist only of text.
208: Clause 70, page 67, line 10, at end insert—
“(4A) Content is within this subsection if it—
(a) consists only of text, or
(b) consists only of text accompanied by—
(i) a GIF which is not itself pornographic content,
(ii) an emoji or other symbol, or
(iii) a combination of content mentioned in sub-paragraphs (i) and (ii).
(4B) Content is within this subsection if it consists of a paid-for advertisement (see section 211).”
Member’s explanatory statement
This amendment clarifies the scope of the exemption from the Part 5 duties for content which consists only of text. Such content does not count as regulated provider pornographic content.
209: Clause 70, page 67, line 20, at end insert “and
(iii) references to pornographic content that is generated on the service by means of an automated tool or algorithm in response to a prompt by a user and is only visible or audible to that user (no matter for how short a time);”
Member’s explanatory statement
This amendment makes it clear that, for the purposes of Part 5 (provider pornography), content is within scope of the duties if it is AI-generated content.
Amendments 205 to 209 agreed.
Clause 72: Duties about regulated provider pornographic content
Amendments 210 to 214
Moved by
210: Clause 72, page 68, line 18, leave out subsection (2) and insert—
“(2) A duty to ensure, by the use of age verification or age estimation (or both), that children are not normally able to encounter content that is regulated provider pornographic content in relation to the service.
(2A) The age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.”
Member’s explanatory statement
This amendment requires providers within scope of Part 5 to use highly effective age verification or age estimation (or both) to comply with the duty in Clause 72(2) (preventing children from encountering provider pornographic content).
211: Clause 72, page 68, line 21, leave out “A” and insert “In relation to the duty set out in subsection (2), a”
Member’s explanatory statement
This amendment is a technical change relating to the preceding amendment in my name.
212: Clause 72, page 68, line 23, leave out paragraph (a) and insert—
“(a) the kinds of age verification or age estimation used, and how they are used, and”
Member’s explanatory statement
This amendment requires Part 5 providers to keep a written record about the age verification or age estimation measures they use to comply with the duty in Clause 72(2).
213: Clause 72, page 68, line 25, leave out from “on” to “has” in line 26 and insert “the kinds of age verification or age estimation and how they should be used,”
Member’s explanatory statement
This amendment is consequential on the preceding amendment in my name.
214: Clause 72, page 68, line 31, at end insert—
“(4) A duty to summarise the written record in a publicly available statement, so far as the record concerns compliance with the duty set out in subsection (2), including details about which kinds of age verification or age estimation a provider is using and how they are used.”
Member’s explanatory statement
This amendment requires Part 5 providers to make publicly available a summary of the age verification or age estimation measures used to comply with the duty in Clause 72(2), and how they are used.
Amendments 210 to 214 agreed.
Clause 73: OFCOM’s guidance about duties set out in section 72
Amendment 215
Moved by
215: Clause 73, page 68, line 36, leave out from “of” to end of line 37 and insert “kinds and uses of age verification and age estimation that are, or are not, highly effective at correctly determining whether or not a particular user is a child,”
Member’s explanatory statement
This amendment requires OFCOM’s guidance about the duty in Clause 72(2) to give examples of kinds and uses of age verification and age estimation that are, or are not, highly effective at determining whether or not a user is a child.
Amendment 215 agreed.
Amendment 216
Moved by
216: Clause 73, page 68, line 43, at end insert—
“(2A) The guidance may elaborate on the following principles governing the use of age verification or age estimation for the purpose of compliance with the duty set out in section 72(2)—
(a) the principle that age verification or age estimation should be easy to use;
(b) the principle that age verification or age estimation should work effectively for all users regardless of their characteristics or whether they are members of a certain group;
(c) the principle of interoperability between different kinds of age verification or age estimation.
(2B) The guidance may refer to industry or technical standards for age verification or age estimation (where they exist).”
Member’s explanatory statement
This amendment sets out principles about age verification or age estimation, which are relevant to OFCOM’s guidance to providers about their duty in Clause 72(2).
Amendment 217 (to Amendment 216) not moved.
Amendment 216 agreed.
Clause 156: Consultation and parliamentary procedure
Amendment 218 not moved.
Clause 157: Directions about advisory committees
Amendment 218A not moved.
Clause 158: Directions in special circumstances
Amendment 218B
Moved by
218B: Clause 158, page 139, line 5, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
My Lords, the amendments in this group relate to provisions for media literacy in the Bill and Ofcom’s existing duty on media literacy under Section 11 of the Communications Act 2003. I am grateful to noble Lords from across your Lordships’ House for the views they have shared on this matter, which have been invaluable in helping us draft the amendments.
Media literacy remains a key priority in our work to tackle online harms; it is essential not only for keeping people safe online but for helping them understand how to make informed decisions which enhance their experience of the internet. Extensive work is currently being undertaken in this area. Under Ofcom’s existing duty, the regulator has initiated pilot work to promote media literacy. It is also developing best practice principles for platform-based media literacy measures and has published guidance on how to evaluate media literacy programmes.
While we believe that the Communications Act provides Ofcom with sufficient powers to undertake an ambitious programme of media literacy activity, we have listened to the concerns raised by noble Lords and understand the desire to ensure that Ofcom is given media literacy objectives which are fit for the digital age. We have therefore tabled the following amendments seeking to update Ofcom’s statutory duty to promote media literacy, in so far as it relates to regulated services.
Amendment 274B provides new objectives for Ofcom to meet in discharging its duty. The first objective requires Ofcom to take steps to increase the public’s awareness and understanding of how they can keep themselves and others safe when using regulated services, including building the public’s understanding of the nature and impact of harmful content online, such as disinformation and misinformation. To meet that objective, Ofcom will need to carry out, commission or encourage the delivery of activities and initiatives which enhance users’ media literacy in these ways.
It is important to note that, when fulfilling this new objective, Ofcom will need to increase the public’s awareness of the ways in which they can protect groups that disproportionately face harm online, such as women and girls. The updated duty will also compel Ofcom to encourage the development and use of technologies and systems that support users of regulated services to protect themselves and others. Ofcom will be required to publish a statement recommending ways in which others, including platforms, can take action to support their users’ media literacy.
Amendment 274C places a new requirement on Ofcom to publish a strategy setting out how it will fulfil its media literacy functions under Section 11, including the new objectives. Ofcom will be required to update this strategy every three years and report on progress made against it annually to provide assurance that it is fulfilling its duty appropriately. These reports will be supported by the post-implementation review of the Bill, which covers Ofcom’s media literacy duty in so far as it relates to regulated services. This will provide a reasonable point at which to establish the impact of Ofcom’s work, having given it time to take effect.
I am confident that, through this updated duty, Ofcom will be empowered to ensure that internet users become more engaged with media literacy and, as a result, are safer online. I hope that these amendments will find support from across your Lordships’ House, and I beg to move.
My Lords, I welcome this proposed new clause on media literacy and support the amendments in the names of the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth. I will briefly press the Minister on two points. First, proposed new subsection (1C) sets out how Ofcom must perform its duty under proposed new subsection (1A), but it does not explicitly require Ofcom to work in partnership with existing bodies already engaged in and expert in provision of these kinds of activities. The potential for Ofcom to commission is explicit, but this implies quite a top-down relationship, not a collaboration that builds on best practice, enables scale-up where appropriate and generally avoids reinventing wheels. It seems like a wasted opportunity to fast-track delivery of effective programmes through partnership.
My second concern is that there is no explicit requirement to consider the distinct needs of specific user communities. In particular, I share the concerns of disability campaigners and charities that media literacy activities and initiatives need to take into account the needs of people with learning disabilities, autism and mental capacity issues, both in how activities are shaped and in how they are communicated. This is a group of people who have a great need to go online and engage, but we also know that they are at greater risk online. Thinking about how media literacy can be promoted, particularly among learning disability communities, is really important.
The Minister might respond by saying that Ofcom is already covered by the public sector equality duty and so is already obliged to consider the needs of people with protected characteristics when designing and implementing policies. But the unfortunate truth is that the concerns of the learning disability community are an afterthought in legislation compared with other disabilities, which are already an afterthought. The Petitions Committee in the other place, in its report on online abuse and the experience of disabled people, noted that there are many disabled people around the country with the skills and experience to advise government and its bodies but that there is a general unwillingness to engage directly with them. They are often described as hard to reach, which is kind of ironic because in fact most of these people use multiple services and so are very easy to reach: they are on lots of databases and in contact with government bodies all the time.
The Minister may also point out that Ofcom’s duties in the Communications Act require it to maintain an advisory committee on elderly and disabled persons that includes
“persons who are familiar with the needs of persons with disabilities”.
But referring to an advisory committee is not the same as consulting people with disabilities, both physical and mental, and it is especially important to consult directly with people who may have difficulty understanding what is being proposed. Talking to people directly, rather than through an advisory committee, is very much the goal.
Unlike the draft Bill, which had media literacy as a stand-alone clause, the intention in this iteration is to deal with the issue by amending the Communications Act. It may be that, in the web of interactions between those two pieces of legislation, my concerns can be set at rest. But I would find it very helpful if the Minister could confirm today that the intention is that media literacy programmes will be developed in partnership with—and build on the best practice of—those organisations already delivering in this space, and that the organisations Ofcom collaborates with will be fully inclusive of all communities, including those with disabilities and learning disabilities. Only in this way can we be confident that media literacy programmes will meet their needs effectively, both in content and in how they are communicated.
Finally, can the Minister confirm whether Ofcom considers people with lived experience of disability as subject matter experts on disability for the purpose of fulfilling its consultation duties? I asked this question during one of the helpful briefing sessions during the Bill’s progress earlier this year, but I did not get an adequate answer. Can the Minister clarify that for the House today?
My Lords, I want to look at how the Government’s expansion of Ofcom’s duties to prioritise media literacy has become linked to this group, and at the way in which Amendment 274B frames it. It is very much linked with misinformation and disinformation. According to the amendment, there has to be an attempt to establish
“accuracy and authenticity of content”
and to
“understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it”.
I was wondering about reducing users’ exposure to misinformation and disinformation. That gives me pause, because I worry that reducing exposure will obviously mean the removal or censorship of material. I just want to probe some assumptions. Is it the presumption that incorrect or seemingly untrue or erroneous information is the predominant cause of real harm if it is not suppressed? Is there not a risk of harm in suppressing ideas too? Apart from the fact that heretical scientific and political theories were historically seen as misinformation and now are conventional wisdom, is there a danger that suppression in the contemporary period would create mistrust and encourage conspiratorial thinking—people saying, “What have you got to hide?”—and so on?
I want to push this by probing Amendment 269AA in the name of the noble Lord, Lord Clement-Jones, which itself is a probing amendment as to why Ofcom’s misinformation and disinformation committee is not required to consider the provenance of information to help empower users to understand whether content is real or true and so on, rather than the wording at the moment, “accuracy and authenticity”. When I saw the word “provenance”, I stopped for a moment. In all the debates going on in society about misinformation and disinformation, excellent provenance cannot necessarily guarantee truth.
I was shocked to discover that the then Wellcome Trust director, Jeremy Farrar, who is now the chief scientist at the World Health Organization, claimed that the Wuhan lab-leak and man-made theories around Covid were highly improbable. We now know that there were emails from Jeremy Farrar—I was shocked because I am a great fan of the Wellcome Trust and Jeremy Farrar’s work in general—in which there was a conscious bending of the truth that led to the editing of a scientific paper and a letter in the Lancet that proved to have been spun in a way to give wrong information. When issues such as the Wuhan lab leak were raised by Matt Ridley, recently of this parish—I do not know whether his provenance would count—they were dismissed as some kind of racist conspiracy theory. I am just not sure that it is that clear that you can get provenance right. We know from the Twitter files that the Biden Administration leaned on social media companies to suppress the Hunter Biden laptop story that was in the New York Post, which was described as Russian disinformation. We now know that it was true.
Therefore, I am concerned that this amendment, which in a well-meaning way says that we should have better media literacy, should not give in to these lazy labels of disinformation and misinformation, as if we all know what the truth is and all we need is fact-checkers, provenance and authenticity. Disinformation and misinformation have been weaponised, and that can cause some serious problems.
Can the Minister clarify whether the clause on media literacy is a genuine, positive attempt at encouraging people to know more, or itself becomes part of an information war that is going on offline and which will not help users at all but only confuse things?
My Lords, I will speak to the government Amendments 274B and 274C. I truly welcome a more detailed approach to Ofcom’s duties in relation to media literacy. However, as is my theme today, I raise two frustrations. First, having spent weeks telling us that it is impossible to include harms that go beyond content and opposing amendments on that point, the Government’s media literacy strategy includes a duty to help users to understand the harmful ways in which regulated services may be used. This is in addition to understanding the nature and impact of harmful content. It appears to suggest that it is the users who are guilty of misuse of products and services rather than putting any emphasis on the design or processes that determine how a service is most often used.
I believe that all of us, including children, are participants in creating an online culture and that educating and empowering users of services is essential. However, it should not be a substitute for designing a service that is safe by design and default. To make my point absolutely clear, I recount the findings of researchers who undertook workshops in 28 countries with more than 1,000 children. The researchers were at first surprised to find that, whether in Kigali, São Paulo or Berlin, to an overwhelming extent children identified the same problems online—harmful content, addiction, lack of privacy and so on. The children’s circumstances were so vastly different—country and town, Africa and the global north et cetera—but when the researchers did further analysis, they realised that the reason why they had such similar experiences was that they were using the same products. The products were more determining of the outcome than anything to do with religion, education, status, age, the family or even the country. The only other factor that loomed large, which I admit that the Government have recognised, was gender. Those were the two most crucial findings. It is an abdication of adult responsibility to place the onus on children to keep themselves safe. The amendment and the Bill, as I keep mentioning, should focus on the role of design, not on how a child uses a service.
My second point, which is of a similar nature, is that I am very concerned that a lot of digital literacy—for adults as well as children, but my particular concern is in schools—is provided by the tech companies themselves. Therefore, once again their responsibility, their role in the system and process of what children might find from reward loops, algorithms and so on, is very low down on the agenda. Is it possible at this late stage to consider that Ofcom might have a responsibility to consider the system design as part of its literacy review?
My Lords, this has been a very interesting short debate. Like other noble Lords, I am very pleased that the Government have proposed the new clauses in Amendments 274B and 274C. The noble Baroness, Lady Bull, described absolutely the importance of media literacy, particularly for disabled people and for the vulnerable. This is really important for them. It is important also not to fall into the trap described by the noble Baroness, Lady Kidron, of saying, “You are a child or a vulnerable person. You must acquire media literacy—it’s your obligation; it’s not the obligation of the platforms to design their services appropriately”. I take that point, but it does not mean that media literacy is not extraordinarily important.
However, sadly, I do not believe that the breadth of the Government’s new media literacy amendments is as wide as the original draft Bill. If you look back at the draft Bill, that was a completely new and upgraded set of duties right across the board, replacing Section 11 of the Communications Act and, in a sense, fit for the modern age. The Government have made a media literacy duty which is much narrower. It relates only to regulated services. This is not optimum. We need something broader which puts a bigger and broader duty for the future on to Ofcom.
It is also deficient in two respects. The noble Lord, Lord Knight, will speak to his amendments, but it struck me immediately when looking at that proposed new clause that we were missing all the debate about functionalities and so on that the noble Baroness, Lady Kidron, debated the other day, regarding design, and that we must ensure that media literacy encompasses understanding the underlying functionalities and systems of the platforms that we are talking about.
I know that your Lordships will be very excited to hear that I am going to refer again to the Joint Committee. I know that the Minister has read our report from cover to cover, but at paragraph 381 of the report on the draft Bill we said, and it is still evergreen:
“If the Government wishes to improve the UK’s media literacy to reduce online harms, there must be provisions in the Bill to ensure media literacy initiatives are of a high standard. The Bill should empower Ofcom to set minimum standards for media literacy initiatives that both guide providers and ensure the information they are disseminating aligns with the goal of reducing online harm”.
I had a very close look at the clause. I could not see that Ofcom is entitled to set minimum standards. The media literacy provisions sadly are deficient in that respect.
I am not surprised that my noble friend refers to his experience on the Joint Committee. He will not be surprised that I am about to refer to my experience on the Puttnam committee in 2003, which recommended media literacy as a priority for Ofcom. The sad fact is that media literacy was put on the back burner by Ofcom for almost 20 years. Listening to this debate, I think that my noble friend is quite right to accuse the Government, hard as the Minister has tried, of a paucity of ambition and—more than that—of letting us slip into the same mistake made by Ofcom after 2003 and allowing this to be a narrow, marginal issue. The noble Baroness, Lady Kidron, has reminded us time and again that unless we educate those who are using these technologies, these abuses will proliferate.
Therefore, with what my noble friend is advocating and what we will keep an eye on as the Bill is implemented—and I now literally speak over the Minister’s head, to the Member behind—Ofcom must take media literacy seriously and be a driving force in its implementation, for the very reasons that the noble Baroness, Lady Fox, referred to. We do not want everybody protected by regulations and powers—we want people protected by their own knowledge of what they are dealing with. This is where there is a gap between what has been pressed on the Government and what they are offering.
My Lords, I thank my noble friend very much for that intervention.
My Lords, I remind the House that, as we are on Report, interventions on current speakers should be for direct questions or points of elucidation.
I am sure my noble friend with 30 years’ experience stands duly corrected. He has reminded us that we have 20 years’ experience of something being on the statute book without the powers and duties it contains really being cranked up or Ofcom being given appropriate resources in the media literacy area. If that was true offline—the original 2003 duty—we know that it is even more important to have these media literacy duties in place online. I very much hope that the Minister can give us, in a sense, a token of earnest—that it is not just about putting these duties on the statute book but about giving Ofcom the resources to follow this up. Of course, it is also relevant to other regulators, which was partly the reason for having a duty of co-operation. Perhaps he will also, at the same time, describe how regulators such as Ofsted will have a role in media literacy.
I shall briefly talk about Amendment 269AA to Clause 141, which is the clause in the Bill setting up the advisory committee on misinformation and disinformation. I heard very clearly what the noble Baroness, Lady Fox, had to say, and I absolutely agree—there is no silver bullet in all this. Establishing provenance is but one way in which to get greater transparency and authentication and exercise judgment; it is not the complete answer, but it is one way of getting to grips more with some of the information coming through online. She may have seen that this is an “and” rather than an “or”, which is why the amendment is phrased as it is.
Of course, it is really important that there are initiatives, and we need to use the power of them. The one that I want to mention today about provenance is the Content Authenticity Initiative, which I mentioned in Committee. It is a global coalition working to increase transparency in digital content through open industry standards; it was founded four years ago and has more than 1,500 members, including major companies such as Adobe, Microsoft, NVIDIA, Arm and Intel—I could go on. I very much hope that Ofcom will engage with the Content Authenticity Initiative, whatever the content of the Bill. In a sense, I am raising the issue for the Minister to give us assurances that this is within the scope of what the committee will be doing—that it is not just a question of doing what is in the Bill, and that this will be included in the scope of the advisory committee’s work.
The Content Authenticity Initiative is an industry-led initiative that has developed content credentials, which encode important metadata into pieces of content. That information resides indefinitely in the content, wherever it is used, published or stored, and, as a result, viewers are able to make more informed decisions about whether or not to trust the content. The advisory committee really should consider the role of provenance tools such as content credentials in enabling users to have the relevant information to decide what is real and what is disinformation or misinformation online. That would entirely fit the strategy of this Bill to empower adult users.
My Lords, the Government have moved on this issue, and I very much welcome that. I am grateful to the Minister for listening and for the fact that we now have Section 11 of the Communications Act being brought into the digital age through the Government’s Amendments 274B and 274C. The public can now expect to be informed and educated about content-related harms, reliability and accuracy; technology companies will have to play their part; and Ofcom will have to regularly report on progress, and will commission and partner with others to fulfil those duties. That is great progress.
The importance of this was underscored at a meeting of the United Nations Human Rights Council just two weeks ago. Nada Al-Nashif, the UN Deputy High Commissioner for Human Rights, said in an opening statement that media and digital literacy empowered individuals and
“should be considered an integral part of education efforts”.
Tawfik Jelassi, the assistant director-general of UNESCO, in a statement attached to that meeting, said that
“media and information literacy was essential for individuals to exercise their right to freedom of opinion and expression”—
I put that in to please the noble Baroness, Lady Fox—and
“enabled access to diverse information, cultivated critical thinking, facilitated active engagement in public discourse, combatted misinformation, and safeguarded privacy and security, while respecting the rights of others”.
If only the noble Lord, Lord Moylan, were in his place to hear me use the word “privacy”. He continued:
“Together, the international community could ensure that media and information literacy became an integral part of everyone’s lives, empowering all to think critically, promote digital well-being, and foster a more inclusive and responsible global digital community”.
I thought those were great words, summarising why we needed to do this.
I am grateful to Members on all sides of the House for the work that they have done on media literacy. Part of my reason for repeating those remarks is that this is so much more about empowerment than it is about loading safety on to individuals, as the noble Baroness, Lady Kidron, rightly said in her comments.
Nevertheless, we want the Minister to reflect on a couple of tweaks. Amendment 269C in my name is around an advisory committee being set up within six months and in its first report assessing the need for a code on misinformation. I have a concern that, as the regime that we are putting in place with this Bill comes into place and causes some of the harmful content that people find engaging to be suppressed, the algorithms will go to something else that is engaging, and that something else is likely to be misinformation and disinformation. I have a fear that that will become a growing problem that the regulator will need to be able to address, which is why it should be looking at this early.
Incidentally, that is why the regulator should also look at provenance, as in Amendment 269AA from the noble Lord, Lord Clement-Jones. It was tempting in listening to him to see whether there was an AI tool that could trawl across all the comments that he has made during the deliberations on this Bill to see whether he has quoted the whole of the joint report—but that is a distraction.
My Amendment 269D goes to the need for media literacy on systems, processes and business models, not just on content. Time and again, we have emphasised the need for this Bill to be as much about systems as content. There are contexts where individual, relatively benign pieces of content can magnify if part of a torrent that then creates harm. The Mental Health Foundation has written to many of us to make this point. In the same way that the noble Baroness, Lady Bull, asked about ensuring that those with disability have their own authentic voice heard as these media literacy responsibilities are played out, so the Mental Health Foundation wanted the same kind of involvement from young people; I agree with both. Please can we have some reassurance that this will be very much part of the literacy duties on Ofcom and the obligations it places on service providers?
My Lords, I am grateful to noble Lords for their comments, and for the recognition from the noble Lord, Lord Knight, of the changes that we have made. I am particularly grateful to him for having raised media literacy throughout our scrutiny of this Bill.
His Amendments 269C and 269D seek to set a date by which the establishment of the advisory committee on misinformation and disinformation must take place and to set requirements for its first report. Ofcom recognises the valuable role that the committee will play in providing advice in relation to its duties on misinformation and disinformation, and has assured us that it will aim to establish the committee as soon as is reasonably possible, in recognition of the threats posed by misinformation and disinformation online.
Given the valuable role of the advisory committee, Ofcom has stressed how crucial it will be to have appropriate time to appoint the best possible committee. Seeking to prescribe a timeframe for its implementation risks impeding Ofcom’s ability to run the thorough and transparent recruitment process that I am sure all noble Lords want and to appoint the most appropriate and expert members. It would also not be appropriate for the Bill to be overly prescriptive on the role of the committee, including with regard to its first report, in order for it to maintain the requisite independence and flexibility to give us the advice that we want.
Amendment 269AA from the noble Lord, Lord Clement-Jones, seeks to add advice on content provenance to the duties of the advisory committee. The new media literacy amendments, which update Ofcom’s media literacy duties, already include a requirement for Ofcom to take steps to help users establish the reliability, accuracy and authenticity of content found on regulated services. Ofcom will have duties and mechanisms to be able to advise platforms on how they can help users to understand whether content is authentic; for example, by promoting tools that assist them to establish the provenance of content, where appropriate. The new media literacy duties will require Ofcom to take tangible steps to prioritise the public’s awareness of and resilience to misinformation and disinformation online. That may include enabling users to establish the reliability, accuracy and authenticity of content, but the new duties will not remove content online; I am happy to reassure the noble Baroness, Lady Fox, on that.
The advisory committee is already required under Clause 141(4)(c) to advise Ofcom on its exercise of its media literacy functions, including its new duties relating to content authenticity. The Bill does not stipulate what tools service providers should use to fulfil their duties, but Ofcom will have the ability to recommend in its codes of practice that companies use tools such as provenance technologies to identify manipulated media which constitute illegal content or content that is harmful to children, where appropriate. Ofcom is also required to take steps to encourage the development and use of technologies that provide users with further context about content that they encounter online. That could include technologies that support users to establish content provenance. I am happy to reassure the noble Lord, Lord Clement-Jones, that the advisory committee will already be required to advise on the issues that he has raised in his amendment.
On media literacy more broadly, Ofcom retains its overall statutory duty to promote media literacy, which remains broad and non-prescriptive. The new duties in this Bill, however, are focused specifically on harm; that is because of the nature of the Bill, which seeks to make the UK the safest place in the world to be online and is necessarily focused on tackling harms. To ensure that Ofcom succeeds in the delivery of these new specific duties with regard to regulated services, it is necessary that the regulator has a clearly defined scope. Broadening the duties would risk overburdening Ofcom by making its priorities less clear.
The noble Baroness, Lady Bull—who has been translated to the Woolsack while we have been debating this group—raised media literacy for more vulnerable users. Under Ofcom’s existing media literacy programme, it is already delivering initiatives to support a range of users, including those who are more vulnerable online, such as people with special educational needs and people with disabilities. I am happy to reassure her that, in delivering this work, Ofcom is already working not just with expert groups including Mencap but with people with direct personal experiences of living with disabilities.
The noble Lord, Lord Clement-Jones, raised Ofsted. Effective regulatory co-ordination is essential for addressing the cross-cutting opportunities and challenges posed by digital technologies and services. Ofsted will continue to engage with Ofcom through its existing mechanisms, including engagement led by its independent policy team and meetings held with Ofcom’s online safety policy director. In addition, Ofsted is considering mechanisms through which it can work more closely with Ofcom where appropriate. These include sharing insights from inspections in an anonymised form, which could entail reviews of its inspection bases and focus groups with inspectors, on areas of particular concern to Ofcom. Ofsted is committed to working with Ofcom’s policy teams to work these plans up in more detail.
My Lords, could I ask the Minister a question? He has put his finger on one of the most important aspects of this Bill: how it will integrate with the Department for Education and all its responsibilities for schools. Again, talking from long experience, one of the worries is the silo mentality in Whitehall, which is quite often strongest in the Department for Education. Some real effort will be needed to make sure there is a crossover from the powers that Ofcom has to what happens in the classroom.
I hope what I have said about the way that Ofsted and Ofcom are working together gives the noble Lord some reassurance. He is right, and it is not just in relation to the Department for Education. In my own department, we have discussed in previous debates on media literacy the importance of critical thinking, equipping people with the sceptical, quizzical, analytic skills they need—which art, history and English literature do as well. The provisions in this Bill focus on reducing harm because the Bill is focused on making the UK the safest place to be online, but he is right that media literacy work more broadly touches on a number of government departments.
Amendment 274BA would require Ofcom to promote an understanding of how regulated services’ business models operate, how they use personal data and the operation of their algorithmic systems and processes. We believe that Ofcom’s existing duty under the Communications Act already ensures that the regulator can cover these aspects in its media literacy activities. The duty requires Ofcom to build public awareness of the processes by which material on regulated services is selected or made available. This enables Ofcom to address the platform features specified in this amendment.
The Government’s amendments include extensive new objectives for Ofcom, which apply to harmful ways in which a service is used as well as harmful content. We believe it important not to add further to this duty when the outcomes can already be achieved through the existing duty. We do not wish to limit, by implication, Ofcom’s media literacy duties in relation to other, non-regulated services.
We also judge that the noble Lord’s amendment carries a risk of confusing the remits of Ofcom and the Information Commissioner’s Office. UK data protection law already confers a right for people to be informed about how their personal data are being used, making this aspect of the amendment superfluous.
Amendment 274BB would direct Ofcom to set minimum standards for media literacy activities and initiatives. The only body that has duties in relation to media literacy is Ofcom itself and there would be no obligation for any other organisation to follow these standards. Ofcom could develop a voluntary standard for others but this could be achieved through proposed new subsection (1D), to be inserted by government Amendment 274B. This approach allows a flexibility that a requirement to set minimum standards would not. Rather than imposing a rigid set of standards, we are focusing on improving evaluation practices of media literacy initiatives to identify which measures are most effective and encourage their delivery.
Furthermore, Ofcom will be required to publish a statement recommending ways in which others, including platforms, can take action to support their users’ media literacy. Recommendations may include the development, pursuit and evaluation of activities or initiatives in relation to media literacy. This statement must also be published in a manner that Ofcom considers appropriate for bringing it to the attention of the persons who, in its opinion, are likely to be affected by it. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. It was published in February this year and has met with praise from practitioners, including those who have received grant funding through the Government’s non-legislative media literacy work programme.
Having listened to this helpful debate, I remain confident that the provisions we are proposing will tackle the challenges that noble Lords have raised.
I do not believe that the Minister has dealt with the minimum standards issue.
I did, but I do not think that the noble Lord was listening to that point.
Amendment 218B agreed.
Amendment 219
Moved by
219: Clause 158, leave out Clause 158
Member’s explanatory statement
This amendment would remove Clause 158 (Directions in special circumstances) from the Bill and is intended to further probe the Secretary of State’s power in this area.
My Lords, Clause 158 is one of the more mysterious clauses in the Bill and it would greatly benefit from a clear elucidation by the Minister of how it is intended to work to reduce harm. I thank him for having sent me an email this afternoon as we started on the Bill, for which I am grateful; I had only a short time to consider it but I very much hope that he will put its content on the record.
My amendment is designed to ask how the Minister envisages using the power to direct if, say, there is a new contagious disease or riots, and social media is a major factor in the spread of the problem. I am trying to construct some kind of hypothetical situation through which the Minister can say how the power will be used. Is the intention, for example, to set Ofcom the objective of preventing the spread of information on regulated services injurious to public health or safety on a particular network for six months? The direction then forces the regulator and the social media companies to confront the issue and perhaps publicly shame an individual company into using its tools to slow the spread of disinformation. The direction might give Ofcom powers to gather sufficient information from the company to make directions to the company to tackle the problem.
If that is envisaged, which of Ofcom’s media literacy powers does the Minister envisage being used? Might it be Section 11(1)(e) of the Communications Act 2003, which talks about encouraging
“the development and use of technologies and systems for regulating access to such material, and for facilitating control over what material is received, that are both effective and easy to use”.
By this means, Ofcom might encourage a social media company to regulate access to and control over the material that is a threat.
Perhaps the Minister could set out clearly how he intends all this to work, because on a straight reading of Clause 158, we on these Benches have considerable concerns. The threshold for direction is low—merely having
“reasonable grounds for believing that circumstances exist”—
and there is no sense here of the emergency that the then Minister, Mr Philp, cited in the Commons Public Bill Committee on 26 May 2022, nor even of the exceptional circumstances in Amendment 138 to Clause 39, which the Minister tabled recently. The Minister is not compelled by the clause to consult experts in public health, safety or national security. The Minister can set any objectives for Ofcom, it seems. There is no time limit for the effect of the direction and it seems that the direction can be repeatedly extended with no limit. If the Minister directs because they believe there is a threat to national security, we will have the curious situation of a public process being initiated for reasons the Minister is not obliged to explain.
Against this background, there does not seem to be a case for breaching the international convention of the Government not directing a media regulator. Independence of media regulators is the norm in developed democracies, and the UK has signed many international statements in this vein. As recently as April 2022, the Council of Europe stated:
“Media and communication governance should be independent and impartial to avoid undue influence on policymaking or”
the discriminatory and
“preferential treatment of powerful groups”,
including those with significant political or economic power. The Secretary of State, by contrast, has no powers over Ofcom regarding the content of broadcast regulation and has limited powers to direct over radio spectrum and wireless, but not content. Ofcom’s independence in day-to-day decision-making is paramount to preserving freedom of expression. There are insufficient safeguards in this clause, which is why I argue that it should not stand part of the Bill.
I will be brief about Clause 159 because, by and large, we went through it in our debate on a previous group. Now that we can see the final shape of the Bill, it really does behove us to stand back and see where the balance has settled on Ofcom’s independence and whether this clause needs to stand part of the Bill. The Secretary of State has extensive powers under various other provisions in the Bill. The Minister has tabled welcome amendments to Clause 39, which have been incorporated into the Bill, but Clause 155 still allows the Secretary of State to issue a “statement of strategic priorities”, including specific outcomes, every five years.
Clause 159 is in addition to this comprehensive list, but the approach in the clause is incredibly broad. We have discussed this, and the noble Lord, Lord Moylan, has tabled an amendment that would require parliamentary scrutiny. The Secretary of State can issue guidance to Ofcom on more or less anything encompassed by the exercise of its functions under this Act, with no consultation of the public or Parliament prior to making such guidance. The time limit for producing strategic guidance is three years rather than five. Even if it is merely “have regard” guidance, it represents an unwelcome intervention in Ofcom going about its business. If the Minister responds that the guidance is merely “to have regard”, I will ask him to consider this: why have it all, then, when there are so many other opportunities for the Government to intervene? For the regulated companies, it represents a regulatory hazard of interference in independent regulation and a lack of stability. As the noble Lord, Lord Bethell, said in Committee, a clear benefit of regulatory independence is that it reduces lobbying of the Minister by powerful corporate interests.
Now that we can see it in context, I very much hope that the Minister will agree that Clause 159 is a set of guidance too many that compromises Ofcom’s independence and should not stand part of the Bill.
My Lords, I will add to my noble friend’s call for us to consider whether Clause 158 should be struck from the Bill as an unnecessary power for the Secretary of State to take. We have discussed powers for the Secretary of State throughout the Bill, with some helpful improvements led by the noble Baroness, Lady Stowell. This one jars in particular because it is about media literacy; some of the other powers related to whether the Secretary of State could intervene on the codes of practice that Ofcom would issue. The core question is whether we trust Ofcom’s discretion in delivering media literacy and whether we need the Secretary of State to have any kind of power to intervene.
I single out media literacy because the clue is in the name: literacy is a generic skill that you acquire for dealing with the online world; it is not about any specific text. Literacy is a broader set of skills, yet Clause 158 suggests that, in response to specific forms of content or a specific crisis happening in the world, the Secretary of State would want to take this power to direct the media literacy efforts. To take something specific and immediate to direct something that is generic and long-term jars and seems inappropriate.
I have a series of questions for the Minister to elucidate why this power should exist at all. It would be helpful to have an example of what kind of “public statement notice”—to use the language in the clause—the Government might want to issue that Ofcom would not come up with on its own. Part of the argument we have been presented with is that, somehow, the Government might have additional information, but it seems quite a stretch that they could come up with that. In an area such as national security, my experience has been that companies often have a better idea of what is going on than anybody in government.
Thousands of people out there in the industry are familiar with APT 28 and APT 29, which, as I am sure all noble Lords know, are better known by their names Fancy Bear and Cozy Bear. These are agents of the Russian state that put out misinformation. There is nothing that UK agencies or the Secretary of State might know about them that is not already widely known. I remember talking about the famous troll factory run by Prigozhin, the Internet Research Agency, with people in government in the context of Russian interference—they would say “Who?” and have to go off and find out. In dealing with threats such as that, between the people in the companies and Ofcom you certainly want a media literacy campaign which tells you about these troll agencies and how they operate and gives warnings to the public, but I struggle to see why you need the Secretary of State to intervene as opposed to allowing Ofcom’s experts to work with company experts and come up with a strategy to deal with those kinds of threat.
The other example cited of an area where the Secretary of State might want to intervene is public health and safety. It would be helpful to be specific; had they had it, how would the Government have used this power during the pandemic in 2020 and 2021? Does the Minister have examples of what they were frustrated about and would have done with these powers that Ofcom would not do anyway in working with the companies directly? I do not see that they would have had secret information which would have meant that they had to intervene rather than trusting Ofcom and the companies to do it.
Perhaps there has been an interdepartmental workshop between DHSC, DCMS and others to cook up this provision. I assume that Clause 158 did not come from nowhere. Someone must have thought, “We need these powers in Clause 158 because we were missing them previously”. Are there specific examples of media literacy campaigns that could not be run, where people in government were frustrated and therefore wanted a power to offer it in future? It would be really helpful to hear about them so that we can understand exactly how the Clause 158 powers will be used before we allow this additional power on to the statute book.
In the view of most people in this Chamber, the Bill as a whole quite rightly grants the Government and Ofcom, the independent regulator, a wide range of powers. Here we are looking specifically at where the Government will, in a sense, overrule the independent regulator by giving it orders to do something it had not thought of doing itself. It is incumbent on the Government to flesh that out with some concrete examples so that we can understand why they need this power. At the moment, as noble Lords may be able to tell, these Benches are not convinced that they do.
My Lords, I will be very brief. The danger with Clause 158 is that it discredits media literacy, which might otherwise be seen as benign or anodyne; it will become a political plaything. I am already sceptical, but if ever there was anything to add to this debate then it is that.
I am very anxious about the notion that media literacy would be used in this way for public health or safety, as in the examples, because all my examples of where it has gone horribly wrong—through government politicisation or politicised interventions in social media companies—have been in the recent lockdowns and over Covid. I am very worried about that and will talk about it later. We have had “nudge units”, about which there have been all sorts of scandals, but I will not go on about them. There will be a real problem if this is offloaded on to Ofcom. If Ofcom is instructed to do something, the Government will effectively be interfering in what social media is allowed to say or do and in what people are to understand to be the truth. It will discredit media literacy.
The noble Lord, Lord Moylan, made a very good point in our last session. When I try to assess this, I understand that the Secretary of State is elected and that Ofcom is an unelected regulator, so in many ways it is more democratic that the Secretary of State should be openly politicised, but I am concerned that in this instance the Secretary of State will force the unelected Ofcom to do something that the Government will not do directly but will do behind the scenes. That is the danger. We will not even be able to see it correctly and it will emerge to the public as “media literacy” or something of that nature. That will obfuscate accountability even further. I have a lot of sympathy for the amendment to leave out this clause.
My Lords, I am grateful for the opportunity to set out the need for Clauses 158 and 159. The amendments in this group consider the role of government in two specific areas: the power for the Secretary of State to direct Ofcom about its media literacy functions in special circumstances and the power for the Secretary of State to issue non-binding guidance to Ofcom. I will take each in turn.
Amendment 219 relates to Clause 158, on the Secretary of State’s power to direct Ofcom in special circumstances. These include where there is a significant threat to public safety, public health or national security. This is a limited power to enable the Secretary of State to set specific objectives for Ofcom’s media literacy activity in such circumstances. It allows the Secretary of State to direct Ofcom to issue public statement notices to regulated service providers, requiring providers to set out the steps they are taking to address the threat. The regulator and online platforms are thereby compelled to take essential and transparent actions to keep the public sufficiently informed during crises. The powers ensure that the regulatory framework is future-proofed and well equipped to respond in such circumstances.
As the noble Lord, Lord Clement-Jones, outlined, I corresponded with him very shortly before today’s debate and am happy to set out a bit more detail for the benefit of the rest of the House. As I said to him by email, we expect the media literacy powers to be used only in exceptional circumstances, where it is right that the Secretary of State should have the power to direct Ofcom. The Government see the need for an agile response to risk in times of acute crisis, such as we saw during the Covid-19 pandemic or in relation to the war in Ukraine. There may be a situation in which the Government have access to information, through the work of the security services or otherwise, which Ofcom does not. This power enables the Secretary of State to make quick decisions when the public are at risk.
Our expectation is that, in exceptional circumstances, Ofcom would already be taking steps to address harm arising from the provision of regulated services through its existing media literacy functions. However, these powers will allow the Secretary of State to step in if necessary to ensure that the regulator is responding effectively to these sudden threats. It is important to note that, for transparency, the Secretary of State will be required to publish the reasons for issuing a direction to Ofcom in these circumstances. This requirement does not apply should the circumstances relate to national security, to protect sensitive information.
The noble Lord asked why we have the powers under Clause 158 when they do not exist in relation to broadcast media. We believe that these powers are needed with respect to social media because, as we have seen during international crises such as the Covid-19 pandemic, social media platforms can sadly serve as hubs for low-quality, user-generated information that is not required to meet journalistic standards, and that can pose a direct threat to public health. By contrast, Ofcom’s Broadcasting Code ensures that broadcast news, in whatever form, is reported with due accuracy and presented with due impartiality. Ofcom can fine, or ultimately revoke a licence to broadcast in the most extreme cases, if that code is breached. This means that regulated broadcasters can be trusted to strive to communicate credible, authoritative information to their audiences in a way that social media cannot.
We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Under another bit, however, the Secretary of State could say that, to deal with a crisis, they are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.
On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.
I am happy to make it clear, as I did on the last group, that the power does not allow Ofcom to require platforms to remove content, only to set out what they are doing in response to misinformation and disinformation—to require platforms to make a public statement about what they are doing to tackle it. In relation to regulating news providers, we have brought the further amendments forward to ensure that those subject to sanctions cannot avail themselves of the special provisions in the Bill. Of course, the Secretary of State will be mindful of the law when issuing directions in the exceptional circumstances that these clauses set out.
While the Minister is describing that, can he explain exactly which media literacy power would be invoked by the kind of example I gave when I was introducing the amendment and in the circumstances he has talked about? Would he like to refer to the Communications Act?
It depends on the circumstances. I do not want to give one example for fear of being unnecessarily restrictive. In relation to the health misinformation and disinformation we saw during the pandemic, an example would be the suggestions of injecting oneself with bleach; that sort of unregulated and unhelpful advice is what we have in mind. I will write to the noble Lord, if he wants, to see what provisions of the Communications Act we would want invoked in those circumstances.
In relation to Clause 159, which is dealt with by Amendment 222, it is worth setting out that the Secretary of State guidance and the statement of strategic priorities have distinct purposes and associated requirements. The purpose of the statement of strategic priorities is to enable the Secretary of State to specifically set out priorities in relation to online safety. For example, in the future, it may be that changes in the online experience mean that the Government of the day wish to set out their high-level overarching priorities. In comparison, the guidance allows for clarification of what Parliament and Government intended in passing this legislation—as I hope we will—by providing guidance on specific elements of the Bill in relation to Ofcom’s functions. There are no plans to issue guidance under this power but, for example, we are required to issue guidance to Ofcom in relation to the fee regime.
On the respective requirements, the statement of strategic priorities requires Ofcom to explain in writing what it proposes to do in consequence of the statement and to publish an annual review of what it has done, whereas Ofcom must merely “have regard” to the guidance, which does not itself create any statutory requirements.
This is a new regime and is different in its nature from other established areas of regulations, such as broadcasting. The power in Clause 159 provides a mechanism to provide more certainty, if that is considered necessary, about how the Secretary of State expects Ofcom to carry out its statutory functions. Ofcom will be consulted before guidance is issued, and there are checks on how often it can be issued and revised. The guidance document itself, as I said, does not create any statutory requirements, so Ofcom is required only to “have regard” to it.
This will be an open and transparent way to put forward guidance appropriately with safeguards in place. The independence of the regulator is not at stake here. The clause includes significant limitations on the power, and the guidance cannot fetter Ofcom’s operational independence. We feel that both clauses are appropriate for inclusion in the Bill, so I hope that the noble Lord will withdraw his amendment.
I thank the Minister for that more extended reply. It is a more reassuring response on Clause 159 than we have had before. On Clause 158, the impression I get is that the media literacy power is being used as a smokescreen for the Government telling social media what it should do, indirectly via Ofcom. That seems extraordinary. If the Government were telling the mainstream media what to do in circumstances like this, we would all be up in arms. However, it seems to be accepted as a part of the Bill and that we should trust the Government. The Minister used the phrase “special circumstances”. That is not the phraseology in the clause; it is that “circumstances exist”, and then it goes on to talk about national security and public health. The bar is very low.
I am sure everyone is getting hungry at this time of day, so I will not continue. However, we still have grave doubts about this clause. It seems an extraordinary indirect form of censorship which I hope is never invoked. In the meantime, I beg leave to withdraw my amendment.
Amendment 219 withdrawn.
Clause 159: Secretary of State’s guidance
Amendments 220 to 222 not moved.
Amendment 223 not moved.
Clause 161: Review
Amendment 224
Moved by
224: Clause 161, page 140, line 27, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
Clause 161 is about a review by the Secretary of State of the regulatory framework established by this Bill. This amendment inserts a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendment 224 agreed.
Consideration on Report adjourned until 8.40 pm.
Amendment 225
Moved by
225: After Clause 161, insert the following new Clause—
“Transparency of government representations to regulated service providers
(1) The Secretary of State must produce a report setting out any relevant representations His Majesty’s Government have made to providers of Part 3 services to tackle the presence of misinformation and disinformation on Part 3 services.
(2) In this section “relevant representations” are representations that could reasonably be considered to be intended to persuade or encourage a provider of a Part 3 service to—
(a) modify the terms of service of a regulated service in an effort to address misinformation or disinformation,
(b) restrict or remove a particular user’s access to accounts used by them on a regulated service, or
(c) take down, reduce the visibility of, or restrict access to content that is present or may be encountered on a regulated service.
(3) The first report must be laid before both Houses of Parliament within six months of this Act being passed.
(4) Subsequent reports must be laid before both Houses of Parliament at intervals not exceeding six months.
(5) The Secretary of State is not required by this section to include in the report information that the Secretary of State considers would be against the interests of national security.
(6) If the Secretary of State relies upon subsection (5) they must as soon as reasonably practicable send a report containing that information to the Intelligence and Security Committee of Parliament.”
Member’s explanatory statement
This amendment addresses government influence on content moderation, for example by way of initiatives like the Government’s Counter Disinformation Unit.
My Lords, continuing the rather radical approach of debating an amendment that has already been debated in Committee and has not just been introduced, and picking up on the theme of our debate immediately before we adjourned, I move an amendment that seeks to address the Government’s activities in interacting with providers when they seek to influence what is shown on providers’ sites.
It might be a matter of interest that according to the Daily Telegraph, which I implicitly trust, only on Tuesday of last week, a judge in Louisiana in the United States issued an injunction forbidding a lengthy list of White House officials from making contact with social media companies to report misinformation. I say this not because I expect the jurisprudence of the state of Louisiana to have any great influence in your Lordships’ House but simply to show how sensitive and important this issue is. The judge described what he had heard and seen as one of the greatest assaults on free speech in the history of the United States.
We are not necessarily quite in that territory, and nor does my amendment do anything so dramatic as to prevent the Government communicating with providers with a view to influencing their content, but Amendment 225 requires the Secretary of State to produce a report within six months of the passing of the Act, and every six months thereafter, in which he sets out
“any relevant representations His Majesty’s Government have made to providers”
that are
“intended to persuade or encourage a provider”
to do one of three things. One is to
“modify the terms of service of a regulated service in an effort to address misinformation or disinformation”;
one is to
“restrict or remove a particular user’s access to accounts used by them”;
and the third is to
“take down, reduce the visibility of, or restrict access to content that is present or may be encountered on a regulated service”.
None of these things would be prohibited or prevented by this amendment, but it would be required that His Majesty’s Government produce a report saying what they have done every six months.
Very importantly, there is an exception: there would be no obligation on the Secretary of State to disclose publicly any information that affected national security, but he would be required in that case to make a report to the Intelligence and Security Committee here in Parliament. As I said, this is a very sensitive subject, and remarks made by the noble Baroness, Lady Fox of Buckley, in the previous debate referred in particular to this subject in connection with the pandemic. While that is fresh in the memory, other topics may easily come up and need to be addressed, where the Government feel obliged to move and take action.
We know nothing about those contacts, because they are not instructions or actions taken under law. They are simply nudges, winks and phone conversations with providers that have an effect and, very often, the providers will act on them. Requiring the Government to make a report and say what they have done seems a modest, proportionate and appropriate means to bring transparency to this exercise, so that we all know what is going on.
I am happy to say that when this amendment was debated in Committee, it found widespread support from around the House. I hope to find that that support is still solid and strong, such that my noble friend, perhaps as a modest postprandial bonus, will be willing, for a change, to accept something proposed by a colleague from his own Benches, so that we can all rejoice as we go into a very long night. I beg to move.
My Lords, I put my name to this very important amendment—all the more important because of the previous discussions we have had about the difficulties around misinformation or potential government interference in decisions about what is online and what is not online. The noble Lord, Lord Moylan, is right to indicate that this is a very modest and moderate amendment; it addresses the problems of government influence or government moderation, or at least allows those of us who are concerned about it to keep our eye on it and make sure that the country and Parliament know what is going on.
The original idea of disinformation came from an absolutely rightful concern about foreign disinformation between states. People were rightly concerned about security; we all should be, and nobody wants to be taken in, in that way. But there has been a worry, in a wide range of countries, that agencies designed to combat those threats increasingly turn inward against the public. Although that might not be exactly what has happened in the UK, we should note that Meta CEO Mark Zuckerberg recently admitted that the US Government asked Facebook to suppress true information. In a recent interview, he said that the scientific establishment
“asked for a bunch of things to be censored that, in retrospect, ended up being more debatable or true”.
We should all be concerned about this. It is not just a matter for those of us who are worried about free speech or raise the issue. If we are genuinely worried about misinformation or fake news, we have to make sure that we are equally concerned if it comes from other sources, not just from malign players.
The noble Lord, Lord Moylan, mentioned the American court case Missouri v Biden. In his 155-page ruling, Judge Doughty depicted quite a dystopian scene when he said that, during the pandemic, the US Government seem
“to have assumed a role similar to an Orwellian ‘Ministry of Truth’”.
I do not think we want to emulate the worst of what is happening in the US here.
The judge there outlined a huge complex of government agencies and officials connected with big tech and an army of bureaucrats hired to monitor websites and flag and remove problematic posts. It is not like that in the UK, but some of us were quite taken aback to discover that the Government ran a counter-disinformation policy forum during the lockdown, which brought tech giants together to discuss how to deal with Covid misinformation, as it was said. There was a worry about political interference then.
I do not think that this is just paranoia. Since then, Big Brother Watch and its investigative work have shown that the UK Government had a secret unit that worked with social media companies to monitor and prevent speech critical of Covid lockdown policies, in the shape of the Counter Disinformation Unit, which was set up by Ministers to deal with groups and individuals who criticised policies such as lockdowns, school closures, vaccine mandates or what have you.
Like the noble Lord, Lord Moylan, I do not want to get stuck on what happened during lockdown. That was an exceptional, extreme situation. None the less, the Counter Disinformation Unit—which works out of the Minister’s own department, the DCMS—is still operating. It seems to be able to get content fast-tracked for possible moderation by social media firms such as Facebook and Twitter. It used an AI firm to search social media posts—we need to know the details of that.
I think, therefore, that the transparency which the Government and the Minister have constantly stressed as hugely important for the credibility of the Bill must extend to the likes of the Counter Disinformation Unit and to any government attempts at interfering in what we are allowed to see, read or have access to online.
My Lords, the noble Lord, Lord Moylan, and the noble Baroness, Lady Fox, have a very strong point to make with this amendment. I have tried in our discussions to bring some colour to the debate from my own experience so I will tell your Lordships that in my former professional life I received representations from many Ministers in many countries about the content we should allow or disallow on the Facebook platform that I worked for.
That was a frequent occurrence in the United Kingdom and extended to Governments of all parties. Almost as soon as I moved into the job, we had a Labour Home Secretary come in and suggest that we should deal with particular forms of content. It happened through the coalition years. Indeed, I remember meeting the Minister’s former boss at No. 10 in Davos, of all places, to receive some lobbying about what the UK Government thought should be on or off the platform at that time. In that case it was to do with terrorist content; there was nothing between us in terms of wanting to see that content gone. I recognise that this amendment is about misinformation and disinformation, which is perhaps a more contentious area.
As we have discussed throughout the debate, transparency is good. It keeps everybody on the straight and narrow. I do not see any reason why the Government should not be forthcoming. My experience was that the Government would often want to go to the Daily Telegraph, the Daily Mail or some other upright publication and tell it how they had been leaning on the internet companies—it was part of their communications strategy and they were extremely proud of it—but there will be other circumstances where they are doing it more behind the scenes. Those are the ones we should be worried about.
If those in government have good reason to lean on an internet company, fine—but knowing that they have to be transparent about it, as in this amendment, will instil a certain level of discipline that would be quite healthy.
My Lords, clearly, there is a limited number of speakers in this debate. We should thank the noble Lord, Lord Moylan, for tabling this amendment because it raises a very interesting point about the transparency—or not—of the Counter Disinformation Unit. Of course, it is subject to an Oral Question tomorrow as well, which I am sure the noble Viscount will be answering.
There is some concern about the transparency of the activities of the Counter Disinformation Unit. In its report, Ministry of Truth, which deals at some length with the activities of the Counter Disinformation Unit, Big Brother Watch says:
“Giving officials an unaccountable hotline to flag lawful speech for removal from the digital public square is a worrying threat to free speech”.
Its complaint is not only about oversight; it is about the activities themselves. Others, such as Full Fact, have stressed that there is little or no parliamentary scrutiny. For instance, freedom of information requests have been turned down, and Written Questions which try to probe the activities of the Counter Disinformation Unit have had very little response. As Full Fact says, when the Government
“lobby internet companies about content on their platforms … this is a threat to freedom of expression”.
We need proper oversight, so I am interested to hear the Minister’s response.
My Lords, the Government share the view of my noble friend Lord Moylan about the importance of transparency in protecting freedom of expression. I reassure him and other noble Lords that these principles are central to the Government’s operational response to addressing harmful disinformation and attempts artificially to manipulate our information environment.
My noble friend and others made reference to the operational work of the Counter Disinformation Unit, which is not, as the noble Baroness, Lady Fox, suggested, the responsibility of my department but of the Department for Science, Innovation and Technology. The Government have always been transparent about the work of the unit; for example, they recently published a factsheet on GOV.UK which sets out, among other things, how the unit works with social media companies.
I reassure my noble friend that there are existing processes governing government engagements with external parties and emphasise to him that the regulatory framework that will be introduced by the Bill serves to increase transparency and accountability in a way that I hope reassures him. Many teams across government regularly meet industry representatives on a variety of issues from farming and food to telecoms and digital infrastructure. These meetings are conducted within well-established transparency processes and frameworks, which apply in exactly the same way to government meetings with social media companies. The Government have been open about the fact that the Counter Disinformation Unit meets social media companies. Indeed, it would be surprising if it did not. For example, at the beginning of the Russian invasion of Ukraine, the Government worked with social media companies in relation to narratives which were being circulated attempting to deny incidents leading to mass casualties, and to encourage the promotion of authoritative sources of information. That work constituted routine meetings and was necessary in confirming the Government’s confidence in the preparedness and ability of platforms to respond to new misinformation and disinformation threats.
To require additional reporting on a sector-by-sector or department-by-department basis beyond the standardised transparency processes, as proposed in my noble friend’s amendment, would be a disproportionate and unnecessary response to what is routine engagement in an area where the Government have no greater powers or influence than in others. They cannot compel companies to alter their terms of service; nor can or do they seek to mandate any action on specific pieces of content.
I reassure the noble Baroness, Lady Fox, that the Counter Disinformation Unit does not monitor individual people, nor has it ever done so; rather, it tracks narratives and trends using publicly available information online to protect public health, public safety and national security. It has never tracked the activity of individuals, and there is a blanket ban on referring any content from journalists or parliamentarians to social media platforms. The Government have always been clear that the Counter Disinformation Unit refers content for consideration only where an assessment has been made that it is likely to breach the platform’s own terms of service. It has no role in deciding what action, if any, to take in response, which is entirely a matter for the platform concerned.
As I said, the Bill will introduce new transparency, accountability and freedom of expression duties for category 1 services which will make the process for any removal or restriction of user-generated content more transparent by requiring category 1 services to set terms of service which are clear, easy for users to understand and consistently enforced. Category 1 services will be prohibited from removing or restricting user-generated content or suspending or banning users where this does not align with those terms of service. Any referrals from government will not, and indeed cannot, supersede these duties in the Bill.
Although I know it will disappoint my noble friend that another of his amendments has not been accepted, I hope I have been able to reassure him about the Government’s role in these processes. As the noble Lord, Lord Clement-Jones, noted, my noble friend Lord Camrose is answering a Question on this in your Lordships’ House tomorrow, further underlining the openness and parliamentary accountability with which we go about this work. I hope my noble friend will, in a similarly postprandial mood of generosity, suppress his disappointment and feel able to withdraw his amendment.
Before the Minister sits down, I think that it is entirely appropriate for him to say—I have heard it before—“Oh no, nothing was taken down. None of this is believable. No individuals were targeted”. However, that is not the evidence I have seen, and it might well be that I have been shown misinformation. But that is why the Minister has to acknowledge that one of the problems here is that indicated by Full Fact—which, as we know, is often endorsed by government Ministers as fact-checkers. It says that because the Government are avoiding any scrutiny for this unit, it cannot know. It becomes a “he said, she said” situation. I am afraid that, because of the broader context, it would make the Minister’s life easier, and be clearer to the public—who are, after all, worried about this—if he accepted the ideas in the amendment of the noble Lord, Lord Moylan. We would then be clear and it would be out in the open. If the FOIs and so on that have been constantly put forward were answered, would that not clear it up?
I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.
My Lords, I see what my noble friend did there, and it was very cunning. He gave us a very worthwhile account of the activities of the Counter Disinformation Unit, a body I had not mentioned at all, as if it were the sole locus of this sort of activity. I had not restricted my remarks to it. We know, in fact, that other bodies within government have been involved in undertaking this sort of activity, and on those he has given us no answer at all, because he preferred to answer about one particular unit. He referred also to its standardised transparency processes. I can hardly believe that I am reading out words such as those. The standardised transparency process allows us all to know that encounters take place but still refuses to let us know what actually happens in any particular encounter, even though there is a great public interest in knowing. However, I will not press it any further.
My noble friend, who is genuinely a friend, risks putting himself, at the behest of civil servants and his ministerial colleagues, in some danger. We know what happens in these cases. The Minister stands at the Dispatch Box and says “This has never happened; it never normally happens; it will not happen. Individuals are never spoken of, and actions of this character are never taken”. Then of course, a few weeks or months later, out pour the leaked emails showing that all these things have been happening all the time. The Minister then has to resign in disgrace and it is all very sad. His friends, like myself, rally round and buy him a drink, before we never see him again.
Anyway, I think my noble friend must be very careful that he does not put himself in that position. I think he has come close to doing so this evening, through the assurances he has given your Lordships’ House. Although I do not accept those assurances, I will none the less withdraw the amendment, with the leave of the House.
Amendment 225 withdrawn.
Clause 173: Providers’ judgements about the status of content
Amendment 226 not moved.
Amendment 227
Moved by
227: Clause 173, page 150, line 23, at end insert “or
(c) an assessment required to be carried out by section (Assessment duties: user empowerment),”
Member’s explanatory statement
This amendment ensures that Clause 173, which is about the approach to be taken by providers to judgements about the status of content, applies to assessments under the new Clause proposed after Clause 11 in my name.
Amendment 227 agreed.
Amendment 228
Moved by
228: Clause 173, page 151, leave out lines 1 and 2
Member’s explanatory statement
This amendment removes a requirement on providers which could encourage excessive content removal in borderline cases of illegality.
My Lords, we are coming to some critical amendments on a very important issue relatively late in the Bill, having had relatively little discussion on it. It is not often that committees of this House sit around and say, “We need more lawyers”, but this is one of those areas where that was true.
Notwithstanding the blushes of my noble friend on the Front Bench here, interestingly we have not had in our debate significant input from people who understand the law of freedom of expression and wish to contribute to our discussions on how online platforms should deal with questions of the legality of content. These questions are crucial to the Bill, which, if it does nothing else, tells online platforms that they have to be really robust in taking action against content that is deemed to be illegal under a broad swathe of law in the United Kingdom that criminalises certain forms of speech.
We are heavy with providers, and we are saying to them, “If you fail at this, you’re in big trouble”. The pressure to deal with illegal content will be huge, yet illegality itself covers a broad spectrum, from child sexual exploitation and abuse material, where in many cases it is obvious from the material that it is illegal and there is strict liability—there is never any excuse for distributing that material—and pretty much everyone everywhere in the world would agree that it should be criminalised and removed from the internet, through to things that we discussed in Committee, such as public order offences, where, under some interpretations of Section 5 of the Public Order Act, swearing at somebody or looking at them in a funny way in the street could be deemed alarming and harassing. There are people who interpret public order offences in this very broad sense, where there would be a lot less agreement about whether a specific action is or is not illegal and whether the law is correctly calibrated or being used oppressively. So we have this broad spectrum of illegality.
The question we need to consider is where we want providers to draw the line. They will be making judgments on a daily basis. I said previously that I had to make those judgments in my job. I would write to lawyers and they would send back an expensive piece of paper that said, “This is likely to be illegal”, or, “This is likely not to be illegal”. It never said that it was definitely illegal or definitely not illegal, apart from the content I have described, such as child sexual abuse. You would not need to send that, but you would send the bulk of the issues that we are dealing with to a lawyer. If you sent it to a second lawyer, you would get another “likely” or “not likely”, and you would have to come to some kind of consensus view as to the level of risk you wished to take on that particular form of speech or piece of content.
This is really challenging in areas such as hate speech, where exactly the same language has a completely different meaning in different contexts, and may or may not be illegal. Again, to give a concrete example, we would often deal with anti-Semitic content being shared by anti-anti-Semitic groups—people trying to raise awareness of anti-Semitic speech. Our reviewers would quite commonly remove the speech: they would see it and it would look like grossly violating anti-Semitic speech. Only later would they realise that the person was sharing it for awareness. The N-word is a gross term of racial abuse, but if you are an online platform you permit it a lot of the time, because if people use it self-referentially they expect to be able to use it. If you start removing it they would naturally get very upset. People expect to use it if it is in song lyrics and they are sharing music. I could give thousands of examples of speech that may or may not be illegal depending entirely on the context in which it is being used.
We will be asking platforms to make those judgments on our behalf. They will have to take it seriously, because if they let something through that is illegal they will be in serious trouble. If they misjudged it and thought the anti-Semitic hate speech was being circulated by Jewish groups to promote awareness but it turned out it was being circulated by a Nazi group to attack people and that fell foul of UK law, they would be in trouble. These judgments are critical.
We have the test in Clause 173, which says that platforms should decide whether they have “reasonable grounds to infer” that something is illegal. In Committee, we debated changing that to a higher bar, and said that we wanted a stronger evidential basis. That did not find favour with the Government. We hoped they might raise the bar themselves unilaterally, but they have not. However, we come back again in a different way to try to be helpful, because I do not think that the Government want excessive censorship. They have said throughout the Bill’s passage that they are not looking for platforms to be overly censorious. We looked at the wording again and thought about how we could ensure that the bar is not operated in a way that I do not think that the Government intend. We certainly would not want that to happen.
We look at the current wording in Clause 173 and see that the test there has two elements. One is: “Do you have reasonable grounds to infer?” and then a clause in brackets after that says, “If you do have reasonable grounds to infer, you must treat the content as illegal”. In this amendment we seek to remove the second part of that phrasing because it seems problematic. If we say to the platform, “Reasonable grounds to infer, not certainty”—and it is weird to put “inference”, which is by definition mushy, with “must”, which is very certain, into the same clause—we are saying, “If you have this mushy inference, you must treat it as illegal”, which seems quite problematic. Certainly, if I were working at a platform, the way I would interpret that is: “If in doubt, take it out”. That is the only way you can interpret that “must”, and that is really problematic. Again, I know that that is not the Government’s intention, and if it were child sexual exploitation material, of course you “must”. However, if it is the kind of abusive content that you have reasonable grounds to infer may be an offence under the Public Order Act, “must” you always treat that as illegal? As I read the rest of the Bill, if you are treating it as illegal, the sense is that you should remove it.
That is what we are trying to get at. There is a clear understanding from the Government that their intention is “must” when it comes to that hard end of very bad, very clearly bad content. However, we need something else—a different kind of behaviour when we are dealing with content that is much more marginal. Otherwise, the price we will pay will be in freedom of expression.
People in the United Kingdom publish quite robust, sweary language. I sometimes think that some of the rules we apply penalise the vernacular. People who use sweary, robust language may be doing so entirely legally—the United Kingdom does not generally restrict people from using that kind of language. However, we risk heading towards a scenario where people post such content in future, and they will find that the platform takes it down. They will complain to the platform, saying, “Why the hell did you take my content down?”—in fact, they will probably use stronger words than that to register their complaint. When they do, the platform will say, “We had reasonable grounds to infer that that was in breach of the Public Order Act, for example, because somebody might feel alarmed, harassed or distressed by it. Oh, and look—in this clause, it says we ‘must’ treat it as illegal. Sorry—there is nothing else we can do. We would have loved to have been able to exercise the benefit of the doubt and to allow you to carry on using that kind of language, because we think there is some margin where you have not behaved in an illegal way. But unfortunately, because of the way that Clause 173 has been drafted, our lawyers tell us we cannot afford to take the risk”.
In the amendment we are trying to—I think—help the Government to get out of a situation which, as I say, I do not think they want. However, I fear that the totality of the wording of Clause 173, this low bar for the test and the “must treat as” language, will lead to that outcome where platforms will take the attitude: “Safety first; if in doubt, take it out”, and I do not think that that is the regime we want. I beg to move.
My Lords, I regret I was unable to be present in Committee to deliver my speech about the chilling effect that the present definition of illegality in the Bill will have on free speech on the internet.
I am still concerned about Clause 173, which directs platforms how to come to the judgment on what is illegal. My concern is that the criterion for illegality, “reasonable grounds to infer” that elements of the content are illegal, will encourage the tech companies to take down content which is not necessarily illegal but which they infer could be. Indeed, the noble Lord, Lord Allan, gave us a whole list of examples of where that might happen. Unfortunately, in Committee there was little support for a higher bar when asking the platforms to judge what illegal content is. However, I have added my name to Amendment 228, put forward by the noble Lord, Lord Allan, because, as he has just said, it is a much less radical way of enhancing free speech when platforms are not certain whether to take down content which they infer is illegal.
The deletion of part of Clause 173(5) is a moderate proposal. It still leaves intact the definition for the platforms of how they are to make the judgment on the illegality of content, but it takes out the compulsory element in this judgment. I believe that it will have the biggest impact on moderation systems. Some of those systems are run by machines, but many moderation processes, such as those at Meta’s Facebook, involve thousands of human beings. The deletion of the second part of Clause 173(5), which demands that they take down content that they infer is illegal, will give them more leeway to err on the side of freedom of speech. I hope that this extra leeway to encourage free speech will also be included in the way that algorithms moderate our content.
Earlier in the Bill, Clause 18 lays out, for all services, the importance of protecting users’ rights to freedom of expression, and there are various duties of assessment for large companies. However, there is not enough in the Bill which builds freedom of expression into the moderation capacity of the platforms. Alan Rusbridger, a member of the Facebook Oversight Board, gave evidence to the Communications Select Committee inquiry into freedom of expression online. He said:
“I believe that freedom of speech is a hugely important right … In most judgments, I begin by thinking, ‘Why would we restrict freedom of speech in this particular case?’”.
Evidence was also given that many moderators do not have a background in freedom of expression and are not completely conversant with the Article 10 rights. The amendment will allow moderators to think more about their role in erring on the side of freedom of expression when deciding on the illegality of content.
There has been much discussion, both in Committee and on Report, on protecting freedom of expression, but not much movement by the Government. I hope that the Minister will use this small amendment to push for draft codes of practice which allow the platforms, when they are not sure of the illegality of content, to use their discretion and consider freedom of expression.
My Lords, it is all quite exciting now, is it not? I can say “hear, hear!” a lot; everyone is talking about freedom of expression. I cannot tell noble Lords how relieved and pleased I was both to hear the speeches and to see Amendment 228 from the noble Lord, Lord Allan of Hallam, and the noble Viscount, Lord Colville of Culross, who both explained well why this is so important. I am so glad that, even late in our discussions on Report, it has returned as an important issue.
We have already discussed how in many cases, especially when it comes to what is seen as illegal speech, decisions about illegality are very complicated. They are complicated in the law courts and offline, and that is when they have the full power of lawyers, the criminal justice system and so on trying to make decisions. Leaving it up to people who, through no fault of their own, are not qualified but who work in a social media company to try to make that decision in a climate of quite onerous obligations—and having phrases such as “reasonable grounds to infer”—will lead to lawful expression being overmoderated. Ultimately, online platforms will use an abundance of caution, which will lead to a lot of important speech—perfectly lawful if not worthy speech; the public’s speech and the ability to speak freely—being removed. That is not a trivial side issue; it will discredit the Bill, if it has not done so already.
Whenever noble Lords make contributions about why a wide range of amendments and changes are needed—particularly in relation to protecting children, harm and so on—they constantly tell us that the Bill should send an uncompromising message. The difficulty I have is with the danger that the Bill will send an uncompromising message that freedom of expression is not important. I urge the Minister to look carefully at the amendment, because the message should be that, while the Bill is trying to tackle online harm and to protect children in particular—which I have never argued against—huge swathes of it might inadvertently silence people and deprive them of the right to information that they should be able to have.
My Amendment 229—I am not sure why it is in this group, but that is nothing new in the way that the groupings have worked—is about lawful speech and about what content is filtered by users. I have already argued for the replacement of the old legal but harmful duty, but the new duty of user empowerment is welcome, and at face value it puts users in the driving seat and allows adults to judge for themselves what they want and do not want to see. But—and it is a large but—that will work only if users and providers agree about when content should be filtered and what content is filtered.
As with all decisions on speech, as I have just mentioned, particularly in the context of a heightened climate of confusion and sensitivity regarding identity politics and the cancel-culture issues that we are all familiar with, there are some problems with the way that things stand in the Bill. I hope I am using the term “reasonable grounds to infer” in a better way than it is used in relation to illegality. My amendment specifies that companies need to have reasonable grounds to infer that content is abusive or incites hatred before filtering out content through those user empowerment tools. Where a user chooses to filter out hateful content based on race, on being a woman or whatever, the filter should catch only content that genuinely falls under those headings. There is a risk that, without this amendment, technologies or individuals working for companies could operate in a heavy-handed way, filtering out legitimate content.
I shall give a couple of examples. Say that someone chooses to filter out abusive content targeting the protected characteristic of race. I imagine that they would have a reasonable expectation that the filter would target aggressive, unpleasant content demeaning to a person because of their race, but does the provider agree with that? Will it interpret my filtering choice as a user in the most restrictive way possible, in a bid to protect my safety, or see my sensibilities as having a low threshold for what it might consider to be abuse?
The race issue illustrates where we get into difficulties. Will the filterers take their cue from the document compiled by the Diocese of St Edmundsbury and Ipswich, which the anti-racist campaigning group Don’t Divide Us has just released, and which is being used in 87 schools? Under the heading of racism, we have ideas like “passive racism”, which includes agreeing that
“There are two sides to every story”,
or denying white privilege or starting a sentence by saying, “Not all white people”. “Veiled racism” in this document—which, as I say, is being used in schools for that purpose by the Church of England—includes a “Euro-centric curriculum” or “cultural appropriation”. “Racist discrimination” includes “anti-immigration policies”, which, as I pointed out before, would indicate that some people would call the Government’s own Bill tonight racist.
The reason why I mention that is that you might think, “I am going to have racism filtered out”, but if there is too much caution then you will have filtered out very legitimate discussions on immigration and cultural appropriation. You will be protected but, if the filterer follows certain universities that have deemed the novels of Walter Scott, the plays of William Shakespeare or Enid Blyton’s writing racist, you can see that we have some real problems. Universities have said that there is misogynistic bullying and sexual abuse in “The Great Gatsby” and Ovid’s “Metamorphoses”; I just want to make sure that we do not end up in a situation where there is oversensitivity by the filterers. Perhaps the filtering will take place by algorithm, machine learning and artificial intelligence, but the EHRC has noted that algorithms just cannot cope with the context, cultural difference and complexity of language within the billions of items of content produced every day.
Amendment 229 ensures that there is a common standard—a standard of objective reasonableness. It is not perfect at all; I understand that reasonableness itself is open to interpretation. However, it is an attempt to ensure that the Government’s concept of user empowerment is feasible by at least aspiring to a basic shared understanding between users and providers as to what will be filtered and what will not, and a check against providers’ filter mechanisms removing controversial or unpopular content in the name of protecting users. Just as I indicated in terms of sending a message, if the Government could indicate to the companies that they had to bear in mind freedom of expression and not be oversensitive, risk-averse or overcautious, we might begin to get some balance. Otherwise, an awful lot of lawful material will be removed that is not even harmful.
My Lords, I support Amendment 228. I spoke on this issue to the longer amendment in Committee. To decide at high volume and high speed whether something is illegal, without the entire apparatus of the justice system, in which a great deal of care is taken to decide whether something is illegal, is very worrying. It strikes me as amusing because someone commented earlier that they like a “must” instead of a “maybe”. In this case, I caution against the requirement that a provider “must treat the content as content of the kind in question accordingly”; something a little softer is needed, not a cliff edge that ends up in horrors around illegality, where someone who has acted in self-defence is accused of a crime of violence, as happens to many women, and so on and so forth. I do not want to labour the point. I just urge a gentle landing rather than, as it is written, a cliff edge.
My Lords, this has been a very interesting debate. Beyond peradventure, my noble friend Lord Allan, the noble Viscount, Lord Colville, and the noble Baroness, Lady Fox, have demonstrated powerfully the perils of this clause. “Lawyers’ caution” is one of my noble friend’s messages to take away, as are the complexities of making these judgments. It was interesting when he mentioned the sharing of certain forms of content for awareness’s sake and the judgments that must be made by platforms. His phrase “If in doubt, take it out” is pretty chilling in free speech terms—I think that will come back to haunt us. As the noble Baroness, Lady Fox, said, the wrong message is being delivered by this clause. It is important to have some element of discretion here and not, as the noble Baroness, Lady Kidron, said, a cliff edge. We need a gentler landing. I very much hope that the Minister will land more gently.
My Lords, this has been a good debate. It is very hard to see where one would want to take it. If it proves anything, it is that the decision to drop the legal but harmful provisions in the Bill was probably taken for the wrong reasons but was the right decision, since this is where we end up—in an impossible moral quandary which no amount of writing, legalistic or otherwise, will get us out of. This should be a systems Bill, not a content Bill.
My Lords, I start by saying that accurate systems and processes for content moderation are crucial to the workability of this Bill and keeping users safe from harm. Amendment 228 from the noble Lord, Lord Allan of Hallam, seeks to remove the requirement for platforms to treat content as illegal or fraudulent content if reasonable grounds for that inference exist. The noble Lord set out his concerns about platforms over-removing content when assessing illegality.
Under Clause 173(5), platforms will need to have reasonable grounds to determine whether content is illegal or a fraudulent advertisement. Only when a provider has reasonable grounds to infer that said content is illegal or a fraudulent advertisement must it then comply with the relevant requirements set out in the Bill. This would mean removing the content or preventing people from encountering it through risk-based and proportionate systems and processes.
Clause 173(6) further clarifies what “reasonable grounds to infer” means in relation to judgments about illegal content and fraudulent adverts. It sets out the tests that a provider must apply to the assessment of whether all the elements of an offence—including the mental elements—are present, and whether a defence might be relied on.
The noble Lord’s amendment removes this standard for judging the illegality of content but does not replace it with another standard. That would mean that the Bill provided less detail about when providers are required to treat content as illegal or as a fraudulent advert. The result would be that the Bill did not set out a consistent approach to identifying and removing such content, enabling providers to interpret their duties in a broad range of ways while still complying with the framework. This could result in services both over-removing and under-removing content.
I know that the noble Lord is concerned that this provision could encourage overzealous removal of content, but the Government are clear that the approach that I have just outlined provides the necessary safeguards against platforms over-removing content when complying with their duties under the Bill. The noble Lord asked for a different standard to be associated with different types of criminal offence. That is, in effect, what we have done through the distinction that we have made between priority and non-priority offences.
To assist services further, Ofcom will be required to provide guidance on how it judges the illegality of content. In addition, the Government consider that it would not be right to weaken the test for illegal content by diluting the content moderation provisions in the way that this amendment would. Content moderation is critical to protecting users from illegal content and fraudulent advertisements.
The noble Viscount, Lord Colville, set out the importance of freedom of expression, as other noble Lords—principally the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, but others too—have throughout our scrutiny of the Bill. Our approach regarding freedom of expression recognises that the European Convention on Human Rights imposes obligations in relation to this on states, not private entities. As we have discussed previously, private actors, including service providers in scope, have their own freedom of expression rights. This means that platforms are free to decide what content should be allowed on their sites within the bounds of the law. As such, it is more appropriate to ask them to have particular regard to these concepts rather than to be compliant or consistent with them.
In-scope companies will have to consider and implement safeguards for freedom of expression when fulfilling their duties. For example, platforms could safeguard freedom of expression by ensuring that human moderators are adequately trained to assess contextual and linguistic nuance—such as the examples that the noble Lord gave—to prevent the over-removal of content. The larger services will also have additional duties to assess their impact on freedom of expression and privacy when adopting safety policies, to keep this assessment up to date and to demonstrate that they have taken positive steps in relation to the impact assessment.
Further, platforms will not be penalised for making the wrong calls on pieces of illegal content. Ofcom will instead make its judgments on the systems and processes that platforms have in place when making these decisions. The focus on transparency through the Bill’s framework and on user reporting and redress mechanisms will enable users to appeal the removal of content more effectively than they can at present.
Amendment 229 in the name of the noble Baroness, Lady Fox, would require providers of category 1 services to apply the user empowerment features required under Clause 12 only to content that they have “reasonable grounds to infer” is user empowerment content. The Bill’s cross-cutting freedom of expression duties already prevent providers from overapplying user empowerment features or adopting an inconsistent or capricious approach; Ofcom can take enforcement action if they do this. Clause 173(2) and (3) already specify how providers must make judgments about the status of content, including judgments about whether content is in scope of the user empowerment duties. That includes making this judgment based on
“all relevant information that is reasonably available to a provider”.
It is unclear whether the intention of the noble Baroness’s amendment is to go further. If so, it would be inappropriate to apply the “reasonable grounds to infer” test in Clause 173(5) and (6) to user empowerment content. This is because, as I have just outlined in relation to the amendment in the name of the noble Lord, Lord Allan, the test sets out the approach that providers must take when assessing whether content amounts to a criminal offence. The test cannot sensibly be applied to content covered by the user empowerment duties because such content is not illegal. It is not workable to suggest that providers need to apply criminal law concepts such as intent or defences to non-criminal material. Under Clause 48, Ofcom will be required to produce and publish guidance that sets out examples of the kinds of content that Ofcom considers to be relevant to the user empowerment duties. This will assist providers in determining what content is of relevance to the user empowerment duties.
I hope that this allays the concerns raised by the noble Baroness and the noble Lord, and that the noble Lord will be content to withdraw his amendment.
My Lords, I remain concerned that people who use more choice words of Anglo-Saxon origin will find their speech more restricted than those who use more Latinate words, such as “inference” and “reasonable”, but the Minister has given some important clarifications.
The first is that no single decision could result in a problem for a platform, so it will know that it is about a pattern of bad decision-making rather than a single decision; that will be helpful in taking a bit of the pressure off. The Minister also gave an important clarification around—I hate this language, but we have to say it—priority versus primary priority. If everything is a priority, nothing is a priority but, in this Bill, some things are more of a priority than others. The public order offences are priority offences; therefore, platforms have a little bit more leeway over those offences than they do over primary priority offences, which include the really bad stuff that we all agree we want to get rid of.
As I say, I do not think that we are going to get much further in our debates today although those were important clarifications. The Minister is trying to give us reasonable grounds to infer that the guidance from Ofcom will result in a gentle landing rather than a cliff edge, which the noble Baroness, Lady Kidron, rightly suggested is what we want. With that, I beg leave to withdraw the amendment.
Amendment 228 withdrawn.
Amendment 229 not moved.
Amendment 230
Moved by
230: After Clause 174, insert the following new Clause—
“Time for publishing first guidance under certain provisions of this Act
(1) OFCOM must publish guidance to which this section applies within the period of 18 months beginning with the day on which this Act is passed.
(2) This section applies to—
(a) the first guidance under section 47(2)(a) (record-keeping and review);
(b) the first guidance under section 47(2)(b) (children’s access assessments);
(c) the first guidance under section 48(1) (content harmful to children);
(d) the first guidance under section 73 (provider pornographic content);
(e) the first guidance under section 90(1) (illegal content risk assessments under section 8);
(f) the first guidance under section 90(2) (illegal content risk assessments under section 22);
(g) the first guidance under section 90(3) (children’s risk assessments);
(h) the first guidance under section 140 (enforcement);
(i) the first guidance under section 174 relating to illegal content judgements within the meaning of subsection (2)(a) of that section (illegal content and fraudulent advertisements).
(3) If OFCOM consider that it is necessary to extend the period mentioned in subsection (1) in relation to guidance mentioned in any of paragraphs (a) to (i) of subsection (2), OFCOM may extend the period in relation to that guidance by up to 12 months by making and publishing a statement. But this is subject to subsection (6).
(4) A statement under subsection (3) must set out—
(a) the reasons why OFCOM consider that it is necessary to extend the period mentioned in subsection (1) in relation to the guidance concerned, and
(b) the period of extension.
(5) A statement under subsection (3) may be published at the same time as (or incorporate) a statement under section 38(12) (extension of time to prepare certain codes of practice).
(6) But a statement under subsection (3) may not be made in relation to guidance mentioned in a particular paragraph of subsection (2) if—
(a) a statement has previously been made under subsection (3) (whether in relation to guidance mentioned in the same or a different paragraph of subsection (2)), or
(b) a statement has previously been made under section 38(12).”
Member’s explanatory statement
This amendment provides that OFCOM must prepare the first guidance under certain provisions of the Bill within 18 months of Royal Assent, unless they consider a longer period to be necessary in which case OFCOM may (on one occasion only) extend the period and set out why in a published statement.
Amendment 230 agreed.
Clause 176: Individuals providing regulated services: liability
Amendment 231
Moved by
231: Clause 176, page 152, line 33, at end insert—
“(ga) Chapter 3A of Part 4 (deceased child users);”Member’s explanatory statement
Clause 176 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name, so that individuals may be jointly and severally liable for the duties imposed by that clause.
Amendment 231 agreed.
Clause 179: Information offences: supplementary
Amendments 231A and 231B
Moved by
231A: Clause 179, page 154, line 8, leave out “is” and insert “has been”
Member’s explanatory statement
This amendment is a minor change to ensure consistency of tenses.
231B: Clause 179, page 154, line 11, leave out “is” and insert “has been”
Member’s explanatory statement
This amendment is a minor change to ensure consistency of tenses.
Amendments 231A and 231B agreed.
Schedule 17: Video-sharing platform services: transitional provision etc
Amendments 232 to 236
Moved by
232: Schedule 17, page 247, line 35, at end insert—
“(ba) section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)), and”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty in the new clause proposed after Clause 11 in my name to carry out assessments for the purposes of the user empowerment duties in Clause 12(2).
233: Schedule 17, page 247, line 36, leave out “and (9) (records of risk assessments)” and insert “, (8A) and (9) (records of assessments)”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty inserted in Clause 19 (see the amendments of that Clause proposed in my name) to keep records of the new assessments.
234: Schedule 17, page 248, line 20, at end insert—
“(ea) the duties set out in section (Disclosure of information about use of service by deceased child users) (deceased child users);”
Member’s explanatory statement
This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by the clause proposed after Clause 67 in my name during the transitional period.
235: Schedule 17, page 250, line 12, leave out “risk assessments and children’s access” and insert “certain”
Member’s explanatory statement
This amendment makes a technical drafting change related to the new Clause proposed after Clause 11 in my name.
236: Schedule 17, page 250, line 15, leave out “risk assessments and children’s access” and insert “certain”
Member’s explanatory statement
This amendment makes a technical drafting change related to the new Clause proposed after Clause 11 in my name.
Amendments 232 to 236 agreed.
Amendment 236A
Moved by
236A: After Clause 194, insert the following new Clause—
“Power to regulate app stores
(1) Subject to the following provisions of this section and section (Power to regulate app stores: supplementary), the Secretary of State may by regulations amend any provision of this Act to make provision for or in connection with the regulation of internet services that are app stores.
(2) Regulations under this section may not be made before OFCOM have published a report under section (OFCOM’s report about use of app stores by children)(report about use of app stores by children).
(3) Regulations under this section may be made only if the Secretary of State, having considered that report, considers that there is a material risk of significant harm to an appreciable number of children presented by either of the following, or by both taken together—
(a) harmful content present on app stores, or
(b) harmful content encountered by means of regulated apps available in app stores.
(4) Before making regulations under this section the Secretary of State must consult—
(a) persons who appear to the Secretary of State to represent providers of app stores,
(b) persons who appear to the Secretary of State to represent the interests of children (generally or with particular reference to online safety matters),
(c) OFCOM,
(d) the Information Commissioner,
(e) the Children’s Commissioner, and
(f) such other persons as the Secretary of State considers appropriate.
(5) In this section and in section (Power to regulate app stores: supplementary)—
“amend” includes repeal and apply (with or without modifications);
“app” includes an app for use on any kind of device, and “app store” is to be read accordingly;
“content that is harmful to children” has the same meaning as in Part 3 (see section 54);
“harmful content” means—
(a) content that is harmful to children,
(b) search content that is harmful to children, and
(c) regulated provider pornographic content;
“regulated app” means an app for a regulated service;
“regulated provider pornographic content” has the same meaning as in Part 5 (see section 70);
“search content” has the same meaning as in Part 3 (see section 51).
(6) In this section and in section (Power to regulate app stores: supplementary) references to children are to children in the United Kingdom.”
Member’s explanatory statement
This amendment provides that the Secretary of State may make regulations amending this Bill so as to bring app stores within its scope. The regulations may not be made until OFCOM have published their report about the use of app stores by children (see the new Clause proposed to be inserted after Clause 147 in my name).
My Lords, we have had some productive discussions on application stores, commonly known as “app stores”, and their role as a gateway for children accessing online services. I am grateful in particular to my noble friend Lady Harding of Winscombe for her detailed scrutiny of this area and the collaborative approach she has taken in relation to it and to her amendments, to which I will turn in a moment. These share the same goals as the amendments tabled in my name in seeking to add evidence-based duties on app stores to protect children.
The amendments in my name will do two things. First, they will establish an evidence base on the use of app stores by children and the role that app stores play in children encountering harmful content online. Secondly, following consideration of this evidence base, the amendments also confer a power on the Secretary of State to bring app stores into scope of the Bill should there be a material risk of significant harm to children on or through them.
On the evidence base, Amendment 272A places a duty on Ofcom to publish a report on the role of app stores in children accessing harmful content on the applications of regulated services. To help build a greater evidence base about the types of harm available on and through different kinds of app stores, the report will consider a broad range of these stores, which could include those available on various devices, such as smartphones, gaming devices and smart televisions. The report will also assess the use and effectiveness of age assurance on app stores and consider whether the greater use of age assurance or other measures could protect children further.
Publication of the report must be two to three years after the child safety duties come into force so as not to interfere with the Bill’s implementation timelines. This timing will also enable the report to take into account the impact of the regulatory framework that the Bill establishes.
Amendment 274A is a consequential amendment to include this report in the Bill’s broader confidentiality provisions, meaning that Ofcom will need to exclude confidential matters—for example, commercially sensitive information—from the report’s publication.
Government Amendments 236A, 236B and 237D provide the Secretary of State with a delegated power to bring app stores into the scope of regulation following consideration of Ofcom’s report. The power will allow the Secretary of State to make regulations putting duties on app stores to reduce the risks of harm presented to children from harmful content on or via app stores. The specific requirements in these regulations will be informed by the outcome of the Ofcom report I have mentioned.
As well as setting out the rules for app stores, the regulations may also make provisions regarding the duties and functions of Ofcom in regulating app stores. This may include information-gathering and enforcement powers, as well as any obligations to produce guidance or codes of practice for app store providers.
By making these amendments, our intention is to build a robust evidence base on the potential risks of app stores for children without affecting the Bill’s implementation more broadly. Should it be found that duties are required, the Secretary of State will have the ability to make robust and comprehensive duties, which will provide further layers of protection for children. I beg to move.
My Lords, before speaking to my Amendment 239A, I thank my noble friend the Minister, the Secretary of State and the teams in both the department and Ofcom for their collaborative approach in working to bring forward this group of amendments. I also thank my cosignatories. My noble friend Lady Stowell cannot be in her place tonight but she has been hugely helpful in guiding me through the procedure, as have been the noble Lords, Lord Stevenson, Lord Clement-Jones and Lord Knight, not to mention the noble Baroness, Lady Kidron. It has been a proper cross-House team effort. Even the noble Lord, Lord Allan, who started out quite sceptical, has been extremely helpful in shaping the discussion.
I also thank the NSPCC and Barnardo’s for their invaluable advice and support, as well as Snap and Match—two companies which have been willing to stick their heads above the parapet and challenge suppliers and providers on which they are completely dependent in the shape of the current app store owners, Apple and Google.
I reassure my noble friend the Minister—and everyone else—that I have no intention of dividing the House on my amendment, in case noble Lords were worried. I am simply seeking some reassurance on a number of points where my amendments differ from those tabled by the Government—but, first, I will highlight the similarities.
As my noble friend the Minister has said, I am delighted that we have two packages of amendments which both recognise that this was a really significant gap in the Bill as drafted. Ignoring the elements of the ecosystem that sell access to regulated services, decide age guidelines and have the ability to do age assurance was a substantial gap in the framing of the Bill. But we have also recognised together that it is very important that this is an “and”, not an “or”—it is not instead of regulating user-to-user services or search but in addition to them. It is an additional layer that we can bring to protect children online, and it is very important that we recognise that, as both packages do.
Finally, considerable work needs to be done to do this properly—to research properly how we should regulate app stores and other app store-like services—and I am pleased to say that both groups of amendments do the full package: they instruct Ofcom to do the research and also give the Secretary of State the powers to enact any recommendations that come from that research.
However, there are four differences that I will briefly tease out. I live in hope, but I suspect that we will continue to disagree on two; on the other two, I hope my noble friend can give us some reassurance that we are in fact aligned. I shall take the two on which I fear we will disagree first. First, on timing, the Government’s amendments require Ofcom to conduct the work two to three years after Sections 11 and 25 come into force. I suspect that means we are talking about four to five years away, which is a very long time in the history of the digital world; whereas my amendments would require it 12 months after the first element of Sections 11 and 25 comes into force, so probably about two years away. We are looking for an air gap between the implementation of user-to-user regulation, search regulation and app store regulation, but I would argue that the government amendments leave too big an air gap. I ask my noble friend the Minister to consider whether it is possible to reduce the length of time to a digital speed rather than an analogue one.
On my second point, I am not so hopeful—but I am certain that we will come back to it on Wednesday. The Government’s amendments require Ofcom to consider whether app stores cover harmful content. Once again, we have an amendment that focuses on content rather than on functionality, systems and processes, and the non-content harms that a number of us are most worried about. It is a real shame that, in this new amendment, the Government have chosen to use the very language that is causing us so much concern in other parts of the Bill; whereas my amendments ask Ofcom to look at the objectives of Part 3 of the Bill, which I think is a much neater way of doing it. I genuinely believe that the Government want to capture the non-content harms, and I ask the Minister to consider whether it is possible to tidy up their amendment at Third Reading.
In two areas, I hope that the Minister can give me reassurance. First, on transparency, the government amendments are clear that Ofcom needs to publish the output of its research—its report to the Secretary of State—but they are not clear that the Secretary of State needs to publish if they choose not to implement Ofcom’s recommendations. Possibly I am just a novice in parliamentary procedure and they simply would not be able to get away with that; no doubt one of us would remember, at the right moment, to ask the right question, and it would come up in a ballot at the right time. But can my noble friend the Minister assure me that the Secretary of State will in fact publish, even if they choose not to implement any further regulation of app stores?
Finally, on scope, my amendments talk about app stores and “other access means”, to make sure that this is future-proofed. Today 99% of all user-to-user services are reached through the Apple App Store and Google Play. We all hope that that will change and that there will be competition in the stores that we use and the means and mechanisms that we use; hence the group of amendments that I have tabled refer to app stores and “other access means”.
The government amendments do not really define what an app store is, and I notice that my noble friend the Minister did not either. Can he assure us that the wording in the government amendments is future-proof—that it is not vulnerable to app stores rebranding as something else, or to replatforming among these enormous tech companies so that we access services through some other form of technology—and that such services would still be caught by the Ofcom review?
These are my four questions. Fundamentally, I am extremely grateful for the collaborative way in which the Government and all of us in this House have developed this. We have been able to move this forward constructively, and I am very pleased and grateful.
My Lords, I pay tribute to the noble Baroness, Lady Harding, for her role in bringing this issue forward. I too welcome the government amendments. It is important to underline that adding the potential role of app stores to the Bill is neither an opportunity for other companies to fail to comply and wait for the gatekeepers to do the job nor a one-stop shop in itself. It is worth reminding ourselves that digital journeys rarely start and finish in one place. In spite of the incredible war for our attention, in which products and services attempt to keep us rapt on a single platform, it is quite important for everyone in the ecosystem to play their part.
I have two minor points. First, I was not entirely sure why the government amendment requires the Secretary of State to consult rather than Ofcom. Can the Minister reassure me that, whoever undertakes the consultation, it will include children and children’s organisations as well as tech companies? Secondly, like the noble Baroness, Lady Harding, I was a little surprised that the amendment does not define an app store but relies on “the ordinary meaning of” the term. That seems to leave open the possibility of change. If there is a good reason for that—I am sure there is—then it must be made clear that app stores cannot suddenly rebrand as something else and that the gatekeeper function will be kept absolutely front and centre.
Notwithstanding those comments, and associating myself with the idea that nothing should wait until 2025-26, I am very grateful to the Government for bringing this forward.
My Lords, I will make a brief contribution because I was the misery guts when this was proposed first time round. I congratulate the noble Baroness, Lady Harding, not just on working with colleagues to come up with a really good solution but on seeking me out. If I heard someone be as miserable as I was, I might try to avoid them. She did not; she came and asked me, “Why are you miserable? What is the problem here?”, and took steps to address it. Through her work with the Government, their amendments address my main concerns.
My first concern, as we discussed in Committee, was that we would be asking large companies to regulate their competitors, because the app stores are run by large tech companies. She certainly understood that concern. The second was that I felt we had not necessarily yet clearly defined the problem. There are lots of problems. Before you can come up with a solution, you need a real consensus on what problem you are trying to address. The government amendment will very much help in saying, “Let’s get really crunchy about the actual problem that we need app stores to address”.
Finally, I am a glass-half-full kind of guy as well as a misery guts—there is a contradiction there—so I genuinely think that these large tech businesses will start to change their behaviour and address some of these concerns, such as getting age ratings correct, just by virtue of our having this regulatory framework in place. Even if the app stores are technically outside the regime today, the fact that the sector is inside it, and that this amendment puts them on notice, will, I think and hope, have a hugely positive effect, and we will get the benefits much more quickly than the timescale envisaged in the Bill. That feels like a true backstop. I sincerely hope that the people in those companies, who I am sure will be glued to our debate, will be thinking that they need to get their act together much more quickly. It is better for them to do it themselves than to wait for someone to do it to them.
My Lords, I add my congratulations to the noble Baroness, Lady Harding, on her tenacity, and to the Minister on his flexibility. I believe that where we have reached is pretty much the right balance. There are the questions that the noble Baroness, Lady Harding, and others have asked of the Minister, and I hope he will answer those, but this is a game-changer, quite frankly. Rightly, the noble Baroness has paid tribute to the companies which have put their head above the parapet. That was not that easy for them to do when you consider that those are the platforms they have to depend on for their services to reach the public.
Alongside the research report, there are reserved powers that the Secretary of State can use if the report is positive, which I hope it will be. I believe this could be a turning point. The Digital Markets, Competition and Consumers Bill is coming down the track this autumn, and that is going to give greater powers to make sure that the app stores can be tackled—after all, there are only two of them and they are an oligopoly. They are the essence of big tech, and they need to function in a much more competitive way.
The noble Baroness talked about timing, and it needs to be digital timing, not analogue. Four years does seem a heck of a long time. I hope the Minister will address that.
Then there is the really important aspect of harmful content. In the last group, the Minister reassured us about systems and processes and the illegality threshold. Throughout, he has tried to reassure us that this is all about systems and processes and not so much about content. However, every time we look, we see that content is there almost by default, unless the subject is raised. We do not yet have a Bill that is actually fit for purpose in that sense. I hope the Minister will use his summer break wisely and read through the Bill to make sure that it meets its purpose, and then come back at Third Reading with a whole bunch of amendments that add functionalities. How about that for a suggestion? It is said in the spirit of good will and summer friendship.
The noble Baroness raised a point about transparency when it comes to Ofcom publishing its review. I hope the Minister can give that assurance as well.
The noble Baroness, Lady Kidron, asked about the definition of app store. That is the gatekeeper function, and we need to be sure that that is what we are talking about.
I end by congratulating once again the noble Baroness and the Minister on where we have got to so far.
My Lords, I will start with the final point of the noble Lord, Lord Clement-Jones. I remind him that, beyond the world of the smartphone, there is a small company called Microsoft that also has a store for software—it is not just Google and Apple.
Principally, I say well done to the noble Baroness, Lady Harding, for deploying all of her “winsome” qualities to corral those of us who have been behind her on this and then to persuade the Minister of the merits of her arguments. She also managed to persuade the noble Lord, Lord Allan of Misery Guts, that this was a good idea. The sequence of research, report and then regulation is a good one and, as the noble Lord, Lord Clement-Jones, reminded us, it is being deployed elsewhere in the Bill. I agree with the noble Baroness about the timing: I much prefer two years to four. I hope that Ofcom would at least have the power to accelerate this if it wanted to do so.
I was reminded of the importance of this in an article I read in the Guardian last week, headed:
“More than 850 people referred to clinic for video game addicts”.
This was in reference to the NHS-funded clinic, the National Centre for Gaming Disorders. A third of gamers receiving treatment there were spending money on loot boxes in games such as “Fortnite”, “FIFA”, “Minecraft”, “Call of Duty” and “Roblox”—all games routinely accessed by children. Over a quarter of those being treated by the centre were children.
The article reported that Apple’s and Google’s app stores are
“increasingly offering games with gambling-style mechanics”—
systems addictive by design rather than safe by design. Leon Xiao, a loot box expert and PhD fellow at the IT University of Copenhagen, reports that
“there were informal standards for these games—such as a requirement to publish information about the probability of winning on a slot machine spin—but even these were not being adhered to”.
He is quoted as saying:
“Apple says if you want to upload your game to the Apple Store, you need to make disclosures about the probability of randomised features”.
He continued:
“We checked in 2021 and a third of companies were not doing it. Existing regulation is not being enforced”.
I gather that the Minister’s department has a working group to examine loot boxes. An update on that now, or in writing if he would prefer, would be helpful. The main point of raising this is apparent: app stores are an important pinch point in the digital user journey. We need to ensure that Ofcom has a proper look at whether including them helps it deliver the aims of the Bill. We should include the powers for it to be able to do that, in addition to the other safeguards that we are putting in the Bill to protect children. We strongly support these amendments.
My Lords, I am very grateful for the strength of support and echo the tributes that have been paid to my noble friend Lady Harding—the winsome Baroness from Winscombe —for raising this issue and working with us so collaboratively on it. I am particularly glad that we were able to bring these amendments on Report; as she knows, it involved some speedy work by the Bill team and some speedy drafting by the Office of the Parliamentary Counsel, but I am glad that we were able to do it on Report, so that I can take it off my list of things to do over the summer, which was kindly written for me by the noble Lord, Lord Clement-Jones.
My noble friend’s amendments were laid before the Government’s, so she rightly asked a couple of questions on where they slightly differ. Her amendment seeks to ensure that other websites or online marketplaces that allow users to download apps are also caught by these duties. I reassure her that the Government’s amendments would capture these types of services. We have intentionally not provided detail about what constitutes an app store to ensure that the Bill remains future-proof. I will say a bit more about that in a moment. Regulations made by the Secretary of State under this power will be able to specify thresholds for which app stores are in scope, giving clarity to providers and users about the application of the duties.
On questions of definition, we are intentionally choosing not to define app stores in these amendments. The term is generally understood as meaning a service that makes applications available, which means that the Secretary of State will be able to impose duties on any such service. Any platform that enables apps to be downloaded can therefore be considered an app store for the purpose of this duty, regardless of whether or not it calls itself one. Regulations will clearly set out which providers are in scope of the duties. The ability to set threshold conditions will also ensure that any duties capture only those that pose the greatest risk of children accessing harmful content.
We touched on the long-running debate about content and functionality. We have made our position on that clear: functionality will be caught by references to content. I am conscious that we will return to this on Wednesday, when we will have a chance to debate it further.
On timing, as I said, I am glad that we were able to bring these amendments forward at this stage. The publication date for Ofcom’s report is set so that Ofcom can prioritise the implementation of the child safety duties and put in place the Bill’s vital protections for children before turning to its research on app stores.
That timing also allows the Secretary of State to base his or her decision on commencement on the effectiveness of the existing framework and to use the research of Ofcom’s report to set out a more granular approach to issues such as risk assessment and safety duties. It is necessary to await the findings of Ofcom’s report before those duties are commenced.
On the questions posed by the noble Baroness, Lady Kidron, and others about the consultation for Ofcom’s report, we expect Ofcom to consult widely and with all relevant parties when producing it. We do not believe that there is a need for a specific list of consultees, given Ofcom’s experience and expertise in this area, as well as the great experience it will gain through its existing enforcement and wider consultation requirements. In addition, before making regulations, the Secretary of State will be required to consult a range of key parties, such as the Children’s Commissioner and the Information Commissioner, those who represent the interests of children, and providers of app stores. That can include children themselves.
On the questions asked by the noble Lord, Lord Knight, on loot boxes, he is right that this piece of work is being led by my department. We want to see the games industry take the lead in strengthening protections for children and adults to mitigate the risk of harms. We are pursuing that through a DCMS-led technical working group, and we will publish an update on progress in the coming months. I again express my gratitude to my noble friend Lady Harding and other noble Lords who have expressed their support.
Amendment 236A agreed.
Amendment 236B
Moved by
236B: After Clause 194, insert the following new Clause—
“Power to regulate app stores: supplementary
(1) In this section (except in subsection (4)(c)) “regulations” means regulations under section (Power to regulate app stores)(1).
(2) Provision may be made by regulations only for or in connection with the purposes of minimising or mitigating the risks of harm to children presented by harmful content as mentioned in section (Power to regulate app stores)(3)(a) and (b).
(3) Regulations may not have the effect that any body other than OFCOM is the regulator in relation to app stores.
(4) Regulations may—
(a) make provision exempting specified descriptions of app stores from regulation under this Act;
(b) make provision amending Part 2, section 49 or Schedule 1 in connection with provision mentioned in paragraph (a);
(c) make provision corresponding or similar to provision which may be made by regulations under paragraph 1 of Schedule 11 (“threshold conditions”), with the effect that only app stores which meet specified conditions are regulated by this Act.
(5) Regulations may make provision having the effect that app stores provided from outside the United Kingdom are regulated by this Act (as well as app stores provided from within the United Kingdom), but, if they do so, must contain provision corresponding or similar to section 3(5) and (6) (UK links).
(6) The provision that may be made by regulations includes provision—
(a) imposing on providers of app stores duties corresponding or similar to duties imposed on providers of Part 3 services by—
(i) section 10 or 11 (children’s online safety: user-to-user services) or any of sections 16 to 19 so far as relating to section 10 or 11;
(ii) section 24 or 25 (children’s online safety: search services) or any of sections 26 to 29 so far as relating to section 24 or 25;
(b) imposing on providers of app stores duties corresponding or similar to duties imposed on providers of internet services within section 71(2) by section 72 (duties about regulated provider pornographic content);
(c) imposing on providers of app stores requirements corresponding or similar to requirements imposed on providers of regulated services by, or by OFCOM under, Part 6 (fees);
(d) imposing on OFCOM duties in relation to app stores corresponding or similar to duties imposed in relation to Part 3 services by Chapter 3 of Part 7 (OFCOM’s register of risks, and risk profiles);
(e) conferring on OFCOM functions in relation to app stores corresponding or similar to the functions that OFCOM have in relation to regulated services under—
(i) Chapter 4 of Part 7 (information), or
(ii) Chapter 6 of Part 7 (enforcement), including provisions of that Chapter conferring power for OFCOM to impose monetary penalties;
(f) about OFCOM’s production of guidance or a code of practice relating to any aspect of the regulation of app stores that is included in the regulations.
(7) The provision that may be made by regulations includes provision having the effect that app stores fall within the definition of “Part 3 service” or “regulated service” for the purposes of specified provisions of this Act (with the effect that specified provisions of this Act which apply in relation to Part 3 services or regulated services, or to providers of Part 3 services or regulated services, also apply in relation to app stores or to providers of app stores).
(8) Regulations may not amend or make provision corresponding or similar to—
(a) Chapter 2 of Part 4 (reporting CSEA content),
(b) Chapter 5 of Part 7 (notices to deal with terrorism content and CSEA content), or
(c) Part 10 (communications offences).
(9) Regulations may make different provision with regard to app stores of different kinds.
(10) In this section “specified” means specified in regulations.”Member’s explanatory statement
This amendment makes provision about the purpose and contents of regulations to regulate app stores which may be made by the Secretary of State under the preceding new Clause proposed to be inserted in my name.
Amendment 236B agreed.
I beg to move that further consideration on Report be adjourned and that the House be adjourned during pleasure until 10.15 pm.
My Lords, has the noble Lord, Lord Harlech, seen paragraph 3.1 of the Companion? In case he has not—I know he is very new to this House—it states:
“It is a firm convention”—
not any old convention, but a firm convention—
“that the House normally rises by about 10pm on Mondays to Wednesdays”.
Can he explain why today is so different?
I take the noble Lord’s points on board. I think that my noble friend the Chief Whip answered those points at the Dispatch Box earlier today.
I appreciate that the noble Lord is put in a difficult position, but all the Chief Whip said was that this is usual. When was the last occasion that this had to happen in the way that it is happening tonight?
Perhaps this is something to discuss with my noble friend the Chief Whip while the House adjourns during pleasure.
Sitting suspended.