
Online Safety Bill

Volume 831: debated on Thursday 6 July 2023

Report (1st Day) (Continued)

Schedule 1: Exempt user-to-user and search services

Amendment 27

Moved by

27: Schedule 1, page 185, line 11, leave out from “provider” to end of line 13 and insert “, including where the publication of the content is effected or controlled by means of—

(a) software or an automated tool or algorithm applied by the provider or by a person acting on behalf of the provider, or
(b) an automated tool or algorithm made available on the service by the provider or by a person acting on behalf of the provider.”

Member’s explanatory statement

This amendment is about what counts as “provider content” for the purposes of the exemption in paragraph 4 of Schedule 1 of the Bill (which provides that limited functionality services are exempt). Words are added to expressly cover the case where an automated tool or algorithm is made available on the service by a provider, such as a generative AI bot.

My Lords, the Government are committed to protecting children against accessing pornography online. As technology evolves, it is important that the regulatory framework introduced by the Bill keeps pace with emerging risks to children and exposure to pornography in new forms, such as generative artificial intelligence.

Part 5 of the Bill has been designed to be future-proof, and we assess that it would already capture AI-generated pornography. Our Amendments 206 and 209 will put beyond doubt that content is “provider pornographic content” where it is published or displayed on a Part 5 service by means of an automated tool or algorithm, such as a generative AI bot, made available on the service by a provider. Amendments 285 and 293 make clear that the definition of an automated tool includes a bot. Amendment 276 clarifies the definition of a provider of a Part 5 service, to make clear that a person who controls an AI bot that generates pornography can be regarded as the provider of a service.

Overall, our amendments provide important certainty for users, providers and Ofcom on the services and content in scope of the Part 5 duties. This will ensure that the new, robust duties for Part 5 providers to use age verification or age estimation to prevent children accessing provider pornographic content will also extend to AI-generated pornography. I beg to move.

My Lords, the noble Baroness, Lady Kidron, has unfortunately been briefly detained. If you are surprised to see me standing up, it is because I am picking up for her. I start by welcoming these amendments. I am grateful for the reaction to the thought-provoking debate that we had in Committee. I would like to ask a couple of questions just to probe the impact around the edges.

Amendment 27 looks as if it implies that purely content-generating machine-learning or AI bots could be excluded from the scope of the Bill, rather than included, which is the opposite of what we were hoping to achieve. That may be us failing to understand the detail of this large body of different amendments, but I would welcome my noble friend the Minister’s response to make sure that in Amendment 27 we are not excluding harm that could be generated by some form of AI or machine-learning instrument.

Maybe I can give my noble friend the Minister an example of what we are worried about. This is a recent scenario that noble Lords may have seen in the news, of a 15-year-old who asked, “How do I have sex with a 30-year-old?”. The answer was given in forensic detail, with no reference to the fact that it would in fact be statutory rape. Would the regulated service, or the owner of the regulated service that generated that answer, be included or excluded as a result of Amendment 27? That may be my misunderstanding.

This group is on AI-generated pornography. My friend, the noble Baroness, Lady Kidron, and I are both very concerned that it is not just about pornography, and that we should make sure that AI is included in the Bill. Specifically, many of us with teenage children will now be learning how to navigate the Snap AI bot. Would harm generated by that bot be captured in these amendments, or is it only content that is entirely pornographic? I hope that my noble friend the Minister can clarify both those points, then we will be able to support all these amendments.

My Lords, I rise briefly to welcome the fact that there is a series of amendments here where “bot” is replaced by

“bot or other automated tool”.

I point out that there is often a lot of confusion about what a bot is or is not. It is something that was largely coined in the context of a particular service—Twitter—where we understand that there are Twitter bots: accounts that have been created to pump out lots of tweets. In other contexts, on other services, there is similar behaviour but the mechanism is different. It seems to me that the word “bot” may turn out to be one of those things that was common and popular at the end of the 2010s and in the early 2020s, but in five years we will not be using it at all. It will have served its time, it will have expired and we will be using other language to describe what it is that we want to capture: a human being has created some kind of automated tool, very much dependent on the nature of the service, and is pumping out material. It is very clear that we want to make sure that such behaviour is in scope and that the person cannot hide behind the fact that it was an automated tool, because we are interested in the mens rea of the person sitting behind the tool.

I recognise that the Government have been very wise in making sure that whenever we refer to a bot we are adding that “automated tool” language, which will make the Bill inherently much more future-proof.

My Lords, I just want to elucidate whether the Minister has any kind of brief on my Amendment 152A. I suspect that he does not; it is not even grouped—it is so recent that it is actually not on today’s groupings list. However, just so people know what will be coming down the track, I thought it would be a good idea at this stage to say that it is very much about exactly the question that the noble Baroness, Lady Harding, was asking. It is about the interaction between a provider environment and a user, with the provider environment being an automated bot—or “tool”, as my noble friend may prefer.

It seems to me that we have an issue here. I absolutely understand what the Minister has done, and I very much support Amendment 153, which makes it clear that user-generated content can include bots. But this is not so much about a human user using a bot or instigating a bot; it is much more about a human user encountering content that is generated in an automated way by a provider, and then the user interacting with that in a metaverse-type environment. Clearly, the Government are apprised of that with regard to Part 5, but there could be a problem as regards Part 3. This is an environment that the provider creates, but it is interacted with by a user as if that environment were another user.

I shall not elaborate or make the speech that I was going to make, because that would be unfair to the Minister, who needs to get his own speaking note on this matter. But I give him due warning that I am going to degroup and raise this later.

My Lords, I warmly welcome this group of amendments. I am very grateful to the Government for a number of amendments that they are bringing forward at this stage. I want to support this group of amendments, which are clearly all about navigating forward and future-proofing the Bill in the context of the very rapid development of artificial intelligence and other technologies. In responding to this group of amendments, will the Minister say whether he is now content that the Bill is sufficiently future-proofed, given the hugely rapid development of technology, and whether he believes that Ofcom now has sufficient powers to risk assess for the future and respond, supposing that there were further parallel developments in generative AI such as we have seen over the past year?

My Lords, this is a quick-fire debate on matters where most of us probably cannot even understand the words, let alone the purpose and particularity of the amendments. I want to raise points already raised by others: it seems that the Government’s intention is to ensure that the Bill is future-proofed. Why then are they restricting this group to Part 5 only? It follows that, since Part 5 is about pornography, it has to be about only pornography—but it is rather odd that we are not looking at the wider context in which harm may occur, involving things other than simply pornography. While the Bill may well currently be able to deal with the issues raised by Part 3 services, does it not need to be extended to those as well? I shall leave it at that. The other services that we have are probably unlikely to raise the sorts of issues of concern that are raised by this group. None the less, it is a point on which we need reassurance.

My Lords, this has been a short but important debate and I am grateful to noble Lords for their broad support for the amendments here and for their questions. These amendments will ensure that services on which providers control a generative tool, such as a generative AI bot, are in scope of Part 5 of the Bill. This will ensure that children are protected from any AI-generated pornographic content published or displayed by provider-controlled generative bots. These changes will not affect the status of any non-pornographic AI-generated content, or AI-generated content shared by users.

We are making a minor change to definitions in Part 3 to ensure that comments or reviews on content generated by a provider-controlled artificial intelligence source are not regulated as user-generated content. This is consistent with how the Bill treats comments and reviews on other provider content. These amendments do not have any broader impact on the treatment of bots by Part 3 of the Bill’s regime beyond the issue of comments and reviews. The basis on which a bot will be treated as a user, for example, remains unchanged.

I am grateful to the noble Lord, Lord Clement-Jones, for degrouping his Amendment 152A so that I can come back more fully on it in a later group and I am grateful for the way he spoke about it in advance. I am grateful too for my noble friend Lady Harding’s question. These amendments will ensure that providers which control a generative tool on a service, such as a generative AI bot, are in scope of Part 5 of the Bill. A text-only generative AI bot would not be in scope of Part 5. It is important that we focus on areas which pose the greatest risk of harm to children. There is an exemption in Part 5 for text-based provider pornographic content because of the limited risks posed by published pornographic content. This is consistent with the approach of Part 3 of the Digital Economy Act 2017 and its provisions to protect children from commercial online pornography, which did not include text-based content in scope.

The right reverend Prelate the Bishop of Oxford is right to ask whether we think this is enough. These changes certainly help. The technology-neutral way in which the Bill is written will help us to future-proof it but, as we have heard throughout the passage of the Bill, we all know that this area of work will need constant examination and scrutiny. That is why the Bill is subject to the post-Royal Assent review and scrutiny that it is, and why we are grateful for the attention noble Lords and Members of Parliament in the other place have already given to ensuring that it delivers on what we want to see. I believe these amendments, which put beyond doubt important provisions relating to generative AI, are a helpful addition and I beg to move.

Amendment 27 agreed.

Amendment 28

Moved by

28: Schedule 1, page 185, line 23, at end insert—

“Public information services

5A A user-to-user service is exempt if its primary purpose is the creation of public information resources and it has the following characteristics—
(a) user-to-user functions are limited to those necessary for the creation and maintenance of a public information resource,
(b) OFCOM has determined that there is minimal risk of users sharing harmful content on the service, and
(c) it is non-commercial.”

Member’s explanatory statement

This amendment would allow OFCOM to exempt services like Wikipedia from regulation where it deems them to be low risk.

My Lords, as we enter the final stages of consideration of this Bill, it is a good time to focus a little more on what is likely to happen once it becomes law, and my Amendment 28 is very much in that context. We now have a very good idea of what the full set of obligations that in-scope services will have to comply with will look like, even if the detailed guidance is still to come.

With this amendment I want to return to the really important question that I do not believe we answered satisfactorily when we debated it in Committee. That is that there is a material risk that, without further amendment or clarification, Wikipedia and other similar services may feel that they can no longer operate in the United Kingdom.

Wikipedia has already featured prominently in our debates, but there are other major services that might find themselves in a similar position. As I was discussing the definitions in the Bill with my children yesterday—this may seem an unusual dinner conversation with teenagers, but I find mine to be a very useful sounding board—they flagged that OpenStreetMap, to which we all contribute, also seems to be in the scope of how we have defined user-to-user services. I shall start by asking some specific questions so that the Minister has time to find the answers in his briefing or have them magically delivered to him before summing up: I shall ask the questions and then go on to make the argument.

First, is it the Government’s view that Wikipedia and OpenStreetMap fall within the definition of user-to-user services as defined in Clause 2 and the content definition in Clause 211? We need to put all these pieces together to understand the scope. I have chosen these services because each is used by millions of people in the UK and their functionality is very well known, so I trust that the Government had them in mind when they were drafting the legislation, as well as the more obvious services such as Instagram, Facebook et cetera.

Secondly, can the Minister confirm whether any of the existing exemptions in the Bill would apply to Wikipedia and OpenStreetMap such that they would not have to comply with the obligations of a category 1 or 2B user-to-user service?

Thirdly, does the Minister believe that the Bill as drafted allows Ofcom to use its discretion in any other way to exempt Wikipedia and OpenStreetMap, for example through the categorisation regulations in Schedule 11? As a spoiler alert, I expect the answers to be “Yes”, “No” and “Maybe”, but it is really important that we have the definitive government response on the record. My amendment would seek to turn that to “Yes”, “Yes” and therefore the third would be unnecessary because we would have created an exemption.

The reason we need to do this is not in any way to detract from the regulation or undermine its intent but to avoid facing the loss of important services at some future date because of situations we could have avoided. This is not hyperbole or a threat on the part of the services; it is a natural consequence if we impose legal requirements on a responsible organisation that wants to comply with the law but knows it cannot meet them. I know it is not an intended outcome of the Bill that we should drive these services out, but it is certainly one intended outcome that we want other services that cannot meet their duties of care to exit the UK market rather than continue to operate here in defiance of the law and the regulator.

We should remind ourselves that at some point, likely to be towards the end of 2024, letters will start to arrive on the virtual doormats of all the services we have defined as being in scope—these 25,000 services—and their senior management will have a choice. I fully expect that the Metas, the Googles and all such providers will say, “Fine, we will comply. Ofcom has told us what we need to do, and we will do it”. There will be another bunch of services that will say, “Ofcom, who are they? I don’t care”, and the letter will go in the bin. We have a whole series of measures in the Bill by which we will start to make life difficult for them: we will disrupt their businesses and seek to prosecute them and we will shut them out of the market.

However, there is a third category, which is the one I am worried about in this amendment, who will say, “We want to comply, we are responsible, but as senior managers of this organisation”, or as directors of a non-profit foundation, “we cannot accept the risk of non-compliance and we do not have the resources to comply. There is no way that we can build an appeals mechanism, user reporting functions and all these things we never thought we would need to have”. If you are Wikipedia or OpenStreetMap, you do not need to have that infrastructure, yet as I read the Bill, if they are in scope and there is no exemption, they will be required to build all that additional infrastructure.

The Bill already recognises that there are certain classes of services where it would be inappropriate to apply this new regulatory regime, and it describes these in Schedule 1, which I am seeking to amend. My amendment just seeks to add a further class of exempted service and it does this quite carefully so that we would exclude only services that I believe most of us in this House would agree should not be in scope. There are three tests that would be applied.

The first is a limited functionality test—we already have something similar in Schedule 1—so that the user-to-user functions are only those that relate to the production of what I would call a public information resource. In other words, users engage with one another to debate a Wikipedia entry or a particular entry on a map on OpenStreetMap. So there is limited user-to-user functionality, all of it centred on this public interest resource. They are not user-to-user services in the classic sense of social media; they are a particular kind of collective endeavour. These are much closer to newspaper publishers, which we have explicitly excluded from the Bill. It is much more like a newspaper; it just happens to be created by users collectively, out of good will, rather than by paid professional journalists. They are very close to that definition, but if you read Schedule 1, I do not think the definition of “provider content” in paragraph 4(2) currently includes these collective-user endeavours, so they do not have the exemption at present.

I have also proposed that Ofcom would carry out a harm test to avoid the situation where someone argues that their service is a public information resource while in practice using it to distribute harmful material. That would be a rare case, but noble Lords can conceive of it happening. Ofcom would have the ability to say that it recognises that Wikipedia does not carry harmful content in any meaningful way, but it would also have the right not to grant the exemption to service B that says it is a new Wikipedia but carries harmful content.

Thirdly, I have suggested that this is limited to non-commercial services. There is an argument for saying that any public information resource should benefit, and that may be more in line with the amendment proposed by the noble Lord, Lord Moylan, where the exemption is defined in terms of the encyclopaedic nature of the service. I recognise that I have put in “non-commercial” as belt and braces because there is a rationale for saying that, while we do not really want an encyclopaedic resource to be treated as a 2B service just because it has user-to-user functions, if it is commercial we could reasonably expect it to find some way to comply. It is different when it is entirely non-commercial and volunteer-led, not least because the Wikimedia Foundation, for example, would struggle to justify spending the money that it has collected from donors on compliance costs with the UK regime, whereas a commercial company could increase its resources from commercial customers to do that.

I hope this is a helpful start to a debate in which we will also consider Amendment 29, which has similar goals. I will close by asking the Minister some additional questions. I have asked him some very specific ones to which I hope he can provide answers, but first I ask: does he acknowledge the genuine risk that services like Wikipedia and OpenStreetMap could find themselves with obligations under the Bill that they simply cannot comply with? It is not that they are unwilling, but there is no way for them to do all this structurally.

Secondly, I hope the Minister would agree that it is not in the public interest for Ofcom to spend significant time and effort on the oversight of services like these; rather, it should spend its time and effort on services, such as social media services, that we believe to be creating harms and are the central focus of the Bill.

Thirdly, will the Minister accept that there is something very uncomfortable about a government regulator interfering with the running of a neutral public resource like Wikipedia, when there is so much benefit from it and little or no demonstrable harm? It is much closer to the model that exists for a newspaper. We have debated endlessly in this House—and I am sure we will come back to it—that there is, rightly, considerable reluctance to have regulators going too far and creating this relationship with neutral public information goods. Wikipedia falls into that category, as do OpenStreetMap and others, and there would be fundamental in-principle challenges around that.

I hope the Government will agree that we should be taking steps to make sure we are not inadvertently creating a situation where, in one or two years’ time, Ofcom will come back to us saying that it wrote to Wikipedia, because the law told it to do so, and told Wikipedia all the things that it had to do; Wikipedia took it to its senior management and then came back saying that it is shutting shop in the UK. Because it is sensible, Ofcom would come back and say that it did not want that and ask to change the law to give it the power to grant an exemption. If such things deserve an exemption, let us make it clear they should have it now, rather than lead ourselves down this path where we end up effectively creating churn and uncertainty around what is an extraordinarily valuable public resource. I beg to move.

My Lords, Amendments 29 and 30 stand in my name. I fully appreciated, as I prepared my thoughts ahead of this short speech, that a large part of what I was going to say might be rendered redundant by the noble Lord, Lord Allan of Hallam. I have not had a discussion with him about this group at all, but it is clear that his amendment is rather different from mine. Although it addresses the same problem, we are coming at it slightly differently. I actually support his amendment, and if the Government were to adopt it I think the situation would be greatly improved. I do prefer my own, and I think he put his finger on why to some extent: mine is a little broader. His relates specifically to public information, whereas mine relates more to what can be described as the public good. So mine can be broader than information services, and I have not limited it to non-commercial operations, although I fully appreciate that quite a lot of the services we are discussing are, in practice, non-commercial. As I say, if his amendment were to pass, I would be relatively satisfied, but I have a moderate preference for my own.

Wikipedia has been mentioned frequently, and the Government cannot say that they have not had notice of this problem, because it was frequently mentioned in Committee. The fact that the Government have not come forward with any suggestions or amendments to address this at all—I can summarise what I think is their response in a moment—is truly remarkable. There has been an open letter signed recently drawing attention to this problem, which includes not only OpenStreetMap, which the noble Lord referred to, but the Heritage Alliance; INSPIRE, which is a physics research platform operated by CERN; the Wellcome Sanger Institute; the Chartered Institute of Library and Information Professionals in Scotland; and Liberty. They are all concerned about how they are going to operate. They will be caught. Curiously, the Taliban will not be caught, because the Taliban will benefit from the exemption that exists in paragraph 9(1)(c) of Schedule 1 for a “foreign sovereign power”. So the Taliban will not be in Ofcom’s scope at all, but all these organisations doing perfectly decent work are going to be chased down by our regulator.

The Minister has said from the Dispatch Box that he does not think that Wikipedia would be in scope. But when pressed on this, both at the Dispatch Box and in private conversation—for which I am grateful—he said that that was his opinion but it was going to be decided by Ofcom; his assurance at the Dispatch Box, in other words, carries no weight, because the decision is to be made by Ofcom. When I say that we can still pass these amendments, he says we cannot tell Ofcom what to do because it is independent. But the entire Bill is telling Ofcom what to do, and of course Parliament can tell it that services of this character are not in scope. The Bill specifies what services are and are not in scope. So I think it is pretty much a nonsense answer.

Although these organisations, in many cases, are non-profits, that does not mean they are not businesses, and businesses have to plan, invest and think about what they are going to do next. They are hanging there, waiting and absolutely uncertain until Ofcom makes a regulatory decision of an existential character, because we in Parliament cannot possibly take a stance on it and the Government cannot have a view. I know that businesses are constantly subject to the possibility of regulatory change in the future, but I do not know of any regulatory changes that, in the normal course of events, threaten the entire business model. They might threaten how you plan to make a particular product or what insulation you might have to put in a house you are building; they do not put you entirely out of business, which is what this threatens to do. So I think there is a very strong argument indeed for an amendment that takes these services out of scope, not only because it is a nonsense not to but because it really does threaten investment, planning, jobs and the things that go with that.

My Amendments 29 and 30 should be taken together. Amendment 29 creates an exemption, the terms of which are stated very plainly, so I do not need to read them out. Amendment 30 creates what is being referred to as a “rescue clause”; in other words, it says that Ofcom has the discretion to withdraw the exemption if it sees that it is in the public good to do so. If any of these services were to start going rogue and behaving in a way that was contrary to the public interest, or objectionable in terms of how the Bill operates and is intended to operate, Ofcom would be able to intervene and say, “That exemption no longer applies”: if you get the exemption under Amendment 29, we can take it away under Amendment 30. This is not unprecedented; the rescue clause has been almost cut and pasted from the Gambling Act. This process of creating an exemption which can then be withdrawn has great merit. That is why I recommend Amendments 29 and 30 while not wishing to exclude Amendment 28.

This is not going away. Germany exempts non-profit organisations. France has recently passed laws with similar scope to ours and has exempted those entities which operate in the public interest. There is nothing strange about doing what I and the noble Lord, Lord Allan, want to do. This will not go away, because the consequences could be very severe. It is not a question of whether Wikipedia will close. Wikipedia in the English language will probably survive all of this. It has a lot of people supporting it and a lot of volunteers working for it. However, what of many of the minor languages? Wait until the Government find out whether Welsh Wikipedia, the largest Welsh-language website in the world, will survive, and what the consequences will be. What will the Government say when they start getting letters? “Oh, it’s nothing to do with us, we have no answer for that, it’s all a matter for Ofcom”. This is a completely unsustainable position. It is indefensible for us in Parliament to let it pass and it is completely unsustainable for the Government. Some action must be taken.

My Lords, I will speak to my Amendments 281 to 281B. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to them. I will deal first with Amendments 281 and 281B, then move to 281A.

On Amendments 281 and 281B, the Minister will recall that in Committee we had a discussion around how functionality is defined in the Bill and that a great deal of the child risk assessments and safety duties must have regard to functionality, as defined in Clause 208. However, as it is currently written, this clause appears to separate out functionalities of user-to-user services and search services. These two amendments are designed to adjust that slightly, to future-proof the Bill.

Why is this necessary? First, it reflects that it is likely that, in future, many of the functionalities that we currently see on user-to-user services will become present on search services, and possibly vice versa. Therefore, we need to try to take account of how the world is likely to move. Secondly, this is already happening, and it poses a risk to children. Some research done by the 5Rights Foundation has found that “predictive search”, counted in the Bill as a search service functionality, is present on social media websites, leading one child user, using a search bar, to be presented in nanoseconds with prompts associated with eating disorders. In Committee, the Minister noted that the functionalities listed in this clause are non-exhaustive. At the very least, it would be helpful to clarify this in the Bill language.

Amendment 281A would add specific functionalities which we know are addictive or harmful to children and put them in the Bill. We have a great deal of research and evidence which demonstrates how persuasive certain design strategies are with children. These are features which are solely designed to keep users on the platform, at any cost, as much as possible and for as long as possible. The more that children are on the platform, the more harm they are likely to suffer. Given that the purpose of this Bill is for services to be safe by design, as set out usefully in Amendment 1, please can we make sure that where we know—and we do know—that risk exists, we are doing our utmost to tackle it?

The features that are listed in this amendment are known as “dark patterns”, and for a very good reason. These are persuasive and pervasive design features which are deliberately baked into the design of digital services and products to capture and hold, in this case, children’s attention, and to create habitual, even compulsive behaviours. The damage this does to children is proven and palpable. For example, one of the features mentioned is infinite scroll, which is now ubiquitous on most major social media platforms. The inventor of infinite scroll, a certain Aza Raskin, who probably thought it was a brilliant idea at the time, has said publicly that he now deeply regrets ever introducing it, because of the effect it is having on children.

One of the young people who spoke to the researchers at 5Rights said of the struggle they have daily with the infinite scroll feature:

“Scrolling forever gives me a sick feeling in my stomach. I’m so aware of how little control I have and the feeling of needing to be online is overwhelming and consuming”.

Features designed to keep users online at any cost—fine for adults, maybe, but not for children—are taking a real toll. Managing the public and frequent interactions online which these features encourage creates the most enormous pressures for young people, and with that come anxiety, low self-esteem and mental health challenges. This is only increasing, and unless we are very specific about these features, the harms will continue.

We have the evidence. We know what poses harm and risk to children. Please can we make sure that this is reflected accurately in the Bill?

My Lords, I rise briefly to support many of the amendments in this group. I will start with Amendments 281, 281A and 281B in the name of my noble friend Lord Russell, to which I have added my name. The noble Lord set out the case very well. I will not reiterate what he said, but it is simply the case that the features and functionalities of regulated companies should not be separated by search and user-to-user but should apply across any regulated company that has that feature. There is no need to worry about a company that does not have one of the features on the list, but it is far more dangerous for a feature to be absent from the list than to have a single list and hold companies responsible for the features they do have.

Only this morning, Meta released Threads as its challenger to Twitter. In the last month, Snapchat added generative AI to its offering. Instagram now does video, and TikTok does shopping. All these companies are moving into a place where they would like to be the one that does everything. That is their commercial endgame, and that is where the Bill should set its sights.

Separating out functionality and, as the noble Lord, Lord Russell, said, failing to add what we already know, puts the Bill in danger of looking very old before the ink is dry. I believe it unnecessarily curtails Ofcom in being able to approach the companies for what they are doing, rather than for what the Bill thought they might be doing at this point. So, if the Minister is not in a position to agree to the amendment, I urge him at least to take it away and have a look at it, because it is a technical rather than an ideological matter. It would be wonderful to fix it.

I support the other amendments in the group: Amendments 28 and 29, and the very interesting Amendment 30. We come back to a very similar issue, which is about design. The thing about Wikipedia is that it does not stand at the doorway and grab your attention, and it does not follow you for six months after you visit it. It does not have that algorithmic push. So, although I freely admit that there are some unsavoury things on Wikipedia, it does not push them at you or at young people. That is a really interesting thing for us to hold in mind when we talk about the next group of amendments on harm.

I am bound to say that, although the noble Lord, Lord Allan, might prefer his amendment and the noble Lord, Lord Moylan, might prefer his, I prefer Amendment 245 in the name of the noble Baroness, Lady Morgan, which says that all services should be judged according to risk. This would stop this endless game of taking things out and putting things in, in case they behave badly, or taking things out for companies that we recognise now although we do not know what the companies of the future will be. We all have to remember that, even when we had the pre-legislative committee, we were not talking about large language models and when we started this Bill we were not talking about TikTok. Making laws for individual services is not a grand idea, but saying that it is not the size but the risk that should determine the category of a regulated service, and therefore its duties, seems a comprehensive way of getting to the same place.

My Lords, there is a danger of unanimity breaking out. The noble Lord, Lord Moylan, and I are not always on the same page as others, but this is just straightforward. I hope the Government listen to the fact that, even though we might be coming at this in different ways, there is concern on all sides.

I also note that this is a shift from what happened in Committee, when I tabled an amendment to try to pose the same dilemmas by talking about the size of organisations. Many a noble Lord said that size did not matter and that that did not work—but it was trying to get at the same thing. I do feel rather guilty that, to move the core philosophy forward, I have dumped the small and micro start-ups and SMEs that I also wanted to protect from overregulation—that is what has happened in this amendment—but now it seems an absolute no-brainer that we should find a way to exempt public interest organisations. This is where I would go slightly further. We should have a general exemption for public interest organisations, but with the ability for Ofcom to come down hard if they look as though they have moved from being low risk to being a threat.

As the noble Lord, Lord Moylan, noted, public interest exemptions happen throughout the world. Although I do not want to waste time reading things out, it is important to look at the wording of Amendment 29. As it says, we are talking about:

“historical, academic, artistic, educational, encyclopaedic, journalistic, or statistical content”.

We are talking about the kind of online communities that benefit the public interest. We are talking about charities, user-curated scientific publications and encyclopaedias. These are surely not what this Bill was designed to thwart. However, there is a serious danger that, if we put on them the number of regulatory demands in the Bill, they will not survive. That is not what the Government intend, but it is what will happen.

Dealing with the Bill’s complexity will take much time and money for organisations that do not have either. I run a small free-speech organisation called the Academy of Ideas and declare my interest in it. I am also on the board of the Free Speech Union. When you have to spend so much time on regulatory issues, it costs money and you will go under. That is important. This could also waste Ofcom’s time, as the noble Lord, Lord Allan of Hallam, has explained: it would prevent Ofcom concentrating on the nasty bits that we want it to tackle, wasting its time instead on services that are unlikely to cause harm.

I should mention a couple of other things. It is important to note that there is sometimes controversy over the definition of a public interest organisation. It is not beyond our ken to sort it out. I Googled it—it is still allowed—and came up with a Wikipedia page that still exists. That is always good. If one looks, the term “public interest” is used across a range of laws. The Government know what kind of organisations they are talking about. The term has not just been made up for the purpose of an exemption.

It is also worth noting that no one is talking about public interest projects and organisations not being regulated at all; this is about an exemption from this regulation alone. They still have to deal with UK defamation, data protection, charity, counterterrorism and pornography laws, and the common law. Those organisations’ missions and founding articles will require that they do some good in the world. That is what they are all about. The Government should take this matter seriously.

Finally, on the rescue clauses, it is important to note that the amendment—there is a reference to the Gambling Act—states that, if there is a problem, Ofcom should intervene. That was taken from what happens under the Gambling Act, which allows the UK authorities to strip one or more gambling businesses of their licensing exemptions when they step out of line. No one is saying that the exemptions should never be looked at again, but these organisations obviously should not be in the scope of the Bill. I hope that when we get to the next stage, the Government will, on this matter at least, accept the amendment.

My Lords, I also speak in support of Amendments 281, 281A and 281B, to which I have added my name, tabled by the noble Lord, Lord Russell. He and, as ever, the noble Baroness, Lady Kidron, have spoken eloquently, so I am not going to spend much time on these amendments, but I wanted to emphasise Amendment 281A.

In the old world of direct marketing—I am old enough to remember that when I was a marketing director it was about sending magazines, leaflets and letters—one spent all of one’s time working out how to build loyalty: how to get people to engage longer as a result of one’s marketing communication. In the modern digital world, that dwell time has been transformed into a whole behavioural science of its own, which has developed a whole set of tools. Today we have been using the word “activity” in the new Clause 1 at the beginning of the Bill, but also “features” and “functionality”. The reason why Amendment 281A is important is that there is a danger that the Bill keeps returning to being just about content. Even in Clause 208 on functionality, almost every item in subsection (2) mentions content, whereas Amendment 281A tries to spell out the elements of addiction-driving functionality that we know exist today.

I am certain that brilliant people will invent some more but we know that these ones exist today. I really think that we need to put them in the Bill to help everyone understand what we mean because we have spent days on this Bill—some of us have spent years, if not decades, on this issue—yet we still keep getting trapped in going straight back to content. That is another reason why I think it is so important that we get some of these functionalities in the Bill. I very much hope that, if he cannot accept the amendment today, my noble friend the Minister will go back, reflect and work out how we could capture these specific functionalities before it is too late.

I speak briefly on Amendments 28 to 30. There is unanimity of desire here to make sure that organisations such as Wikipedia and OpenStreetMap are not captured. Personally, I am very taken—as I often am—by the approach of the noble Baroness, Lady Kidron. We need to focus on risk rather than using individual examples, however admirable they are today. If Wikipedia chose to put on some form of auto-scroll, the risk of that service would go up; I am not suggesting that Wikipedia is going to do so today but, in the digital world, we should not assume that, just because organisations are charities or devoted to the public good, they cannot inadvertently cause harm. We do not make that assumption in the physical world either. Charities that put on physical events have to do physical risk assessments. I absolutely think that we should hold all organisations to that same standard. However, viewed through the prism of risk, Wikipedia—brilliant as it is—does not pose a risk to child safety and therefore should not be captured by the Bill.

My Lords, I broadly support all the amendments in this group but I will focus on the three amendments in the names of the noble Lord, Lord Russell, and others; I am grateful for their clear exposition of why these amendments are important. I draw particular attention to Amendment 281A and its helpful list of functions that are considered to be harmful and to encourage addiction.

There is a very important dimension to this Bill, whose object, as we have now established, is to encourage safety by design. An important aspect of it is cleaning up, and setting right, 20 years or more of tech development that has not been safe by design and has in fact been found to be harmful by way of design. As the noble Baroness, Lady Harding, just said, in many conversations and in talking to people about the Bill, one of the hardest things to communicate and get across is that this is about not only content but functionality. Amendment 281A provides a useful summary of the things that we know about in terms of the functions that cause harm. I add my voice to those encouraging the Minister and the Government to take careful note of it and to capture this list in the text of the Bill in some way so that this clean-up operation can be about not only content for the future but functionality and can underline the objectives that we have set for the Bill this afternoon.

My Lords, I start by saying amen—not to the right reverend Prelate but to my noble friend Lady Harding. She said that we should not assume that, just because charities exist, they are all doing good; as a former chair of the Charity Commission, I can say that that is very true.

The sponsors of Amendments 281 to 281B have made some powerful arguments in support of them. They are not why I decided to speak briefly on this group but, none the less, they made some strong points.

I come back to Amendments 28 to 30. Like others, I do not have a particular preference for which of the solutions is proposed to address this problem but I have been very much persuaded by the various correspondence that I have received—I am sure that other noble Lords have received such correspondence—which often uses Wikipedia as the example to illustrate the problem.

However, I take on board what my noble friend said: there is a danger of identifying one organisation and getting so constrained by it that we do not address the fundamental problem that the Bill is about, which is making sure that there is a way of appropriately excluding organisations that should not be subject to these various regulations because the regulations are not designed for them. I am open to the best way of doing that.

My Lords, this has been a very interesting debate, as it presents a real contrast. We have one set of amendments which says that the net is too wide and another which says that the net is not wide enough, and I agree with both of them. After all, we are trying to fine-tune the Bill to get it to deal with the proper risks—the word “risk” has come up quite a lot in this debate—that it should. Whether or not we make a specific exemption for public interest services, public information services, limited functionality services or non-commercial services, we need to find some way to deal with the issue raised by my noble friend and the noble Lord, Lord Moylan, in their amendments. All of us are Wikipedia users; we all value the service. I particularly appreciated what was said by the noble Baroness, Lady Kidron: Wikipedia does not push its content at us—it is not algorithmically based.

What the noble Lord, Lord Russell, said, resonated with me, because I think he has found a thundering great hole in the Bill. This infinite scrolling and autoplay is where the addiction of so much of social media lies, and the Bill absolutely needs systemically and functionally to deal with it. So, on the one hand, we have a service which does not rely on that infinite scrolling and algorithmic type of pushing of content and, on the other hand, we are trying to identify services which have that quality.

I very much hope the Minister is taking all this on board, because on each side we have identified real issues. Whether or not, when we come to the light at the end of the tunnel of Amendment 245 from the noble Baroness, Lady Morgan, it will solve all our problems, I do not know. All I can say is that I very much hope that the Minister will consider both sets of amendments and find a way through this that is satisfactory to all sides.

My Lords, much like the noble Lord, Lord Clement-Jones, I started off being quite certain I knew what to say about these amendments. I even had some notes—unusual for me, I know—but I had to throw them away, which I always do with my notes, because the arguments have been persuasive. That is exactly why we are here in Parliament discussing things: to try to reach common solutions to difficult problems.

We started with a challenge to the Minister to answer questions about scope, exemptions and discretion in relation to a named service—Wikipedia. However, as the debate went on, we came across the uncomfortable feeling that, having got so far into the Bill and agreed a lot of amendments today improving it, we are still coming up against quite stubborn issues that do not fit neatly into the categorisation and structures that we have. We do not seem to have the right tools to answer the difficult questions before us today, let alone the myriad questions that will come up as the technology advances and new services come in. Why have we not already got solutions to the problems raised by Amendments 281, 281A and 281B?

There is also the rather difficult idea we have from the noble Lord, Lord Russell, of dark patterns, which we need to filter into our thinking. Why does that not fit into what we have got? Why is it that we are still worried about Wikipedia, a service for public good, which clearly has risks in it and is sometimes capable of making terrible mistakes but is definitely a good thing that should not be threatened by having to conform with a structure and a system which we think is capable of dealing with some of the biggest and most egregious companies that are pushing stuff at us in the way that we have talked about?

I have a series of questions which I do not have the answers to. I am looking forward to the Minister riding to my aid on a white charger of enormous proportions and great skill which will take us out without having to fall over any fences.

If I may, I suggest to the Minister a couple of things. First, we are stuck on the word “content”. We will come back to that in the future, as we still have an outstanding problem about exactly where the Bill sets it. Time and again in discussions with the Bill team and with Ministers we have been led back to the question of where the content problem lies and how the harms relate to that, but this little debate has shown beyond doubt that harm can occur independent of and separate from content. We must have a solution to that, and I hope it will be quick.

Secondly, when approaching anybody or anything or any business or any charity that is being considered in scope for this Bill, we will not get there if we are looking only at the question of its size and its reach. We have to look at the risks it causes, and we have to drill down hard into what risks we are trying to deal with using our armoury as we approach these companies, because that is what matters to the children, vulnerable people and adults who would suffer otherwise, and not the question of whether or not these companies are big or small. I think there are solutions to that and we will get there, but, when he comes to respond, the Minister needs to demonstrate to us that he is still willing to listen and think again about one or two issues. I look forward to further discussions with him.

I am grateful to noble Lords for their contributions during this debate. I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, and particularly that the Bill should not inhibit services from providing valuable information which is of benefit to the public. However, I want to be clear that that is why the Bill has been designed in the way that it has. It has a broad scope in order to capture a range of services, but it has exemptions and categorisations built into it. The alternative would be a narrow scope, which would be more likely inadvertently to exempt risky sites or to displace harm on to services which we would find are out of scope of the Bill. I will disappoint noble Lords by saying that I cannot accept their amendments in this group but will seek to address the concerns that they have raised through them.

The noble Lord, Lord Allan, asked me helpfully at the outset three questions, to which the answers are yes, no and maybe. Yes, Wikipedia and OpenStreetMap will be in scope of the Bill because they allow users to interact online; no, we do not believe that they would fall under any of the current exemptions in the Bill; and the maybe is that Ofcom does not have the discretion to exempt services but the Secretary of State can create additional exemptions for further categories of services if she sees fit.

I must also say maybe to my noble friend Lord Moylan on his point about Wikipedia—and with good reason. Wikipedia, as I have just explained, is in scope of the Bill and is not subject to any of its exemptions. I cannot say how it will be categorised, because that is based on an assessment made by the independent regulator, but I reassure my noble friend that it is not the regulator but the Secretary of State who will set the categorisation thresholds through secondary legislation; that is to say, a member of the democratically elected Government, accountable to Parliament, through legislation laid before that Parliament. It will then be for Ofcom to designate services based on whether or not they meet those thresholds.

It would be wrong—indeed, nigh on impossible—for me to second-guess that designation process from the Dispatch Box. In many cases it is inherently a complex and nuanced matter since, as my noble friend Lady Harding said, many services change over time. We want to keep the Bill’s provisions flexible as services change what they do and new services are invented.

I would just like to finish my thought on Wikipedia. Noble Lords are right to mention it and to highlight the great work that it does. My honourable friend the Minister for Technology and the Digital Economy, Paul Scully, met Wikipedia yesterday to discuss its concerns about the Bill. He explained that the requirements for platforms in this legislation will be proportionate to the risk of harm, and that as such we do not expect the requirements for Wikipedia to be unduly burdensome.

I am computing the various pieces of information that have just been given, and I hope the Minister can clarify whether I have understood them correctly. These services will be in scope as user-to-user services and do not have an exemption, as he said. The Secretary of State will write a piece of secondary legislation that will say, “This will make you a category 1 service”—or a category 2 or 2B service—but, within that, there could be text that has the effect that Wikipedia is in none of those categories. So it and services like it could be entirely exempt from the framework by virtue of that secondary legislation. Is that a correct interpretation of what he said?

The Secretary of State could create further exemptions but would have to bring those before Parliament for it to scrutinise. That is why there is a “maybe” in answer to his third question in relation to any service. It is important for the legislation to be future-proofed that the Secretary of State has the power to bring further categorisations before Parliament for it to discuss and scrutinise.

My Lords, I will keep pressing this point because it is quite important, particularly in the context of the point made by the noble Baroness, Lady Kidron, about categorisation, which we will debate later. There is a big difference when it comes to Schedule 11, which defines the categorisation scheme: whether in the normal run of business we might create an exemption in the categorisation secondary legislation, or whether it would be the Secretary of State coming back with one of those exceptional powers that the Minister knows we do not like. He could almost be making a case for why the Secretary of State has to have these exceptional powers. We would be much less comfortable with that than if the Schedule 11 categorisation piece effectively allowed another class to be created, rather than it being an exceptional Secretary of State power.

To follow on from that, we are talking about the obligation to bring exemptions to Parliament. Well, we are in Parliament and we are bringing exemptions. The noble Lord is recommending that we bring very specific exemptions while those that the noble Lord, Lord Moylan, and I have been recommending may be rather broad—but I thought we were bringing exemptions to Parliament. I am not being facetious. The point I am making is, “Why can’t we do it now?” We are here now, doing this. We are saying, as Parliament, “Look at these exemptions”. Can the Minister not look at them now instead of saying that we will look at them some other time?

I may as well intervene now as well, so that the Minister can get a good run at this. I too am concerned at the answer that has been given. I can see the headline now, “Online Safety Bill Age-Gates Wikipedia”. I cannot see how it does not, by virtue of some of the material that can be found on Wikipedia. We are trying to say that there are some services that are inherently in a child’s best interests—or that are in their best interests according to their evolving capacity, if we had been allowed to put children’s rights into the Bill. I am concerned that that is the outcome of the answer to the noble Lord, Lord Allan.

I do not think that it is, but it will be helpful to have a debate on categorisation later on Report, when we reach Amendment 245, to probe this further. It is not possible for me to say that a particular service will certainly be categorised one way or another, because that would give it carte blanche and we do not know how it may change in the future—estimable though I may think it is at present. That is the difficulty of setting the precise parameters that the noble Baroness, Lady Fox, sought in her contribution. We are setting broad parameters, with exemptions and categorisations, so that the burdens are not unduly heavy on services which do not cause us concern, and with the proviso for the Secretary of State to bring further exemptions before Parliament, as circumstances strike her as fit, for Parliament to continue the debate we are having now.

The noble Baroness, Lady Kidron, in her earlier speech, asked about the functionalities of user-to-user services. The definitions of user-to-user services are broad and flexible, to capture new and changing services. If a service has both user-to-user functionality and a search engine, it will be considered a combined service, with respective duties for the user-to-user services which form part of its service and search duties in relation to the search engine.

I reassure my noble friend Lady Harding of Winscombe that the Bill will not impose a disproportionate burden on services, nor will it impede the public’s access to valuable content. All duties on services are proportionate to the risk of harm and, crucially, to the capacity of companies. The Bill’s proportionate design means that low-risk services will have to put in place only measures which reflect the risk of harm to their users. Ofcom’s guidance and codes of practice will clearly set out how these services can comply with their duties. We expect that it will set out a range of measures and steps for different types of services.

Moreover, the Bill already provides for wholesale exemptions for low-risk services and for Ofcom to exempt in-scope services from requirements such as record-keeping. That will ensure that there are no undue burdens to such services. I am grateful for my noble friend’s recognition, echoed by my noble friend Lady Stowell of Beeston, that “non-profit” does not mean “not harmful” and that there can be non-commercial services which may pose harms to users. That is why it is important that there is discretion for proper assessment.

Amendment 30 seeks to allow Ofcom to withdraw the exemptions listed in Schedule 1 from the Bill. I am very grateful to my noble friend Lord Moylan for his time earlier this week to discuss his amendment and others. We have looked at it, as I promised we would, but I am afraid that we do not think that it would be appropriate for Ofcom to have this considerable power—my noble friend is already concerned that the regulator has too much.

The Bill recognises that it may be necessary to remove certain exemptions if there is an increased risk of harm from particular types of services. That is why the Bill gives the Secretary of State the power to remove particular exemptions, such as those related to services which have limited user-to-user functionality and those which offer one-to-one live aural communications. These types of services have been carefully selected as areas where future changes in user behaviour could necessitate the repeal or amendment of an exemption in Schedule 1. This power is intentionally limited to only these types of services, meaning that the Secretary of State will not be able to remove exemptions for comments on recognised news publishers’ sites. That is in recognition of the Government’s commitment to media freedom and public debate. It would not be right for Ofcom to have the power to repeal those exemptions.

Amendments 281 and 281B, in the name of the noble Lord, Lord Russell of Liverpool, are designed to ensure that the lists of features under the definition of “functionality” in the Bill apply to all regulated services. Amendment 281A aims to add additional examples of potentially addictive functionalities to the Bill’s existing list of features which constitute a “functionality”. I reassure him and other noble Lords that the list of functionalities in the Bill is non-exhaustive. There may be other functionalities which could cause harm to users and which services will need to consider as part of their risk assessment duties. For example, if a provider’s risk assessment identifies that there are functionalities which risk causing significant harm to an appreciable number of children on its service, the Bill will require the provider to put in place measures to mitigate and manage that risk.

He and other noble Lords spoke about the need for safety by design. I can reassure them that this is already built into the framework of the Bill, which recognises how functionalities, including many of the things mentioned today, can increase the risk of harm to users, and will encourage the safe design of platforms.

Amendments 281 and 281B have the effect that regulated services would need to consider the risk of harm of functionalities that are not relevant for their kind of service. For example, sharing content with other users is a functionality of user-to-user services, which is not as relevant for search services. The Bill already outlines specific features that both user-to-user and search services should consider, which are the most relevant functionalities for those types of service. Considering these functionalities would create an unnecessary burden for regulated services which would detract from where their efforts can best be focused. That is why I am afraid I cannot accept the amendments that have been tabled.

My Lords, surely it is the role of the regulators to look at functionalities of this kind. The Minister seemed to be saying that it would be an undue burden on the regulator. Is not that exactly what we are meant to be legislating about at this point?

Perhaps I was not as clear as I could or should have been. The regulator will set out in guidance the duties that fall on the businesses. We do not want the burden on the business to be unduly heavy, but there is an important role for Ofcom here. I will perhaps check—

Hence Ofcom will make the assessments about categorisation based on that. Maybe I am missing the noble Lord’s point.

I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.

My Lords, that was a very useful debate. I appreciate the Minister’s response and his “yes, no, maybe” succinctness, but I think he has left us all more worried than when the debate started. My noble friend Lord Clement-Jones tied it together nicely. What we want is for the regulator to be focused on the greatest areas of citizen risk. If there are risks that are missing, or things that we will be asking the regulator to do that are a complete waste of time because they are low risk, then we have a problem. We highlighted both those areas. The noble Lord, Lord Russell, rightly highlighted that we are not content with just “content” as the primary focus of the legislation; it is about a lot more than content. In my amendment and those by the noble Lord, Lord Moylan, we are extremely worried—and remain so—that the Bill creates a framework that will trap Wikipedia and services like it, without that being our primary intention. We certainly will come back to this in later groups; I will not seek to press the amendment now, because there is a lot we all need to digest. However, at the end of this process, we want to get to a point where the regulator is focused on things that are high risk to the citizen and not wasting time on services that are very low risk. With that, I beg leave to withdraw my amendment.

Amendment 28 withdrawn.

Amendment 29 not moved.

Amendment 30 not moved.

Clause 5: Overview of Part 3

Amendment 31

Moved by

31: Clause 5, page 4, line 40, leave out “section 54” and insert “sections 54 to (“Priority content that is harmful to children”)”

Member’s explanatory statement

This amendment is consequential on the new Clauses proposed to be inserted after Clause 54 in my name setting out which kinds of content count as primary priority content and priority content harmful to children.

My Lords, the government amendments in this group relate to the categories of primary priority and priority content that is harmful to children.

Children must be protected from the most harmful online content and activity. As I set out in Committee, the Government have listened to concerns about designating primary priority and priority categories of content in secondary legislation and the need to protect children from harm as swiftly as possible. We have therefore tabled amendments to set out these categories in the Bill. I am grateful for the input from across your Lordships’ House in finalising the scope of these categories.

While it is important to be clear about the kinds of content that pose a risk of harm to children, I acknowledge what many noble Lords raised during our debates in Committee, which is that protecting children from online harm is not just about content. That is why the legislation takes a systems and processes approach to tackling the risk of harm. User-to-user and search service providers will have to undertake comprehensive, mandatory risk assessments of their services and consider how factors such as the design and operation of a service and its features and functionalities may increase the risk of harm to children. Providers must then put in place measures to manage and mitigate these risks, as well as systems and processes to prevent and protect children from encountering the categories of harmful content.

We have also listened to concerns about cumulative harm. In response to this, the Government have tabled amendments to Clause 209 to make it explicit that cumulative harm is addressed. This includes cumulative harm that results from algorithms bombarding a user with content, or where combinations of functionality cumulatively drive up the risk of harm. These amendments will be considered in more detail under a later group of amendments, but they are important context for this discussion.

I turn to the government amendments, starting with Amendment 171, which designates four categories of primary priority content. First, pornographic content has been defined in the same way as in Part 5—to give consistent and comprehensive protection for children, regardless of the type of service on which the pornographic content appears. The other three categories capture content which encourages, promotes or provides instructions for suicide, self-harm or eating disorders. This will cover, for example, glamorising or detailing methods for carrying out these dangerous activities. Designating these as primary priority content will ensure that the most stringent child safety duties apply.

Government Amendment 172 designates six categories of priority content. Providers will be required to protect children from encountering a wide range of harmful violent content, which includes depictions of serious acts of violence or graphic injury against a person or animal, and the encouragement and promotion of serious violence, such as content glamorising violent acts. Providers will also be required to protect children from encountering abusive and hateful content, such as legal forms of racism and homophobia, and bullying content, which sadly many children experience online.

The Government have heard concerns from the noble Baronesses, Lady Kidron and Lady Finlay of Llandaff, about extremely dangerous activities being pushed to children as stunts, and content that can be harmful to the health of children, including inaccurate health advice and false narratives. As such, we are designating content that encourages dangerous stunts and challenges as a category of priority content, and content which encourages the ingestion or inhalation of, or exposure to, harmful substances, such as harmful abortion methods designed to be taken by a person without medical supervision.

Amendment 174, from the noble Baroness, Lady Kidron, seeks to add “mis- and disinformation” and “sexualised content” to the list of priority content. On the first of these, I reiterate what I said in Committee, which is that the Bill will protect children from harmful misinformation and disinformation where it intersects with named categories of primary priority or priority harmful content—for example, an online challenge which is promoted to children on the basis of misinformation or disinformation, or abusive content with a foundation in misinformation or disinformation. However, I did not commit to misinformation and disinformation forming its own stand-alone category of priority harmful content, which could be largely duplicative of the categories that we have already included in the Bill and risks capturing a broad range of legitimate content.

We have already addressed key concerns related to misinformation and disinformation content which presents the greatest risk to children by including content which encourages the ingestion or inhalation of, or exposure to, harmful substances to the list of priority categories. However, the term “mis- and disinformation”, as proposed by Amendment 174, in its breadth and subjectivity risks inadvertently capturing a wide range of content resulting in disproportionate, excessive censorship of the content children see online, including in areas of legitimate debate. The harm arising from misinformation or disinformation usually arises from the context or purpose of the content, rather than the mere fact that it is untrue. Our balanced approach ensures that children are protected from the most prevalent and concerning harms associated with misinformation and disinformation.

I turn to sexualised and adult content. Again, we must tread carefully here. What might constitute this is subjective and presents challenges for both providers and Ofcom to interpret. It is important that what constitutes priority content is sufficiently well defined so it is clear both to providers and to Ofcom what their obligations under the Bill are. Amendment 174 sets an extremely broad scope and gives rise to a risk of censorship if providers take an excessively broad interpretation of what is sexualised and adult content. It is important that we safeguard children’s freedom of expression through the Bill and do not inadvertently limit their access to innocuous and potentially helpful content.

The duties are, of course, not limited to the content in the Government’s amendments. The Bill requires providers to identify and act on any “non-designated content” which meets the Bill’s threshold of

“content that is harmful to children”,

even where it has not been designated as primary priority or priority content. Therefore, I hope the noble Baroness will understand why we cannot accept Amendment 174.

Amendment 237 in my name introduces a delegated power to update and amend these lists. This is essential for ensuring that the legislation remains flexible to change and that new and emerging risks of harm can be captured swiftly. Amendment 238, also in my name, ensures that the draft affirmative procedure will apply except in cases where there is an urgent need to update the lists, when the made affirmative procedure can be used. This ensures that Parliament will retain the appropriate degree of oversight over any changes. I beg to move.

My Lords, we spent a lot of time in Committee raising concerns about how pornography and age verification were going to operate across all parts of the Bill. I have heard what the Minister has said in relation to this group, priority harms to children, which I believe is one of the most important groups under discussion in the Bill. I agree that children must be protected from the most harmful content online and offline.

I am grateful to the Government for having listened carefully to the arguments put forward by the House in this regard and commend the Minister for all the work he and his team have done since then. I also commend the noble Lord, Lord Bethell. He and I have been in some discussion between Committee and now in relation to these amendments.

In Committee, I argued for several changes to the Bill which span three groups of amendments. One of my concerns was that pornography should be named as a harm in the Bill. I welcome the Government’s Amendment 171, which names pornography as primary priority content. I also support Amendment 174 in the name of the noble Baroness, Lady Kidron. She is absolutely right that sexualised content can be harmful to children if not age appropriate and, in that regard, before she even speaks, I ask the Minister to reconsider his views on this amendment and to accept it.

Within this group are the amendments which move the definition of “pornographic content” from Part 5 to Clause 211. In that context, I welcome the Government’s announcement on Monday about a review of the regulation, legislation and enforcement of pornography offences.

In Committee, your Lordships were very clear that there needed to be a consistent approach across the Bill to the regulation of pornography. I am in agreement with the amendments tabled in Committee to ensure that consistency applies across all media. In this regard, I thank the noble Baroness, Lady Benjamin, for her persistence in raising this issue. I also thank my colleagues on the Opposition Front Bench, the noble Lord, Lord Stevenson, and the noble Baroness, Lady Merron.

I appreciate that the Government made this announcement only three days ago, but I hope the Minister will set out a timetable for publishing the terms of reference and details of how this review will take place. The review is too important to disappear into the long grass over the Summer Recess, never to be heard of again, so if he is unable to answer my question today, will he commit to writing to your Lordships with the timeframe before the House rises for the summer? Will he consider the active involvement of external groups in this review, as much expertise lies outside government in this area? In that regard, I commend CARE, CEASE and Barnardo’s for all their input into the debates on the Bill.

My Lords, I think the noble Baroness’s comments relate to the next group of amendments, on pornography. She might have skipped ahead, but I am grateful for the additional thinking time to respond to her questions. Perhaps she will save the rest of her remarks for that group.

I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.

My Lords, I shall speak briefly to Amendment 174 in my name and then more broadly to this group—I note that the Minister got his defence in early.

On the question of misinformation and disinformation, I recognise what he said and I suppose that, in my delight at hearing the words “misinformation and disinformation”, I misunderstood to some degree what he was offering at the Dispatch Box, but I make the point that this poses an enormous risk to children. As an example, children are the fastest-growing group of far-right believers/activists online, and there are many areas in which we are going to see an exponential growth in misinformation and disinformation as large language models become the norm. So I ask him, in a tentative manner, to look at that.

On the other issue, I have to push back at the Minister’s explanation. Content classification around sexual content is a well-established norm. The BBFC does it and has done it for a very long time. There is an absolute understanding that what is suitable for a U, a PG, a 12 or a 12A are different things, and that as children’s capacities evolve, as they get older, there are things that are more suitable for older children, including, indeed, stronger portrayals of sexual behaviour as the age category rises. So I cannot accept that this opens a new can of worms: this is something that we have been doing for many, many years.

I think it is a bit wrongheaded to imagine that if we “solve” the porn problem, we have solved the problem—because there is still sexualisation and the commercialisation of sex. Now, if you say something about feet to a child, they start to giggle uproariously because, in internet language, you get paid for taking pictures of feet and giving them to strange people. There are such detailed and different areas that companies should be looking at. This amendment in my name and the names of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, should be taken very seriously. It is not new ground, so I would ask the Minister to reconsider it.

More broadly, the Minister will have noticed that I liberally added my name to the amendments he has brought forward to meet some of the issues we raised in Committee, and I have not added my name to the schedule of harms. I want to be nuanced about this and say I am grateful to the Government for putting them in the Bill, I am grateful that the content harms have been discussed in this Chamber and not left for secondary legislation, and I am grateful for all the conversations around this. However, harm cannot be defined only as content, and the last grouping got to the core of the issue in the House. Even when the Minister was setting out this amendment, he acknowledged that the increase in harm to users may be systemic and by design. In his explanation, he used the word “harm”; in the Bill, it always manifests as “harmful content”.

While the systemic risk of increasing the presence of harmful content is consistently within the Bill, which is excellent, the concept that the design of service may in and of itself be harmful is absent. In failing to do that, the Government, and therefore the Bill, have missed the bull’s-eye. The bull’s-eye is what is particular about this method of communication that creates harm—and what is particular are the features, functionalities and design. I draw noble Lords back to the debate about Wikipedia. It is not that we all love Wikipedia adoringly; it is that it does not pursue a system of design for commercial purposes that entraps people within its grasp. Those are the harms we are trying to get at. I am grateful for the conversations I have had, and I look forward to some more. I have laid down some other amendments for Monday and beyond that would, I hope, deal with this—but until that time, I am afraid this is an incomplete picture.

My Lords, I have a comment about Amendment 174 in the name of the noble Baroness, Lady Kidron. I have no objection to the insertion of subsection (9B), but I am concerned about (9A), which deals with misinformation and disinformation. It is far too broad and political, and if we start at this late stage to try to run off into these essentially political categories, we are going to capsize the Bill altogether. So I took some heart from the fact that my noble friend on the Front Bench appeared disinclined to accept at least that limb of the amendment.

I did want to ask briefly some more detailed questions about Amendment 172 and new subsection (2) in particular. This arises from the danger of having clauses added at late stages of the Bill that have not had the benefit of proper discussion and scrutiny in Committee. I think we are all going to recognise the characteristics that are listed in new subsection (2) as mapping on to the Equality Act, which appears to be their source. I note in passing that it refers in that regard to gender reassignment. I would also note that most of the platforms, in their terms and conditions, refer not to gender reassignment but to various other things such as gender identity, which are really very different, or at least different in detail. I would be interested to ask my noble friend how effectively he expects the words used in English statute to be applied by what are, essentially, foreign platforms when they are operating for an audience in the United Kingdom—I am going to come back to this in a further amendment later.

I also want to make a more general point. I have looked up the Equality Act, to make sure I am not mistaken about this. The Equality Act is essentially about discrimination against people as individuals or as groups. In defining direct discrimination, it says:

“A person (A) discriminates against another (B) if, because of a protected characteristic, A treats B less favourably than A treats or would treat others”.

So, there has to be a “B”—there has to be a person—for the Equality Act to be engaged. But this Bill says content is abusive when it

“targets any of the following characteristics”.

“Targets” is an interesting word. It does not say “attacks” and it does not relate to treatment. One can “target” something favourably. One can be positive in targeting something. It does not have to be a negative thing.

On the question of religion, if I, A, treat B adversely because they adhere to a particular religion, I fall foul of the Equality Act. But this appears to cover religion as a phenomenon. So, if I say that I am going to treat somebody badly because they are Jewish, of course I fall foul of the Equality Act. But this appears to say that if I say something adverse and abusive about the Jewish religion without reference to any particular individual, I will fall foul of this clause. I know that sounds a minor point of detail, but it is actually very significant. I want to hear my noble friend explain how in detail this is going to operate. If I say something adverse or abusive about gender reassignment and disability, that would not fall foul of the Equality Act necessarily, but it would fall foul of the Bill, as far as I can see. Are we creating a new blasphemy offence here, in effect, in relation to religion, as opposed to what the Equality Act does? I would like my noble friend to be able to expand on this. I know this is a Committee stage-type query, but this is our first opportunity to ask these questions.

My Lords, interestingly, because I have not discussed this at all with the noble Lord, Lord Moylan, I have some similar concerns to his. I have always wanted this to be a children’s online safety Bill. My concerns generally have been about threats to adults’ free speech and privacy and the threat to the UK as the home of technological innovation. I have been happy to keep shtum on things about protecting children, but I got quite a shock when I saw the series of government amendments.

I thought what most people in the public think: the Bill will tackle things such as suicide sites and pornography. We have heard some of that very grim description, and I have been completely convinced by people saying, “It’s the systems”. I get all that. But here we have a series of amendments all about content—endless amounts of content and highly politicised, contentious content at that—and an ever-expanding list of harms that we now have to deal with. That makes me very nervous.

On the misinformation and disinformation point, the Minister is right. Whether for children or adults, those terms have been weaponised. They are often used to delegitimise perfectly legitimate if contrary or minority views. I say to the noble Baroness, Lady Kidron, that the studies that say that youth are the fastest-growing far-right group are often misinformation themselves. I was recently reading a report about this phenomenon, and things such as being gender critical or opposing the small boats arriving were considered to be evidence of far-right views. That was not to do with youth, but at least you can see that this is quite a difficult area. I am sure that many people even in here would fit in the far right as defined by groups such as HOPE not hate, whose definition is so broad.

My main concerns are around the Minister’s Amendment 172. There is a problem: because it is about protected characteristics—or apes the protected characteristics of the Equality Act—we might get into difficulty. Can we at least recognise that, even in relation to the protected characteristics as noted in the Equality Act, there are raging rows politically? I do not know how appropriate it is that the Minister has tabled an amendment dragging young people into this mire. Maya Forstater has just won a case in which she was sacked after being accused of being opposed to somebody’s protected characteristics. Because her philosophical views are themselves a protected characteristic, she has won the case and a substantial amount of money.

I worry when I see this kind of list. It is not just inciting hatred—in any case, what that would mean is ambiguous. It refers to abuse based on race, religion, sex, sexual orientation, disability and so on. This is a minefield for the Government to have wandered into. Whether you like it or not, it will have a chilling effect on young people’s ability to debate and discuss. If you worry that some abuse might be aimed at religion, does that mean that you will not be able to discuss Charlie Hebdo? What if you wanted to show or share the Charlie Hebdo cartoons? Will that count? Some people would say that is abusive or inciteful. This is not where the Bill ought to be going. At the very least, it should not be going there at this late stage. Under race, it says that “nationality” is one of the indicators that we should be looking out for. Maybe it is because I live in Wales, but there is a fair amount of abuse aimed at the English. A lot of Scottish friends dole it out as well. Will this count for young people who do that? I cannot get it.

My final question is in relation to proposed subsection (11). This is about protecting children, yet it lists a person who

“has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex”.

Are the Government seriously accepting that children have not just proposed to reassign but have been reassigned? That is a breach of the law. That is not meant to be happening. Your Lordships will know how bad this is. Has the Department for Education seen this? As we speak, it is trying to untangle the freedom for people not to have to go along with people’s pronouns and so on.

This late in the day, on something as genuinely important as protecting children, I just want to know whether there is a serious danger that this has wandered into the most contentious areas of political life. I think it is very dangerous for a government amendment to affirm gender reassignment to and about children. It is genuinely irresponsible and goes against the guidance that the Government are bringing out at the moment, which asks us to avoid exactly this. Please can the Minister clarify what is happening with Amendment 172?

My Lords, I am not entirely sure how to begin, but I will try to make the points I was going to make. First, I would like to respond to a couple of the things said by the noble Baroness, Lady Fox. With the greatest respect, I worry that the noble Baroness has not read the beginning of the proposed new clause in Amendment 172, subsection (2), which talks about “Content which is abusive”, as opposed to content just about race, religion or the other protected characteristics.

One of the basic principles of the Bill is that we want to protect our children in the digital world in the same way that we protect them in the physical world. We do not let our children go to the cinema to watch content as listed in the primary priority and priority content lists in my noble friend the Minister’s amendments. We should not let them do so in the digital world, yet the reality is that they do, day in and day out.

I thank my noble friend the Minister, not just for the amendments that he has tabled but for the countless hours that he and his team have devoted to discussing this with many of us. I have not put my name to the amendments either because I have some concerns but, given the way the debate has turned, I start by thanking him and expressing my broad support for having the harms in the Bill, the importance of which this debate has demonstrated. We do not want this legislation to take people by surprise. The important thing is that we are discussing some fundamental protections for the most vulnerable in our society, so I thank him for putting those harms in the Bill and for allowing us to have this debate. I fear that it will be a theme not just of today but of the next couple of days on Report.

I started with the positives; I would now like to bring some challenges as well. Amendments 171 and 172 set out priority content and primary priority content. It is clear that they do not cover the other elements of harm: contact harms, conduct harms and commercial harms. In fact, it is explicit that they do not cover the commercial harms, because proposed new subsection (4) in Amendment 237 explicitly says that no commercial harm can be added to the list of harms. Why do we have a perfect crystal ball that means we think that no future commercial harms could be done to our children through user-to-user and search services, such that we are going to expressly make it impossible to add those harms to the Bill? It seems to me that we have completely ignored the commercial piece.

I move on to Amendment 174, which I have put my name to. I am absolutely aghast that the Government really think that age-inappropriate sexualised content does not count as priority content. We are not necessarily talking here about a savvy 17 year-old. We are talking about four, five and six year-olds who are doomscrolling on various social media platforms. That is the real world. To suggest that somehow the digital world is different from the old-fashioned cinema, and a place where we do not want to protect younger children from age-inappropriate sexualised material, just seems plain wrong. I really ask my noble friend the Minister to reconsider that element.

I am also depressed about the discussion that we had about misinformation. As I said in Committee several times, I have two teenage girls. The reality is that we are asking today’s teenagers to try to work out what is truth and what is misinformation. My younger daughter will regularly say, “Is this just something silly on the internet?” She does not use the term “misinformation”; she says, “Is that just unreal, Mum?” She cannot tell about what appears in her social media feeds because of the degree of misinformation. Failing to recognise that misinformation is a harm for young people who do not yet know how to validate sources, which was so much easier for us when we were growing up than it is for today’s generations, is a big glaring gap, even in the content element of the harms.

I support the principle behind these amendments, and I am pleased to see the content harms named. We will come back next week to the conduct and contact harms—the functionality—but I ask my noble friend the Minister to reconsider on both misinformation and inappropriate sexualised material, because we are making a huge mistake by failing to protect our children from them.

My Lords, I too welcome these amendments and thank the Minister and the Government for tabling them. The Bill will be significantly strengthened by Amendment 172 and related amendments by putting the harms as so clearly described in the Bill. I identify with the comments of others that we also need to look at functionality. I hope we will do that in the coming days.

I also support Amendment 174, to which I added my name. Others have covered proposed new subsection (9B) very well; I add my voice to those encouraging the Minister to give it more careful consideration. I will also speak briefly to proposed new subsection (9A), on misinformation and disinformation content. With respect to those who have spoken against it and argued that those are political terms, I argue that they are fundamentally ethical terms. For me, the principle of ethics and the online world is not the invention of new ethics but finding ways to acknowledge and support online the ethics we acknowledge in the offline world.

Truth is a fundamental ethic. Truth builds trust. It made it into the 10 commandments:

“You shall not bear false witness against your neighbour”.

It is that ethic that would be translated across in proposed new subsection (9A). One of the lenses through which I have viewed the Bill throughout is the lens of my eight grandchildren, the oldest of whom is eight years old and who is already using the internet. Proposed new subsection (9A) is important to him because, at eight years old, he has very limited ways of checking out what he reads online—fewer even than a teenager. He stands to be fundamentally misled in a variety of ways if there is no regulation of misinformation and disinformation.

Also, the internet, as we need to keep reminding ourselves in all these debates, is a source of great potential good and benefit, but only if children grow up able to trust what they read there. If they can trust the web’s content, they will be able to expand their horizons, see things from the perspective of others and delve into huge realms of knowledge that are otherwise inaccessible. But if children grow up necessarily imbued with cynicism about everything they read online, those benefits will not accrue to them.

Misinformation and disinformation content is therefore harmful to the potential of children across the United Kingdom and elsewhere. We need to guard against it in the Bill.

My Lords, Amendment 172 is exceptionally helpful in putting the priority harms for children on the face of the Bill. It is something that we have asked for, as did the pre-legislative scrutiny committee, and it is good to see it there. I want to comment to make sure that we all—and people out there—have a shared understanding of what this means.

My understanding is that “primary priority” is, in effect, a red light—platforms must not expose children to that content if they are under 18—while “priority” is rather an amber light and, on further review, for some children it will be a red light and for other children it will be a green light, and they can see stuff in there. I am commenting partly having had the experience of explaining all this to my domestic focus group of teenagers, who said, “Really? Are you going to get rid of all this stuff for us?” I said, “No, actually, it is quite different”. It is important in our debate to do that because otherwise there is a risk that the Bill comes into disrepute. I look at something like showing harm to fictional characters. If one has seen the “Twilight” movies, the werewolves do not come off too well, and “Lord of the Rings” is like an orc kill fest.

As regards the point made by the noble Baroness, Lady Harding, about going to the cinema, we allow older teenagers to go to the cinema and see that kind of thing. Post the Online Safety Bill, they will still be able to access it. When we look at something like fictional characters, the Bill is to deal with the harm that is there and is acknowledged regarding people pushing quite vile stuff, whereby characters have been taken out of fiction and a gory image has been created, twisted and pushed to a younger child. That is what we want online providers to do—to prevent an 11 year-old seeing that—not to stop a 16 year-old enjoying the slaughter of werewolves. We need to be clear that that is what we are doing with the priority harms; we are not going further than people think we are.

There are also some interesting challenges around humour and evolving trends. This area will be hard for platforms to deal with. I raised the issue of the Tide pod challenge in Committee. If noble Lords are not familiar, it is the idea that one eats the tablets, the detergent things, that one puts into washing machines. It happened some time ago. It was a real harm and that is reflected here in the “do not ingest” provisions. That makes sense but, again, talking to my focus group, the Tide pod challenge has evolved and for older teenagers it is a joke about someone being stupid. It has become a meme. One could genuinely say that it is not the harmful thing that it was. Quite often one sees something on the internet that starts out harmful—because kids are eating Tide pods and getting sick—and then over time it becomes a humorous meme. At that point, it has ceased to be harmful. I read it as that filter always being applied. We are not saying, “Always remove every reference to Tide pods” but “At a time when there is evidence that it is causing harm, remove it”. If at a later stage it ceases to be harmful, it may well move into a category where platforms can permit it. It is a genuine concern.

To our freedom of expression colleagues, I say that we do not want mainstream platforms to be so repressive of ordinary banter by teenagers that they leave those regulated mainstream platforms because they cannot speak any more, even when the speech is not harmful, and go somewhere else that is unregulated—one of those platforms that took Ofcom’s letter, screwed it up and threw it in the bin. We do not want that to be an effect of the Bill. Implementation has to be very sensitive to common trends and, importantly, as I know the noble Baroness, Lady Kidron, agrees, has to treat 15, 16 and 17 year-olds very differently from 10, 11 or 12 year-olds. That will be hard.

The other area that jumped out was about encouraging harm through challenges and stunts. That immediately brought “Jackass” to mind, or the Welsh version, “Dirty Sanchez”, which I am sure is a show that everyone in the House watched avidly. It is available on TV. Talking about equality, one can go online and watch it. It is people doing ridiculous, dangerous things, is enjoyed by teenagers and is legal and acceptable. My working assumption has to be that we are expecting platforms to distinguish a new dangerous stunt such as the choking game—such things really exist—from a ridiculous “Jackass” or “Dirty Sanchez” stunt, which has existed for years and is accessible elsewhere.

The point that I am making in the round is that it is great to have these priority harms in the Bill but it is going to be very difficult to implement them in a meaningful way whereby we are catching the genuinely harmful stuff but not over-restricting. But that is the task that we have set Ofcom and the platforms. The more that we can make it clear to people out there what we are expecting to happen, the better. We are not expecting a blanket ban on all ridiculous teenage humour or activity. We are expecting a nuanced response. That is really helpful as we go through the debate.

I just have a question for the noble Lord. He has given an excellent exposition of the other things that I was worried about but, even when he talks about listing the harms, I wonder how helpful it is. Like him, I read them out to a focus group. Is it helpful to write these things, for example emojis, down? Will that not encourage the platforms to over-panic? That is my concern.

On the noble Baroness’s point, that is why I intervened in the debate: so that we are all clear. We are not saying that, for priority content, it is an amber light and not a red light. We are not saying, “Just remove all this stuff”; it would be a wrong response to the Bill to say, “It’s a fictional character being slaughtered so remove it”, because now we have removed “Twilight”, “Watership Down” and whatever else. We are saying, “Think very carefully”. If it is one of those circumstances where this is causing harm—they exist; we cannot pretend that they do not—it should be removed. However, the default should not be to remove everything on this list; that is the point I am really trying to make.

My Lords, our debate on this group is on the topic of priority harms to children. It is not one that I have engaged in so I tread carefully. One reason why I have not engaged in this debate is because I have left it to people who know far more about it than I do; I have concentrated on other parts of the Bill.

In the context of this debate, one thing has come up on which I feel moved to make a short contribution: misinformation and disinformation content. There was an exchange between my noble friend Lady Harding and the noble Baroness, Lady Fox, on this issue. Because I have not engaged on the topic of priority harms, I genuinely do not have a position on what should and should not be featured. I would not want anybody to take what I say as support for or opposition to any of these amendments. However, it is important for us to acknowledge that, as much as misinformation and disinformation are critical issues—particularly for children and young people because, as the right reverend Prelate said, the truth matters—we cannot, in my view, ignore the fact that misinformation and disinformation have become quite political concepts. They get used in a way where people often define things that they do not agree with as misinformation—that is, opinions are becoming categorised as misinformation.

We are now putting this in legislation and it is having an impact on content, so it is important, too, that we do not just dismiss that kind of concern as not relevant because it is real. That is all I wanted to say.

My Lords, I will speak briefly as I know that we are waiting for a Statement.

If you talk to colleagues who know a great deal about the harm that is happening and the way in which platforms operate, as well as to colleagues who talk directly to the platforms, one thing that you commonly hear from them is a phrase that often recurs when they talk to senior people about some of the problems here: “I never thought of that before”. That is whether it is about favourites on Snapchat, which cause grief in friendship groups, about the fact that, when somebody leaves a WhatsApp group, it flags up who that person is—who wants to be seen as the person who took the decision to leave?—or about the fact that a child is recommended to other children even if the company does not know whether they are remotely similar.

If you are 13, you are introduced as a boy to Andrew Tate; if you are a girl, you might be introduced to a set of girls who may or may not share anorexia content, but they dog-whistle and blog. The companies are not deliberately orchestrating these outcomes—it is the way they are designed that is causing those consequences—but, at the moment, they take no responsibility for what is happening. We need to reflect on that.

I turn briefly to a meeting that the noble Lord, Lord Stevenson, and I were at yesterday afternoon, which leads neatly on to some of the comments the noble Baroness, Lady Fox, made a few moments ago about the far right. The meeting was convened by Luke Pollard MP and was on the strange world known as the manosphere, which is the world of incels—involuntary celibates. As your Lordships may be aware, on various occasions, certain individuals who identify as such have committed murder and other crimes. It is a very strange world.

I was introduced to two terms that I was not aware of. If you are an incel, you refer to males who are fortunate enough to get on well with ladies as Chads, and ladies who are fortunate enough to get on well with men or boys are apparently known as Stacys. That was something I did not particularly want to learn, but I did. This was the second meeting that Luke Pollard had convened; the people at that meeting were from a huge variety of groups and they all said that it is the only forum they have found in the United Kingdom that is pulling all these different elements together. They are acutely aware of how much of a problem this is, and they find that forum incredibly helpful, because nobody else is doing it.

I have asked some of the people who gave evidence in that meeting to forward some of their concerns to us, which the noble Lord, Lord Stevenson, and I would like to forward to the Minister and the Bill team. I raise this to encourage the Minister to understand the scale of this, what is happening and its effects, and then to look at how harms are defined in the Bill and how functionality is looked at, to see whether this live, growing area is dealt with effectively by the Bill.

This takes me into the area of media literacy, which I flag so that the Minister and the Bill team can do some homework. In Scotland, there is a very sensible scheme called Mentors in Violence Prevention. Essentially, it uses the older pupils in the sixth form, who have learned beforehand from other mentors how to talk about and understand the harms they might experience online. They deliver to the younger children the information that they have accumulated. The evidence is that this is infinitely more effective than teachers or outside experts doing it. It is almost peer-to-peer—perhaps a very appropriate approach for your Lordships’ House.

I shall be brief, my Lords, because I know we have a Statement to follow. It is a pleasure to follow the noble Lord, Lord Russell. I certainly share his concern about the rise of incel culture, and this is a very appropriate point to raise it.

This is all about choices and the Minister, in putting forward his amendments in response not only to the Joint Committee but also to the overwhelming view in Committee on the Bill that this was the right thing to do, has done the right thing. I thank him for that, with the qualification that we must make sure that the red and amber lights are used—just as my noble friend Lord Allan and the noble Baroness, Lady Stowell, qualified their support for what the Minister has done. At the same time, I make absolutely clear that I very much support the noble Baroness, Lady Kidron. I was a bit too late to get my name down to her amendment, but it would be there otherwise.

I very much took to what the right reverend Prelate had to say about the ethics of the online world and nowhere more should they apply than in respect of children and young people. That is the place where we should apply these ethics, as strongly as we can. With some knowledge of artificial intelligence, how it operates and how it is increasingly operating, I say that what the noble Baroness wants to add to the Minister’s amendment seems to be entirely appropriate. Given the way in which algorithms are operating and the amount of misinformation and disinformation that is pouring into our inboxes, our apps and our social media, this is a very proportionate addition. It is the future. It is already here, in fact. So I very strongly support Amendment 174 from the noble Baroness and I very much hope that after some discussion the Minister will accept it.

My Lords, like the noble Baroness, Lady Harding, I want to make it very clear that I think the House as a whole welcomes the change of heart by the Government to ensure that we have in the Bill the two sides of the question of content that will be harmful to children. We should not walk away from that. We made a big thing of this in Committee. The Government listened and we have now got it. The fact that we do not like it—or do not like bits of it—is the price we pay for having achieved something which is, probably on balance, good.

The shock comes from trying to work out why it is written the way it is, and how difficult it is to see what it will mean in practice when companies working to Ofcom’s instructions will take this and make this happen in practice. That lies behind, I think I am right in saying, the need for the addition to Amendment 172 from the noble Baroness, Lady Kidron, which I have signed, along with the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. Both of them have spoken well in support of it and I do not need to repeat those points.

Somehow, in getting the good of Amendments 171 and 172, we have lost some of the flexibility that we also want. The flexibility does exist, because the Government have retained powers to amend and change both primary priority content that is harmful to children and the priority content. Therefore, subject to approval through the secondary legislation process, this House will continue to have a concern about that—indeed, both Houses will.

Somehow, however, that does not quite get to where the concern comes from. The concern takes in the good points made by the noble Lord, Lord Russell—I should have caught him in the gap and said that I had already mentioned the fact that we had been together at the meeting. He found some additional points to make which I hope will also be useful to future discussion, and I am glad he did. He makes a very good point in relation to cultural context and the work that needs to go on—which we have talked about in earlier debates—to make this live: in other words, to make the people responsible for delivering this through Ofcom, and those delivering it through companies, understand the wider context. In that sense, clearly we need the misinformation and disinformation side of it; that is part and parcel of the problems we have. But more important even than that is the need to address the functionality issues, to which we keep returning. This Bill is about risk. The process we will be going through is about risk assessment: making sure that the risks are understood by those who deliver services, and that the penalties which follow a failure of the risk assessment process deliver the change that we want to see in society.

However, it is not just about content. We keep saying that, but we do not see the changes around it. The best thing that could happen today would be if the Minister in responding accepted that these clauses are good—“Tick, we like them”—but agreed that we should not finalise them until we have seen the other half of the picture: what are the other risks that users of the services we have referred to and discussed are exposed to through the systemic design processes that are designed to take them in different directions? It is only when we see the two together that we will have a proper view of the concern.

I may have got this wrong, but the only person who can tell us is the Minister because he is the only one who really understands what is going on in the Bill. Am I not right in saying—I am going to say I am right; he will say no, I am not, but I am, aren’t I?—that we will get to Clauses 208 and 209, or the clauses that used to be 208 and 209, one of which deals with harms from content and the other deals with functionality? We may need to look at the way in which those are framed in order to come back and understand better how these lie and how they interact with that. I may have got the numbers wrong—the Minister is looking a bit puzzled, so I probably have—but the sense is that this will probably not come up until day 4. While I do not want to hold back the Bill, we may need to look at some of the issues that are hidden in the interstices of this set of amendments in order to make sure that the totality is better for those who have to use it.

My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.

The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible to do at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would excessively apply those duties and adversely affect children’s rights online.

I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.

To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.

I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.

My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom, they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, that does not extend the Equality Act to this part of the Bill; in particular, it does not create a new offence.

The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—

I clarify that what I said was aimed at protecting children. Somebody corrected me and asked, “Do you know that this says ‘abusive’?”—of course I do. What I suggested was that this is an area that is very contentious when we talk about introducing it to children. I am thinking about safeguarding children in this instance, not just copying and pasting a bit of an Act.

I take this opportunity to ask my noble friend the Minister a question; I want some clarity about this. Would an abusive comment about a particular religion—let us say a religion that practised cannibalism or a historical religion that sacrificed babies, as we know was the norm in Carthage—count as “priority harmful content”? I appreciate that we are mapping the language of the Equality Act, but are we creating a new offence of blasphemy in this Bill?

As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:

“Content which is abusive and which targets any of the following characteristics”.

It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.

I will keep this short, because I know that everyone wants to get on. Some would say that it is abusive to misgender someone; in the context of what is going on in sixth forms and schools, I suggest that this is a problem. It has been suggested that showing pictures of the Prophet Muhammad in an RE lesson (these are real-life events that happen offline) is abusive. I am suggesting that it is not as simple as repeating the word "abusive". This is a highly contentious and politicised area; I want the contention to end, but I think that this provision will exacerbate it, not help.

My noble friend seemed to confirm what I said. If I wish to be abusive—in fact, I do wish to be abusive—about the Carthaginian religious practice of sacrificing babies to Moloch, and I were to do that in a way that came to the attention of children, would I be caught as having created “priority harmful content”? My noble friend appears to be saying yes.

With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?

I will leap to the Minister’s defence on this occasion. I remind noble colleagues that this is not about individual pieces of content; there would have to be a consistent flow of such information being proffered to children before Ofcom would ask for a change.

My Lords, these words have obviously appeared in the Bill in one of those unverified sections; I have clicked the wrong button, so I cannot see them. Where does it say in Amendment 172 that it has to be a consistent flow?

May I attempt to assist the Minister? This is the "amber" point described by the noble Lord, Lord Allan: "priority content" is not the same as "primary priority content". Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into "Bod" or any of the other programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.

My Lords, we may be slipping back into a Committee-style conversation. My noble friend Lord Moylan rightly says that this is the first chance we have had to examine this provision, which is a concession wrung out of the Government in Committee. As the noble Lord, Lord Stevenson, says, sometimes that is the price your Lordships’ House pays for winning these concessions, but it is an important point to scrutinise in the way that my noble friend Lord Moylan and the noble Baroness, Lady Fox, have done.

I will try to reassure my noble friend and the noble Baroness. This relates to the definition of a characteristic, with which we began our debates today. To be a characteristic, it has to be possessed by a person; therefore, content that is abusive and targets any of the characteristics has to be harmful to an individual to meet the definition of harm. Further, it has to be material that would come to the attention of children in the way that the noble Baronesses who kindly leapt to my defence and added some clarity have set out. So my noble friend would be able to continue to criticise the polytheistic religions of the past and their tendencies to his heart's content, but there would be protections in place if what he was saying caused harm to an individual, targeting them on the basis of their race, religion or any of those other characteristics, and if that person were a child. That is what noble Lords wanted in Committee, and that is what the Government have brought forward.

My noble friend and others asked why mis- and disinformation were not named as their own category of priority harmful content to children. Countering mis- and disinformation where it intersects with the named categories of primary priority or priority harmful content, rather than treating it as its own issue, will ensure that children are protected from the mis- and disinformation narratives that present the greatest risk of harm to them. We recognise that mis- and disinformation is a broad and cross-cutting issue, and we therefore think the most appropriate response is to address directly the most prevalent and concerning harms associated with it; for example, dangerous challenges and hoax health advice that encourage children to self-administer harmful substances. I assure noble Lords that any further harmful mis- and disinformation content will be captured as non-designated content where it presents a material risk of significant harm to an appreciable number of children.

In addition, the expert advisory committee on mis- and disinformation, established by Ofcom under the Bill, will have a wide remit in advising on the challenges of mis- and disinformation and how best to tackle them, including how they relate to children. Noble Lords may also have seen that the Government have recently tabled amendments to update Ofcom's statutory media literacy duty. Ofcom will now be required to prioritise users' awareness of and resilience to misinformation and disinformation online. This includes children's awareness of and resilience to mis- and disinformation.

My noble friend Lady Harding of Winscombe talked about commercial harms. Harms exacerbated by the design and operation of a platform, that is, by its commercial model, are already covered in the Bill through the risk assessment and safety duties. Financial harm, as used in government Amendment 237, is dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This exemption ensures that there is no regulatory overlap.

The noble Lord, Lord Russell of Liverpool, elaborated on remarks made earlier by the noble Lord, Lord Stevenson of Balmacara, about their meeting looking at the incel movement, if it can be called that. I assure the noble Lord and others that Ofcom has a review and report duty and will be required to stay on top of changes in the online harms landscape and report to government on whether it recommends changes to the designated categories of content because of the emerging risks that it sees.

The noble Baroness, Lady Kidron, anticipated the debate we will have on Monday about functionalities and content. I am grateful to her for putting her name to so many of the amendments that we have brought forward. We will continue the discussions that we have been having on this point ahead of the debate on Monday. I do not want to anticipate that now, but I undertake to carry on those discussions.

In closing, I reiterate what I know is the shared objective across your Lordships’ House—to protect children from harmful content and activity. That runs through all the government amendments in this group, which cover the main categories of harmful content and activity that, sadly, too many children encounter online every day. Putting them in primary legislation enables children to be swiftly protected from encountering them. I therefore hope that noble Lords will be heartened by the amendments that we have brought forward in response to the discussion we had in Committee.

Amendment 31 agreed.