Online Safety Bill

Volume 829: debated on Tuesday 2 May 2023

Committee (4th Day)

Relevant document: 28th Report from the Delegated Powers Committee

Clause 11: Safety duties protecting children

Amendment 23

Moved by

23: Clause 11, page 10, line 9, at beginning insert “eliminate,”

Member’s explanatory statement

This amendment would require user-to-user services to eliminate identified risks to children from their platforms, in addition to mitigating and managing them.

My Lords, this large group of 33 amendments is concerned with preventing harm to children, by creating a legal requirement to design the sites and services that children will access in a way that will put their safety first and foremost. I thank my co-sponsors, the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lord, Lord Knight. First of all, I wish to do the most important thing I will do today: to wish the noble Baroness, Lady Kidron, a very happy birthday.

My co-sponsors will deal with some of the more detailed elements of the 33 amendments that we are dealing with. These will include safety duties, functionality and harm, and codes of practice. I am sure that the noble Lords, Lord Stevenson and Lord Knight, and the right reverend Prelate the Bishop of Oxford will speak to their own amendments.

I will provide a brief overview of why we are so convinced of the paramount need for a safety by design approach to protect children and remind digital companies and platforms, forcibly and legally, of their obligation to include the interests and safety of children as a paramount element within their business strategies and operating models. These sites and services are artificial environments. They were designed artificially and can be redesigned artificially.

In her testimony to the US Senate in October 2021, the Facebook whistleblower Frances Haugen put her finger on it rather uncomfortably when talking about her erstwhile employer:

“Facebook know that they are leading young users to anorexia content … Facebook’s internal research is aware that there are a variety of problems facing children on Instagram … they know that severe harm is happening to children”.

The research she was talking about was from, probably, three years ago.

On the first day of Committee, the noble Lord, Lord Allan, who is not with us today, used the analogy of the legally mandated and regulated safe design of aeroplanes and automobiles and the different regimes that cover their usage to illustrate some of our choices in dealing with regulation. We know why aeroplanes and cars have to be designed safely; we also know that either form of transportation could be used recklessly and dangerously, which is why we do not allow children to fly or drive them.

First, let us listen to the designers of these platforms and services through some research done by the 5Rights Foundation in July 2021. These are three direct quotes from the designers:

“Companies make their money from attention. Reducing attention will reduce revenue. If you are a designer working in an attention business, you will design for attention … Senior stakeholders like simple KPIs. Not complex arguments about user needs and human values … If a senior person gives a directive, say increase reach, then that’s what designers design for without necessarily thinking about the consequences”.

Companies know exactly what they need to do to grow and to drive profitability. However, they mostly choose not to consider or mitigate, let alone prioritise avoiding, some of the potentially harmful consequences. What they design and prioritise are strategies to maximise consumption, activity and profitability. They are very good at it.

Let us hear what the children say, remembering that some recent research indicates that 42% of five to 12 year-olds in this country use social media. The Pathways research project I referred to earlier worked closely with 21 children aged 12 to 18, who said: “We spend more time online than we feel we should, but it’s tough to stop or cut down”. “If we’re not on social media, we feel excluded”. “We like and value the affirmations and validations we receive”. “We create lots of visual content, much of it about ourselves, and we share it widely”. “Many of us are contacted by unknown adults”. “Many of us recognise that, through using social media, we have experienced body image and relationships problems”.

To test whether the children in this research project were accurately reporting their experiences, the project decided to place a series of child avatars—ghost children, in effect—on the internet, whose profiles very clearly stated that they were children.

They found—in many cases within a matter of hours of the profiles going online—proactive contact from strangers and rapid recommendations to engage more and more. If searches were conducted for eating disorders or self-harm, the avatars were quickly able to access content irrespective of their stated ages and clearly evident status as children. At the same time as they were being sent harmful or inappropriate content, they also received age-relevant advertising for school revision and for toys—the social media companies knew that these accounts were registered as children.

This research was done two years ago. Has anything improved since then? It just so happens that 5Rights has produced another piece of research which is about to be released, and which used the exact same technique—creating avatars to see what they would experience online. They used 10 avatars based on real children aged between 10 and 16, so what happened? For an 11 year-old avatar, Instagram was recommending images of knives with the caption “This is what I use to self-harm”; design features were leading children from innocent searches to harmful content very quickly.

I think any grandparents in the Chamber will be aware of an interesting substance known as “Slime”—a form of particularly tactile playdough which one’s grandchildren seem to enjoy. Typing in “Slime” on Reddit was one search, and one click, away from pornography; exactly the same thing happened on Reddit when the avatar typed in “Minecraft”, another very popular game with our children or grandchildren. A 15 year-old female avatar was private-messaged on Instagram by a user that she did not follow—an unknown adult who encouraged her to click through to pornographic content on Telegram, another instant messaging service. On the basis of this evidence, it appears that little or nothing has changed; it may even have got slightly worse.

By an uncomfortable coincidence, last week, Meta, the parent company of Facebook and Instagram, published better than expected results and saw its market value increase by more than $50 billion in after-hours trading. Mark Zuckerberg, the founder of Meta, proudly announced that Meta is pouring investment into artificial intelligence tools to make its platform more engaging and its advertising more effective. Of particular interest and concern, given the evidence of the avatars, was his announcement that since the introduction of Reels, a short-form video feed designed specifically to respond to competition from TikTok, its AI-driven recommendations had boosted the average time people spend on Instagram by 24%.

To return to the analogy of planes and cars used by the noble Lord, Lord Allan, we are dealing here with planes and cars in the shape of platforms and applications which we know are flawed in their design. They are not adequately designed for safety, and we know that they can put users, particularly children and young people, in the way of great harm, as many grieving families can testify.

In conclusion, our amendments propose that companies must design digital services that cater for the vulnerabilities, needs, and rights of children and young people by default; children’s safety cannot and must not be an afterthought or a casualty of their business models. We are asking for safety by design to protect children to become the mandatory standard. What we have today is unsafe design by default, driven by commercial strategies which can lead to children becoming collateral damage.

Given that it is the noble Baroness’s birthday, I am sure we can feel confident that the Minister will have a positive tone when he replies. I beg to move.

It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.

As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach of the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill mean that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.

Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as

“anything communicated by means of an internet service”,

but the examples in the Bill, including

“written material … music and data of any description”,

once again fail to include design features that are so often the key drivers of harm to children.

On day three of Committee, the Minister said:

“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]

However, in looking at the child safety duties, Clause 11(5) says:

“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”,

but subsection (14) says:

“The duties set out in subsections (3) and (6)”—

which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—

“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.

I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.

Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that

“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.

Then, he said that

“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.

His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.

I turn now to Amendments 28 and 82, which cut the reference to the

“size and capacity of the provider of the service”

in deeming what measures are proportionate. We have already discussed that small is not safe. Platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale towards creating loopholes for smaller services.

Amendment 138 seeks to reverse the exemption in Clause 54 of financial harms. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.

By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. These amendments simply combine the list of functionalities that must be risk-assessed and make them apply to any regulated service. I cannot see a single argument against them: it cannot be the Government’s intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.

Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Families for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words

“the volume of the content and the frequency with which the content is accessed”

to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.

My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefit and the harm that online platforms can bring our children are not just about the content. They are about the functionality: the way these platforms work; the way they suck us in. They do give us joy but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word “content”.

I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—112, 122ZA, 122ZB and 122ZC.

I am afraid I may well have done.

That list shows your Lordships some of the challenges we all have with the Bill. All these amendments seek to ensure that the codes of practice relating to child safety are binding. Such codes should be principles-based and flexible to allow companies to take the most appropriate route of compliance, but implementing these codes should be mandatory, rather than, as the Bill currently sets out, platforms being allowed to use “alternative measures”. That is what all these amendments do—they do exactly the same thing. That was a clear and firm recommendation from the joint scrutiny committee. The Government’s response to that joint scrutiny committee report was really quite weak. Rather than rehearse the joint scrutiny committee’s views, I will rehearse the Government’s response and why it is not good enough to keep the Bill as it stands.

The first argument the Government make in their response to the joint scrutiny report is that there is no precedent for mandatory codes of conduct. But actually there are. There is clear precedent in child protection. In the physical world, the SEND code for how we protect some of our most vulnerable children is mandatory. Likewise, in the digital world, the age-appropriate design code, which we have mentioned many a time, is also mandatory. So there is plenty of precedent.

The second concern—this is quite funny—was that stakeholders were concerned about having multiple codes of conduct because it could be quite burdensome on them. Well, forgive me for not crying too much for these enormous tech companies relative to protecting our children. The burden I am worried about is the one on Ofcom. This is an enormous Bill, which places huge amounts of work on a regulator that already has a very wide scope. If you make codes of conduct non-mandatory, you are in fact making the work of the regulator even harder. The Government themselves in their response say that Ofcom has to determine what the minimum standards should be in these non-binding codes of practice. Surely it is much simpler and more straightforward to make these codes mandatory and, yes, to add potentially a small additional burden to these enormous tech companies to ensure that we protect our children.

The third challenge is that non-statutory guidance already looks as if it is causing problems in this space. On the video-sharing platform regime, which is non-mandatory, Ofcom has already said that in its first year of operation it has

“seen a large variation in platforms’ readiness to engage with Ofcom”.

All that will simply make it harder and harder, so the burden will lie on this regulator—which I think all of us in this House are already worried is being asked to do an awful lot—if we do not make it very clear what is mandatory and what is not. The Secretary of State said of the Bill that she is

“determined to put these vital protections for … children … into law as quickly as possible”.

A law that puts in place a non-mandatory code of conduct is not what parents across the country would expect from that statement from the Secretary of State. People out there—parents and grandparents across the land—would expect Ofcom to be setting some rules and companies to be required to follow them. That is exactly what we do in the physical world, and I do not understand why we would not want to do it in the digital world.

Finally—I apologise for having gone on for quite a long time—I will very briefly talk specifically to Amendment 32A, in the name of the noble Lord, Lord Knight, which is also in this group. It is a probing amendment which looks at how the Bill will address, and require Ofcom and participants to have due regard to, VPNs: the ability of our savvy children—I am the mother of two teenage girls—to get round all this by using a VPN to access the content they want. This is an important amendment and I am keen to hear what my noble friend the Minister will say in response. Last week, I spoke about my attempts to find out how easy it would be for my 17 year-old daughter to access pornography on her iPhone. I spoke about how I searched in the App Store on her phone and found that immediately a whole series of 17-plus-rated apps came up that were pornography sites. What I did not mention then is that with that—in fact, at the top of the list—came a whole series of VPN apps. Just in case my daughter was naive enough to think that she could simply click through and watch it—and that Apple was right that 17 year-olds are allowed to watch pornography, which obviously they are not—the App Store was also offering her an easy route to access it through a VPN. That is not about content but functionality, and we need to understand properly why this bundle of amendments is so important.

My Lords, I was not going to speak on this group, but I was provoked into offering some reflections on the speech by the noble Lord, Lord Russell of Liverpool, especially his opening remarks about cars and planes, which he said were designed to be safe. He did not mention trains, about which I know something as well, and which are also designed to be safe. These are a few initial reflective points. They are designed in very different ways. An aeroplane is designed never to fail; a train is designed so that if it fails, it will come to a stop. They are two totally different approaches to safety. Simply saying that something must be designed to be safe does not answer questions; it opens questions about what we actually mean by that. The noble Lord went on to say that we do not allow children to drive cars and fly planes. That is absolutely true, but the thrust of his amendment is that we should design the internet so that it can be driven by children and used by children—so that it is designed for them, not for adults. That is my problem with the general thrust of many of these amendments.

A further reflection that came to mind as the noble Lord spoke was on a book of great interest that I recommend to noble Lords. It is a book by the name of Risk, written in 1995 by Professor John Adams, then professor of geography at University College London. He is still an emeritus professor of geography there. It was a most interesting work on risk. First, it reflected how little we actually know about many of the things whose risk we are trying to assess.

More importantly, he went on to say that people have an appetite for risk. That appetite for risk—that risk budget, so to speak—changes over the course of one’s life: one has much less appetite for risk when one gets to a certain age than perhaps one had when one was young. I have never bungee jumped in my life, and I think I can assure noble Lords that the time has come when I can say I never shall, but there might have been a time when I was younger when I might have flung myself off a cliff, attached to a rubber band and so forth—noble Lords may have done so. One has an appetite for risk.

The interesting thing that he went on to develop from that was the notion of risk compensation: that if you have an appetite for risk and your opportunities to take risks are taken away, all you do is compensate by taking risks elsewhere. So a country such as New Zealand, which has some of the strictest cycling safety laws, also has a very high incidence of bungee jumping among the young; as they cannot take risks on their bicycles, they will find ways to go and do it elsewhere.

Although these reflections are not directly germane to the amendments, they are important as we try to understand what we are seeking to achieve here, which is a sort of hermetically sealed absence of risk for children. I do not think it will work. I said at Second Reading that I thought the flavour of the debate was somewhat similar to a late medieval conclave of clerics trying to work out how to mitigate the harmful effects of the invention of movable type. That did not work either, and I think we are in a very similar position today as we discuss this.

There is also the question of harm and what it means. While the examples being given by noble Lords are very specific and no doubt genuinely harmful, and are the sorts of things that we should like to stop, the drafting of the amendments, using very vague words such as “harm”, is dangerous overreach in the Bill. To give just one example, for the sake of speed, when I was young, administering the cane periodically was thought good for a child in certain circumstances. The mantra was, “Spare the rod and spoil the child”, though I never heard it said. Nowadays, we would not think it morally or psychologically good to do physical harm to a child. We would regard it as an unmitigated harm and, although not necessarily banned or illegal, it is something that—

My Lords, I respond to the noble Lord in two ways. First, I ask him to reflect on how the parents of the children who have died through what the parents would undoubtedly view as serious and unbearable harm would feel about his philosophical ruminations. Secondly, as somebody who has the privilege of being a Deputy Speaker in your Lordships’ House, it is incumbent and germane for us all to focus on the amendment in question and stay on it, to save time and get through the business.

Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—

If my noble friend could, would he roll back the health and safety regulations for selling toys, in the same way that he seems so happy to have no health and safety regulations for children’s access to digital toys?

My Lords, if the internet were a toy, aimed at children and used only by children, those remarks would of course be very relevant, but we are dealing with something of huge value and importance to adults as well. It is the lack of consideration of the role of adults, the access for adults and the effects on freedom of expression and freedom of speech, implicit in these amendments, that cause me so much concern.

I seem to have upset everybody. I will now take issue with and upset the noble Baroness, Lady Benjamin, with whom I have not engaged on this topic so far. At Second Reading and earlier in Committee, she used the phrase, “childhood lasts a lifetime”. There are many people for whom this is a very chilling phrase. We have an amendment in this group—a probing amendment, granted—tabled by the noble Lord, Lord Knight of Weymouth, which seeks to block access to VPNs as well. We are in danger of putting ourselves in the same position as China, with a hermetically sealed national internet, attempting to put borders around it so that nobody can breach it. I am assured that even in China this does not work and that clever and savvy people simply get around the barriers that the state has erected for them.

Before I sit down, I will redeem myself a little, if I can, by giving some encouragement to the noble Baroness, Lady Kidron, on Amendments 28 and 32—although I think the amendments are in the name of the noble Lord, Lord Russell of Liverpool. These amendments, if we are to assess the danger posed by the internet to children, seek to substitute an assessment of the riskiness of the provider for the Government’s emphasis on the size of the provider. As I said earlier in Committee, I do not regard size as being a source of danger. When it comes to many other services—I mentioned that I buy my sandwich from Marks & Spencer as opposed to a corner shop—it is very often the bigger provider I feel is going to be safer, because I feel I can rely on its processes more. So I would certainly like to hear how my noble friend the Minister responds on that point in relation to Amendments 28 and 32, and why the Government continue to put such emphasis on size.

More broadly, in these understandable attempts to protect children, we are in danger of using language that is far too loose and of having an effect on adult access to the internet which is not being considered in the debate—or at least has not been until I have, however unwelcomely, raised it.

My Lords, I assure your Lordships that I rise to speak very briefly. I begin by reassuring my noble friend Lord Moylan that he is loved in this Chamber and outside. I was going to say that he is the grit in the oyster that ensures that a consensus does not establish itself and that we think hard about these amendments, but I will revise that and say he is now the bungee jumper in our ravine. I think he often makes excellent and worthwhile points about the scope and reach of the Bill and the unintended consequences. Indeed, we debated those when we debated the amendments relating to Wikipedia, for example.

Obviously, I support these amendments in principle. The other reason I wanted to speak was to wish the noble Baroness, Lady Kidron—Beeban—a happy birthday, because I know that these speeches will be recorded on parchment bound in vellum and presented to her, but also to thank her for all the work that she has done for many years now on the protection of children’s rights on the internet. It occurred to me, as my noble friend Lady Harding was speaking, that there were a number of points I wanted to seek clarity on, either from the Minister or from the proponents of the amendments.

First, the noble Baroness, Lady Harding, mentioned the age-appropriate design code, which was a victory for the noble Baroness, Lady Kidron. It has, I think, already had an impact on the way that some sites that are frequented by children are designed. I know, for instance, that TikTok—the noble Baroness will correct me—prides itself on having made some changes as a result of the design code; for example, its algorithms are able, to a certain extent, to detect whether a child is under 13. I know anecdotally that children under 13 sometimes do have their accounts taken away; I think that is a direct result of the changes brought in by the age-appropriate design code.

I would like to understand how these amendments, and the issue of children’s rights in this Bill, will interact with the age-appropriate design code, because none of us wants the confetti of regulations that either overlap or, worse, contradict themselves.

Secondly, I support the principle of functionality. I think it is a very important point that these amendments make: the Bill should not be focused solely on content but should take into account that functionality leads to dangerous content. That is an important principle on which platforms should be held to account.

Thirdly, going back to the point about the age-appropriate design code, the design of websites is extremely important and should be part of the regulatory system. Those are the points I wanted to make.

In relation to how my noble friend Lord Moylan is approaching the Bill, I would say this: having been a Minister when the British Government—and, indeed, other Governments—had no power at all, I found it very telling when the then Prime Minister threatened Google with legislation on the issue of child abuse images, saying, “If you do not do something, I will legislate”.

At that time, I was on the tech side of the argument. Google went from saying, “It is impossible to do anything” to identifying 130,000 phrases that people might type into search engines when searching for child abuse images, which, in theory—I have not tried this myself, I hasten to add—would come up with no return and, indeed, a warning that the person in question was searching for those images.

Again, I say to my noble friend Lord Moylan—who I encourage to keep going with his scepticism about the Bill; it is important—that it is a bit of a dead end at any point in his argument to compare us with China. That is genuinely comparing apples with oranges. When people were resisting regulation in this sphere, they would always say, “That’s what the Chinese want”. We have broadcasting regulation and other forms of health and safety regulation. It is not the mark of an autocratic or totalitarian state to have regulation; platforms need to be held to account. I simply ask the proponents of the amendments to make it clear as they proceed how this fits in with existing regulations, such as the age-appropriate design code.

My Lords, I want, apart from anything else, to speak in defence of philosophical ruminations. The only way we can scrutinise the amendments in Committee is to do a bit of philosophical rumination. We are trying to work out what the amendments might mean in terms of changing the Bill.

I read these amendments, noted their use of “eliminate”—we have to “eliminate” all risks—and wondered what that would mean. I do not want to feel that I cannot ask these kinds of difficult questions for fear that I will offend a particular group or that it would be insensitive to a particular group of parents. It is difficult but we are required as legislators to try to understand what each other is trying to change, or how we are going to try to change the law.

I say to those who have put “eliminate” prominently in a number of these amendments that it is impossible to eliminate all risks to children—is it not?—if they are to have access to the online world, unless you ban them from the platforms completely. Is “eliminate” really helpful here?

Previously in Committee, I talked a lot about the potential dangers, psychologically and with respect to development, of overcoddling young people, of cotton wool kids, and so on. I noted an article over the weekend by the science journalist Tom Chivers, which included arguments from the Oxford Internet Institute and various psychologists that the evidence on whether social media is harmful, particularly for teenagers, is ambiguous.

I am very convinced by the examples brought forward by the noble Baroness, Lady Kidron—and I too wish her a happy birthday. We all know about the targeting of young people and so forth, but I am also aware of the positives. I always try to balance these things out and make sure that we do not deny young people access to the positives. In fact, I found myself cheering at the next group of amendments, which is unusual. First, they depend on whether you are four or 14—in other words, you have to be age-specific—and, secondly, they recognise that we do not want to pass anything in the Bill that actually denies children access to either their own privacy or the capacity to know more.

I also wanted to explore a little the idea of expanding the debate away from content to systems, because this is something that I think I am not quite understanding. My problem is that moving away from the discussion on whether content is removed or accessible, and focusing on systems, does not mean that content is not in scope. My worry is that the systems will have an impact on what content is available.

Let me give some examples of things that can become difficult if we think that we do not want young people to encounter violence and nudity—which makes it seem as though we know what we are talking about when we talk about “harmful”. We will all recall that, in 2018, Facebook removed content from the Anne Frank Centre posted by civil rights organisations because it included photographs of the Holocaust featuring undressed children among the victims. Facebook apologised afterwards. None the less, my worry is about these kinds of things happening. Another example, in 2016, was the removal of the Pulitzer Prize-winning photograph “The Terror of War”, featuring fleeing Vietnamese napalm victims in the 1970s, because the system thought it was something dodgy, given that the photo was of a naked child fleeing.

I need to understand how system changes will not deprive young people of important educational information such as that. That is what I am trying to distinguish. The point made by the noble Lord, Lord Moylan, about “harmful” not being defined—I have endlessly gone on about this, and will talk more about it later—is difficult because we think that we know what we mean by “harmful” content.

Finally, on the amendments requiring compliance with Ofcom codes of practice, that would give an extraordinary amount of power to the regulator and the Secretary of State. Since I have been in this place, people have rightly drawn my attention to the dangers of delegating power to the Executive or away from any kind of oversight—there has been fantastic debate and discussion about that. It seems to me that these amendments advocate delegated powers being given to the Secretary of State and Ofcom, an unelected body—the Secretary of State could amend for reasons of public policy in order to protect children—and this is to be put through the negative procedure. In any other instance, I would have expected outcry from the usual suspects, but, because it involves children, we are not supposed to object. I worry that we need to have more scrutiny of such amendments and not less, because in the name of protecting children unintended consequences can occur.

I want to answer the point that amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument and the Minister made some warm noises in relation to putting harms to children in the Bill. There is some alignment among many people in the Chamber that we, in Parliament, would like to determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.

On the issue of the system versus the content, I am not sure that this is the exact moment, but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that neither Ofcom, the Secretary of State nor Parliament, but only the tech sector, has a say in what those unintended consequences are. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.

My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.

I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.

As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.

Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.

The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.

My Lords, I support this group of amendments. Last week, I was lucky—that is not necessarily the right word—to participate in a briefing organised by the noble Lord, Lord Russell of Liverpool, with the 5Rights Foundation on its recent research, which the noble Lord referred to. As the mother of a 13 year-old boy, I came away wondering why on earth you would not want to ensure safety by design for children.

I am aware from my work with disabled children that we know, as Ofcom knows from its own research, that children—or indeed anyone with a long-term health impact or a disability—are far more likely to encounter and suffer harm online. As I say, I struggle to see why you would not want to have safety by design.

This issue must be seen in the round. In that briefing we were taken through how quickly you could get from searching for something such as “slime” to extremely graphic pornographic content. As your Lordships can imagine, I went straight back to my 13 year-old son and said, “Do you know about slime, and where have you seen it?” He said, “Yes, Mum, I’ve watched it on YouTube”. That echoes the point made by the noble Baroness, Lady Kidron—to whom I add my birthday wishes—that these issues have to be seen in the round because you do not just consume content; you can search on YouTube, shop on Google, search on Amazon and all the rest of it. I support this group of amendments.

I too wish my noble friend Lady Kidron a happy birthday.

I will speak to Amendment 261. Having sat through the Communications Committee’s inquiries on regulating the internet, it seemed to me that the real problem was the algorithms and the way they operated. We have heard that again and again throughout the course of the Bill. It is no good worrying just about the content, because we do not know what new services will be created by technology. This morning we heard on the radio from the Google AI expert, who said that we have no idea where AI will go or whether it will become cleverer than us; what we need to do is to keep an eye on it. In the Bill, we need to make sure that we are looking at the way technology is being developed and the possible harms it might create. I ask the Minister to include that in his future-proofing of the Bill, because, in the end, this is a very fast-moving world and ecosystem. We all know that what is present now in the digital world might well be completely changed within a few years, and we need to remain cognisant of that.

My Lords, we have already had some very significant birthdays during the course of the Bill, and I suspect that, over many more Committee days, there will be many more happy birthdays to celebrate.

This has been a fascinating debate and the Committee has thrown up some important questions. On the second day, we had a very useful discussion of risk which, as the noble Lord, Lord Russell, mentioned, was prompted by my noble friend Lord Allan. In many ways, we have returned to that theme this afternoon. The noble Baroness, Lady Fox, who I do not always agree with, asked a fair question. As the noble Baroness, Lady Kidron, said, it is important to know what harms we are trying to prevent—that is how we are trying to define risk in the Bill—so that is an absolutely fair question.

The Minister has shown flexibility. Sadly, I was not able to be here for the previous debate, and it is probably because I was not that he conceded the point and agreed to put children’s harms in the Bill. That takes us a long way further, and I hope he will demonstrate that kind of flexibility as we carry on through the Bill.

The noble Lord, Lord Moylan, and I have totally different views about what risk it is appropriate for children to face. I am afraid that I absolutely cannot share his view that children should be exposed to this level of risk. I do not believe it is about eliminating risk—I do not see how you can—but the Bill should be about preventing online risk to children; it is the absolute core of the Bill.

As the noble Lord, Lord Russell, said, the Joint Committee heard evidence from Frances Haugen about the business model of the social media platforms. We listened to Ian Russell, the father of Molly, talk about the impact of an unguarded internet on his daughter. It is within the power of the social media companies to do something about that; this is not unreasonable.

I was very interested in what the noble Viscount, Lord Colville, said. He is right that this is about algorithms, which, in essence, are what we are trying to get to in all the amendments in this really important group. It is quite possible to tackle algorithms if we have a requirement in the Bill to do so, and that is why I support Amendment 261, which tries to address that.

However, a lot of the rest of the amendments are trying to do exactly the same thing. There is a focus not just on moderating harmful content but on the harmful systems that make digital services systematically unsafe for children. I listened with great interest to what the noble Lord, Lord Russell, said about the 5Rights research which he unpacked. We tend to think that media platforms such as Reddit are relatively harmless but that is clearly not the case. It is very interesting that the use of avatars is becoming quite common in the advertising industry to track where advertisements are ending up—sometimes, on pornography sites. It is really heartening that an organisation such as 5Rights has been doing that and coming up with its conclusions. It is extremely useful for us as policymakers to see the kinds of risks that our children are undertaking.

We were reminded about the origins—way back, it now seems—of the Carnegie duty of care. In a sense, we are trying to make sure that that duty of care covers the systems. We have talked about the functionality and harms in terms of risk assessment, about the child safety duties and about the codes of practice. All those need to be included within this discussion and this framework today to make sure that that duty of care really sticks.

I am not going to go through all the amendments. I support all of them: ensuring that functionalities are covered for both types of regulated service, and the duty to consider all harms and not just harmful content. It is absolutely not just about the content but about making sure that regulated services have a duty to mitigate the impact of harm in general, not just harms stemming from content.

The noble Baroness, Lady Harding, made a terrific case, which I absolutely support, for making sure that the codes of practice are binding and principle based. At the end of the day, that could be the most important amendment in this group. I must admit that I was quite taken with her description of the Government’s response, which was internally contradictory. It was a very weak response to what I, as a member of the Joint Committee, thought was a very strong and clear recommendation about minimum standards.

This is a really important group of amendments and it would not be a difficult concession for the Government to make. They may wish to phrase things in a different way but we must get to the business case and the operation of the algorithms; otherwise, I do not believe this Bill is going to be effective.

I very much take on board what the noble Viscount said about looking to the future. We do not know very much about some of these new generative AI systems. We certainly do not know a great deal about how algorithms within social media companies operate. We will come, no doubt, to later amendments on the ability for researchers and others to find out more, but transparency was one of the things our Joint Committee was extremely keen on, and this is a start.

My Lords, I too agree that this has been a really useful and interesting debate. It has featured many birthday greetings to the noble Baroness, Lady Kidron, in which I obviously join. The noble Lord, Lord Moylan, bounced into the debate, tested the elasticity of the focus of the group, and bounced out again. Like the noble Lord, Lord Clement-Jones, I was particularly struck by the speech from the noble Baroness, Lady Harding, on the non-mandatory nature of the codes. Her points about reducing Ofcom’s workload, and about mandatory codes having precedent, were really significant and I look forward to the Minister’s response.

If I have understood it correctly, the codes will be generated by Ofcom, and the Secretary of State will then table them as statutory instruments—so they will be statutory, non-mandatory codes, but with statutory penalties. Trying to unravel that in my mind was a bit of a thing as I was sitting there. Undoubtedly, we are all looking forward to the Minister’s definition of harm, which he promised us at the previous meeting of the Committee.

I applaud the noble Lord, Lord Russell, for the excellent way in which he set out the issues in this grouping and—along with the Public Bill Office—for managing to table these important amendments. Due to the Bill’s complexity, it is an achievement to get the relatively simple issue of safety by design for children into amendments to Clause 10 on children’s risk assessment duties for user-to-user services; Clause 11 on the safety duties protecting children; and the reference to risk assessments in Clause 19 on record-keeping. There is a similar set of amendments applying to search; to the duties in Clause 36 on codes of practice duties; to Schedule 4 on the content of codes of practice; and to Clause 39 on the Secretary of State’s powers of direction. You can see how complicated the Bill is for those of us attempting to amend it.

What the noble Lord and his amendments try to do is simple enough. I listened carefully to the noble Baroness, Lady Fox, as always. The starting point is, when designing, to seek to eliminate harm. That is not to say that they will eliminate all potential harms to children, but the point of design is to seek to eliminate harms if you possibly can. It is important to be clear about that. Of course, it is not just the content but the systems that we have been talking about, and ensuring that the codes of practice that we are going to such lengths to legislate for are stuck to—that is the point made by the noble Baroness, Lady Harding—relieving Ofcom of the duty to assess all the alternative methods. We certainly support the noble Lord, Lord Russell, in his amendments. They reinforce that it is not just about the content; the algorithmic dissemination, in terms of volume and context, is really important, especially as algorithms are dynamic—they are constantly changing in response to the business models that underpin the user-to-user services that we are debating.

The business models want to motivate people to be engaged, regardless of safety in many ways. We have had discussion of the analogy on cars and planes from the noble Lord, Lord Allan. As I recall, in essence he said that in this space there are some things that you want to regulate like planes, to ensure that there are no accidents, and some where you trade off freedom and safety, as we do with the regulation of cars. In this case, it is a bit more like regulating for self-driving cars; in that context, you will design a lot more around trying to anticipate all the things that humans when driving will know instinctively, because they are more ethical individuals than you could ever programme an AI to be when driving a car. I offer that slight adjustment, and I hope that it helps the noble Lord, Lord Moylan, when he is thinking about trains, planes and automobiles.

In respect of the problem of the business models and their prioritising of engagement over safety, I had contact this weekend and last week from friends much younger than I am, who are users of Snap. I am told that there is an AI chatbot on Snap, which I am sure is about engaging people for longer and collecting more data so that you can engage them even longer and, potentially, collect data to drive advertising. But you can pay to get rid of that chatbot, which is the business model moving somewhere else as and when we make it harder for it to make money as it is. Snap previously had location sharing, which you had to turn off. It created various harms and risks for children, in that their location was being shared with other people without them necessarily authorising it. We can all see how that could create issues.

Does the noble Lord have any reflections, talking about Snap, as to how the internet has changed in our time? It was once really for adults, when it was on a PC and it was only adults who had access to it. There has, of course, been a huge explosion in child access to the internet because of the mobile phone—as we have heard, two-thirds of 10 year-olds now have a mobile phone—and an app such as Snap now has a completely different audience from the one it had five or 10 years ago. Does the noble Lord have any reflections on what the consequences of the explosion of children’s access to applications such as Snap have been for those thinking about the harms and protection of children?

I am grateful to the noble Lord. In many ways, I am reminded of the article I read in the New York Times this weekend and the interview with Geoffrey Hinton, the now former chief scientist at Google. He said that as companies improve their AI systems, they become increasingly dangerous. He said of AI technology:

“Look at how it was five years ago and how it is now. Take the difference and propagate it forwards. That’s scary”.

Yes, the huge success of the iPhone, of mobile phones and all of us, as parents, handing our more redundant iPhones on to our children, has meant that children have huge access. We have heard the stats in Committee around the numbers who are still in primary school and on social media, despite the terms and conditions of those platforms. That is precisely why we are here, trying to get things designed to be safe as far as is possible from the off, but recognising that it is dynamic and that we therefore need a regulator to keep an eye on the dynamic nature of these algorithms as they evolve, ensuring that they are safe by design as they are being engineered.

My noble friend Lord Stevenson has tabled Amendment 27, which looks at targeted advertising, especially that which requires data collection and profiling of children. In that, he has been grateful to Global Action Plan for its advice. While advertising is broadly out of scope of the Bill, apart from in respect of fraud, it is significant for the Minister to reflect on the user experience for children. Whether it is paid or organic content, it is pertinent in terms of their safety as children and something we should all be mindful of. I say to the noble Lord, Lord Vaizey, that as I understand it, the age-appropriate design code does a fair amount in respect of the data privacy of children, but this is much more about preventing children encountering the advertising in the first place, aside from the data protections that apply in the age-appropriate design code. But the authority is about to correct me.

Just to add to what the noble Lord has said, it is worth noting that we had a debate, on Amendment 92, about aligning the age-appropriate design code’s “likely to be accessed” test with the Bill—the very important issue that the noble Lord, Lord Vaizey, raised about alignment of these two regimes. I think we can say that these are kissing cousins, in that they take a by-design approach. The noble Lord is completely right that the scope of the Bill is much broader than data protection only, but they take the same approach.

I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.

Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. The platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that a user genuinely is a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to detect whether someone is landing on their site via a VPN or otherwise? To my mind, the anecdote that the noble Baroness, Lady Harding, related, about how Apple's App Store algorithm had pushed VPNs to someone searching for pornography, reinforces the need for app stores to come into scope, so that we can get some of that age filtering at the distribution point, rather than relying solely on the platforms.

Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.

My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.

Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.

The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.

I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.

Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.

It is part of the philosophical ruminations that we have had, but the point here is that elimination is not possible through design or through any drafting of the legislation. I will come on to say a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.

Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as

“anything communicated by means of an internet service”.

Under this definition, in essence, all communication and activity is facilitated by content.

I hope that the Minister has in his brief a response to the noble Baroness’s point about Clause 11(14), which, I must admit, comes across extraordinarily in this context. She quoted it, saying:

“The duties set out … are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.

Is not that exception absolutely at the core of what we are talking about today? It is surely therefore very difficult for the Minister to say that this applies in a very broad way, rather than purely to content.

I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.

I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.

I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.

Taking it out in the way that the amendment suggests throws up that risk. I am sure that that is not the intention of the noble Lord or the noble Baroness in tabling it, but it is a risk of the drafting, which requires some further thought.

Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. It would not, however, be feasible for providers to fulfil duties in relation to content which is harmful only by the manner of its dissemination: that is, content which may not meet the definition of content which is harmful to children in isolation, but which may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.

The Bill requires providers to specifically consider as part of their risk assessments how algorithms could affect children’s exposure to illegal content and content which is harmful to children on their service. Service providers will need specifically to consider the harm from content that arises from the manner of dissemination—for example, content repeatedly sent to someone by a person or persons, which is covered in Clause 205(3)(c). Providers will also need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet their illegal content and child safety duties. Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties. That includes the power to require information from providers about the operation of their algorithms.

Can the Minister assure us that he will take another look at this between Committee and Report? He has almost made the case for this wording to be taken out—he said that it is already covered by a whole number of different clauses in the Bill—but it is still here. There is still an exception which, if the Minister is correct, is highly misleading: it means that you have to go searching all over the Bill to find a way of attacking the algorithm, essentially, and the way that it amplifies, disseminates and so on. That is what we are trying to get at: how to address the very important issue not just of content but of the way that the algorithm operates in social media. In the light of what the Minister has said, that exception seems misleading indeed.

I do not think so, but I will certainly look at it again, and I am very happy to speak to the noble Lord as I do. My point is that it would not be workable or proportionate for a provider to prevent or protect all children from encountering every single instance of the sort of content that I have just outlined, which would be the effect of these amendments. I will happily discuss that with the noble Lord and others between now and Report.

Amendment 27, by the noble Lord, Lord Stevenson, seeks to add a duty to prevent children encountering targeted paid-for advertising. As he knows, the Bill has been designed to tackle harm facilitated through user-generated content. Some advertising, including paid-for posts by influencers, will therefore fall under the scope of the Bill. Companies will need to ensure that systems for targeting such advertising content to children, such as the use of algorithms, protect them from harmful material. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. The Bill is designed to reduce harm on services which host user-generated content, whereas online advertising poses a different set of problems, with different actors. The Government are taking forward work in this area through the online advertising programme, which will consider the full range of actors and sector-appropriate solutions to those problems.

I understand the Minister’s response, and I accept that there is a parallel stream of work that may well address this. However, we have been waiting for the report from the group that has been looking at that for some time. Rumours—which I never listen to—say that it has been ready for some time. Can the Minister give us a timescale?

I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.

Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.

Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.

Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.

I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used—we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?

This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.

The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.

Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.

Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.

As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.

I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.

My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—

My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.

I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.

Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.

Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.

Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.

As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.

Amendments 291, 292, and 293 seek to ensure that companies’ child safety duties apply to a broader range of functionalities which can facilitate harm online. The current list of functionalities in the Bill is not exhaustive. Services will therefore need to assess the risk from any feature or functionality of their service which enables user interaction and could cause harm to users.

The points raised in these amendments are covered already in the Bill in the places I have set out. I will consult the official record of this debate to see whether there are any areas which I have not followed up, but I invite noble Lords not to press their amendments in this group.

My Lords, I thank the Minister for his response. I think the entire Chamber will be thankful that I do not intend to respond in any great detail to almost one hour and three-quarters of debate on this series of amendments—I will just make a few points and suggestions.

The point that the noble Baroness made at the beginning about understanding the design and architecture of the systems and processes is fundamental, both for understanding why they are causing the sorts of harm that they are at the moment and for trying to ensure that they are designed better in future than they have been to date. Clearly, they are seriously remiss in the harms that they are inflicting on a generation of young people.

On the point made by the noble Baroness, Lady Harding, about trying to make Ofcom’s job easier—I can see the noble Lord, Lord Grade, in the corner—I would hope and anticipate that anything we could suggest that would lead the Government to make Ofcom’s job slightly easier and clearer would be very welcome. The noble Lord appears to be making an affirmatory gesture, so I will take that as a yes.

I say to the noble Lord, Lord Moylan, that I fully understand the importance of waving the flag of liberty and free speech, and I acknowledge its importance. I also acknowledge the always-incipient danger of unintentionally preventing things from happening that can and should happen when you are trying to make things safer and prevent harm. Trying to get the right balance is extraordinarily difficult, but I applaud the noble Lord for standing up and saying what he said. If one were to judge the balance of the contributions here as a very rough opinion poll, the noble Lord might find himself in the minority, but that does not necessarily mean that he is wrong, so I would encourage him to keep contributing.

I sympathise with the noble Baroness, Lady Fox, in trying to find the right balance; it is something that we are all struggling to do. One of the great privileges we have in this House is that we have the time to do it in a manner which is actively discouraged in the other place. Even if we go on a bit, we are talking about matters which are very important—in particular, the pre-legislative scrutiny committee was able to cover them in greater detail than the House of Commons was able to do.

The noble Lord, Lord Clement-Jones, was right. In the same way as they say, “Follow the money”, in this case it is “follow the algorithms”, because it is the algorithms which drive the business model.

On the points made by the noble Lord, Lord Knight, regarding the New York Times article about Geoffrey Hinton, one of the architects of AI at Google, I recommend that all your Lordships read it to see somebody who has been at the forefront of developing artificial intelligence. Rather as with a character in a Jules Verne novel suddenly aghast at what they have created—Frankenstein comes to mind—it makes one pause for thought. Even as we are talking about these things, AI is racing ahead like a greyhound in pursuit of a very fast rabbit, and there is no way that we will be able to catch up.

While I thank the Minister for his reply, I think there is a real point here about how complex this Bill is to follow and understand. When we debated some of the amendments last week, the noble Baroness, Lady Harding, spoke about the train journey on which she tried to interrogate and interpret the different parts of the Bill, following the trail to understand what was going on, and became so involved that she missed her station. Indeed, the way in which the Minister had to point to all the different points of the compass, so to speak, both within the Bill and outside it, in many of his answers to these amendments suggests to me that the Bill team is finding it challenging to respond to some of them. It is like filling in one of those diagrams where you join the dots and cannot quite see what it is until you have nearly finished. I find it slightly disturbing if the Bill team and some of the officials appear to be having a challenging time interpreting, understanding and explaining some of the points we are raising; I would hope and expect that that could be done much more simply.

One of the pleas from all of us, across a whole variety of these amendments, is to get the balance right between legislating for what we want to legislate for and making the legislation simple enough to be understandable. At the moment, a criticism of this Bill is that it is extraordinarily difficult to understand in many parts. I will not go through all the points, but there are some germane areas where it would be extremely helpful to pursue with the Minister and the Bill team some of the points we are trying to make. Many of them are raised by a variety of outside bodies which know infinitely more about this than I do, and which have genuine concerns. We have the time between Committee and Report to put some of those to bed or at least to understand them better than we do at the moment. We will probably be happy and satisfied with some of the responses that we receive from the department once we feel that we understand them and, perhaps more importantly, once we feel that the department and the Bill team themselves fully understand them. It is fair to say that at the moment we are not completely comfortable that they do. I do not blame the Minister for that; if I were in his shoes, I would be on a very long holiday and not returning any time soon. However, we will request meetings—one meeting would be too much to cover it all—and try to break this into bite-sized units, digging into the detail in a manageable way, without taking too much time, to make sure that we understand each other.

With that, I beg leave to withdraw the amendment.

Amendment 23 withdrawn.

Amendment 24 not moved.

Amendment 25

Moved by

25: Clause 11, page 10, line 13, at end insert—

“(c) uphold children’s rights per the United Kingdom’s obligations as a signatory of the United Nations Convention on the Rights of the Child (UNCRC), with reference to General Comment No. 25 (2021) from the Committee on the Rights of the Child on children’s rights in relation to the digital environment.”

Member’s explanatory statement

This amendment would mean regulated services would have to have regard for the UN Convention on the Rights of the Child to ensure children are treated according to their evolving capacities, in their best interests, in consideration of their wellbeing and are not locked out of spaces that they have a right to participate in and to access.

My Lords, I am sorry that it is me again—a bit like a worn 78. In moving Amendment 25, I will speak also to Amendments 78, 187 and 196, all of which speak to the principle of children’s rights as set out in the UN Convention on the Rights of the Child and, more specifically, how those rights are applied to the digital world as covered in the United Nations’ general comment No. 25, which was produced in 2021 and ratified by the UK Government. What we are suggesting and asking for is that the principles in this general comment are reflected in the Bill. I thank the noble Baronesses, Lady Harding, Lady Kennedy and Lady Bennett, and the noble Lord, Lord Alton—who is not with us—for adding their names to these amendments and for their support.

The general comment No. 25 that I mentioned recognises that children’s rights are applicable in the digital world as well as the real world. These amendments try to establish in the Bill the rights of children. Believe it or not, in this rather lengthy Bill there is not a single reference—as far as we can discern—specifically to children’s rights. There are a lot of other words, but that specific phrase is not used, amazingly enough. These amendments are an attempt to get children’s rights specifically into the Bill. Amendments 30 and 105 in the names of the noble Lords, Lord Clement-Jones and Lord Knight, also seek to preserve the well-being of children. Our aims are very similar, but we will try to argue that the convention would achieve them in a particularly effective and concise way.

The online world is not optional for children, given what we know—not least from some of the detailed and harrowing experiences related by several of your Lordships in the course of the Bill. That fact may be worrying to some adults: we have all heard from parents, grandparents and others who have direct experience of their loved ones coming to harm. By contrast, it is also fascinating to note how many senior executives, and indeed founders, of digital companies forbid their own children from possessing and using mobile phones, typically until they are 12 or 14. That is telling us something. If they themselves do not allow their children to have access to some of the online world we are talking about so much, that should give us pause for reflection.

Despite the many harms online, there is undoubted good that all children can benefit from, in terms of their cognitive and skills development, their social development and their relationships. There are some brilliant things which come from being online. Beyond that, having age-appropriate experiences online is part of children's fundamental rights. That, essentially, is what these amendments are about.

Throughout the many years that the Bill has been in gestation, we have heard a lot about freedom of speech and how it must be preserved. Indeed, in contrast to children’s rights not being mentioned once in the Bill, “freedom of expression” appears no less than 49 times. I venture to suggest to your Lordships that there is a degree of imbalance there which should cause us to pause and reflect on whether we have that balance quite right.

I will not go into detail, but the UNCRC is the most widely ratified human rights treaty in history, and it is legally binding on the states which are party to it. The UK is a signatory to this convention, yet if we do not get this right in the Bill, we are in danger of falling behind some of our global counterparts. Although I recognise that saying the name of this organisation may bring some members of the governing party out in a rather painful rash, the EU is incorporating the UNCRC into its forthcoming AI Act. Sweden has already incorporated it into law at a different level, and Canada, New Zealand and South Africa are all doing the same. It is not anything to be worried about. Even Wales incorporated it into its domestic law in 2004, and Scotland did so in 2021. This appears to be something that the English have a particular problem with.

These amendments would ensure that, very importantly, those reading the Bill absolutely know that they must give due consideration to children’s rights. It would not be optional. Amendments 25 and 78 would require services to uphold children’s rights when implementing safety measures. Amendment 187 would reflect children’s rights in Ofcom’s duties, and Amendment 196 would ensure that Ofcom takes into consideration children’s rights when it is making its assessments of risks.

In particular, we have tabled these amendments because one of the possible unintended consequences of the well-meaning and serious attempts by all of us to protect children better is that some of these companies and platforms may decide that having children access some of their services is too much bother. They may decide that it would be simpler to find means to exclude them completely because it would be too much trouble, money or regulatory hassle to try to build a platform or service which they know children will access, as that will impose a serious obligation on them for which they can be held legally accountable. That would be an unintended consequence. We do not want children locked out of services which are essential to their development, education and self-expression. That said, I have probably said enough. I beg to move.

My Lords, I rise on this group of amendments, particularly with reference to Amendments 25, 78, 187 and 196, to inject a slight note of caution—I hope in a constructive manner—and to suggest that it would be the wrong step to try to incorporate them into this legislation. I say at the outset that I think the intention behind these amendments is perfectly correct; I do not query the intention of the noble Lord, Lord Russell, and others. Indeed, one thing that has struck me as we have discussed the Bill is the commonality of approach across the Chamber. There is a strong common desire to provide a level of protection for children’s rights, but I question whether these amendments are the right vehicle by which to do that.

It is undoubtedly the case that the spirit of the UNCRC is very strongly reflected within the Bill, and I think it moves in a complementary fashion to the Bill. Therefore, again, I do not query the UNCRC in particular. It can act as a very strong guide to government as to the route it needs to take, and I think it has had a level of influence on the Bill. I speak not simply as someone observing the Bill but as someone who, in a previous existence, served as an Education Minister in Northern Ireland and had direct responsibility for children’s rights. The guidance we received from the UNCRC was, at times, very useful to Ministers, so I do not question any of that.

For three reasons, I express a level of concern about these amendments. I mentioned that the purpose of the UNCRC is to act as a guide—a yardstick—for government as to what should be there in terms of domestic protections. That is its intention. The UNCRC itself was never written as a piece of legislation, and I do not think it was the original intention to have it directly incorporated and implemented as part of law. The UNCRC is aspirational in nature, which is very worth while. However, it is not written in a legislative form. At times, it can be a little vague, particularly if we are looking at the roles that companies will play. At times, it sets out very important principles, but ones which, if left for interpretation by the companies themselves, could create a level of tension.

To give an example, there is within the UNCRC a right to information and a right to privacy. That can sometimes create a tension for companies. If we are to take the purpose of the UNCRC, it is to provide that level of guidance to government, to ensure that it gets it right rather than trying to graft UNCRC directly on to domestic law.

Secondly, the effect of these amendments would be to shift the interpretation and implementation of what is required of companies from government to the companies themselves. They would be left to try to determine this, whereas I think that the UNCRC is principally a device that tries to make government accountable for children’s rights. As such, it is appropriate that government has the level of responsibility to draft the regulations, in conjunction with key experts within the field, and to try to ensure that what we have in these regulations is fit for purpose and bespoke to the kind of regulations that we want to see.

To give a very good example, there are different commissioners across the United Kingdom. One of the key groups that the Government should clearly be consulting with to make sure they get it right is the Children’s Commissioners of the different jurisdictions in the United Kingdom. Through that process, but with that level of ownership still lying with government and Ofcom, we can create regulations that provide the level of protection for our children that we all desire to see; whereas, if the onus is effectively shifted on to companies simply to comply with what is a slightly vague, aspirational purpose in these regulations, that is going to lead to difficulties as regards interpretation and application.

Thirdly, there is the reference to having due regard to what is in the UNCRC. From my experience within government, and from seeing the way in which government departments apply such duties—and I appreciate that “due regard” has case law behind it—different departments have tended to interpret it differently in different pieces of legislation. At one extreme, it has on occasion meant little more than lip service being paid by a department, with the duty in effect largely ignored; others have seen it as a very rigorous duty. If we see that level of disparity between government departments within the same Government, and if this is to be interpreted as a direct instruction to and requirement of companies of varying sizes—and perhaps with varying attitudes and senses of responsibility on this subject—that creates a level of difficulty in and of itself.

My final concern in relation to this has been mentioned in a number of debates on various groups of amendments. Where many Peers see either a weakness in the legislation or something else that needs to be improved, we need as much consistency and clarity as possible in both interpretation and implementation. The more we move away from direct regulations, which could then be put in place, towards relying on the companies themselves to interpret and implement, perhaps in different fashions and with many being challenged in the courts at times, the more we create a level of uncertainty and confusion, both for the companies themselves and for users, particularly the children we are looking to protect.

While I have a lot of sympathy for the intention of the noble Lord, Lord Russell, and while we need to find a way to incorporate into the Bill in some form how we can drive children’s rights more centrally within this, the formulation of the direct grafting of the UNCRC on to this legislation, even through due regard, is the wrong vehicle for doing it. It is inappropriate. As such, it is important that we take time to try to find a better vehicle for the sort of intention that the noble Lord, Lord Russell, and others are putting forward. Therefore, I urge the noble Lord not to press his amendments. If he does, I believe that the Committee should oppose the amendments as drafted. Let us see if, collectively, we can find a better and more appropriate way to achieve what we all desire: to try to provide the maximum protection in a very changing world for our children as regards online safety.

My Lords, I support these amendments. We are in the process of having a very important debate, both in the previous group and in this one. I came to this really important subject of online safety 13 years ago, because I was the chief executive of a telecoms company. Just to remind noble Lords, 13 years ago neither Snap, TikTok nor Instagram—the three biggest platforms that children use today—existed, and telecoms companies were viewed as the bad guys in this space. I arrived, new to the telecoms sector, facing huge pressure—along with all of us running telecoms companies—from Governments to block content.

I often felt that the debate 13 years ago too quickly turned into what was bad about the internet. I was spending the vast majority of my working day trying to encourage families to buy broadband and to access this thing that you could see was creating huge value in people’s lives, both personal and professional. Sitting on these Benches, I fundamentally want to see a society with the minimum amount of regulation, so I was concerned that regulating internet safety would constrain innovation; I wanted to believe that self-regulation would work. In fact, I spent many hours in workshops with the noble Baroness, Lady Kidron, and many others in this Chamber, as we tried to persuade and encourage the tech giants—as everyone started to see that it was not the telecoms companies that were the issue; it was the emerging platforms—to self-regulate. It is absolutely clear that that has failed. I say that with quite a heavy heart; it has genuinely failed, and that is why the Bill is so important: to enshrine in law some hard regulatory requirements to protect children.

That does not change the underlying concern that I and many others—and everyone in this Chamber—have, that the internet is also potentially a force for good. All technology is morally neutral: it is the human beings who make it good or bad. We want our children to genuinely have access to the digital world, so in a Bill that is enshrining hard gates for children, it is really important that it is also really clear about the rights that children have to access that technology. When you are put under enormous pressure, it is too easy—I say this as someone who faced it 13 years ago, and I was not even facing legislation—to try to do what you think your Government want to do, and then end up causing harm to the individuals you are actually trying to protect. We need this counterbalance in this Bill. It is a shame that my noble friend Lord Moylan is not in his place, because, for the first time in this Committee, I find myself agreeing with him. It is hugely important that we remember that this is also about freedom and giving children the freedom to access this amazing technology.

Some parts of the Bill are genuinely ground-breaking: we in this country are trying to work out how to put the legal scaffolding in place to regulate the internet. Documenting children’s rights, however, is not something where we need to start from scratch. That is why I put my name to this amendment: I think we should take a leaf from the UN Convention on the Rights of the Child. I recognise that the noble Lord, Lord Weir of Ballyholme, made some very thought-provoking comments about how we have to be careful about the ambiguity that we might be creating for companies, but I am afraid that ambiguity is there whether we like it or not. These are not just decisions for government: the tension between offering services that will brighten the lives of children and the risk of harming them lies exactly behind the decisions that technology companies take every day. As the Bill enshrines obligations on them to protect children from harms, I firmly believe it should also enshrine obligations on them to offer the beauty and the wonder of the internet and, in doing that, enshrine children’s right to this technology.

My Lords, I have attached my name to Amendment 25 in the name of the noble Lord, Lord Russell, and I rise to speak primarily to that. It is a great pleasure to follow the noble Baroness, Lady Harding, and agree with every word she has just said. I will draw on two elements of my personal history that she reminded me of. As a journalist in country Australia in the early 1990s—pre-internet days—I worked the night shift, and at least once a week we would get a frantic phone call from a parent calling on behalf of a child along the lines of, “Do you know anything about dolphins?” A school project had just been discovered that needed to be done by the next morning, and the source of information that the parent thought of was, “The local newspaper—they might be able to tell us something!” I am slightly ashamed to say that we had a newspaper to get out and we very quickly told them to go away, so we were not a good source of information in that case. Most people in your Lordships’ House will remember—but most young people will have no recollection of—a time when there was little access to information outside the hours when the library was open or you could go to a bookshop. There were literally no other sources available. We have to consider this amendment in the light of that.

I also want to slightly disagree with the comments of the noble Lord, Lord Bethell, on the previous group. He suggested that it was only with the arrival of phones that the internet became primarily or significantly a children’s thing. The best I can date it is that either in 1979 or 1980 I was playing “Lemonade Stand” on one of the early Apples. This might have been considered to be a harmful game from some political perspectives, given that it very much encouraged a capitalist mindset, profit taking and indeed the Americanisation of culture—but none the less that was back in 1980, if not 1979, and children were there. If we look back over the history of the internet, we see that some of the companies started out with young people, under the age of 18 in some cases, who have been at the forefront of innovation and development of what we now think of as our social media or internet world. This is the children’s world as much as it is the adults’ world, and that is the reality.

I will pick up the points made by the noble Lord, Lord Weir of Ballyholme, who suggested that the UN Convention on the Rights of the Child was only a guide to government and not law. It is a great pity that the noble Baroness, Lady Kennedy of The Shaws, is not in her place, because she is far better equipped to deal with this angle than I am. But I will give it a go. Children’s rights are humans’ rights. The UN Convention on the Rights of the Child is the most backed and most ratified rights convention—

I appreciate what the noble Baroness is saying, but I made a slightly different point. I am suggesting not that what is there was not meant to be law but that it was not written in a form which should be simply directly put in as legislation. It was not drafted in that format on that basis, which is why a direct graft on to a domestic piece of legislation is not quite the way to do it. It is about using that as guidance as to what should be in the law, rather than simply a direct incorporation.

I thank the noble Lord for his clarification, although, speaking not as a lawyer, my understanding is that a human right is a legal right; it is a law—a most fundamental right. In addition, every country in the world has ratified this except for the United States—which is another issue. I also point out that it is particularly important that we include reference to children’s rights in this Bill, given the fact that we as a country currently treat our children very badly. There is a huge range of issues, and we should have a demonstration in this and every Bill that the rights of children are respected across all aspects of British society.

I will not get diverted into a whole range of those, but I point noble Lords to a report to the United Nations from the Equality and Human Rights Commission in February this year that highlighted a number of ways in which children’s rights are not being lived up to in the UK. The most relevant part of this letter that the EHRC sent to the UN stresses that it is crucial to preserve children’s rights to accessible information and digital connectivity. That comes from our EHRC.

I think it was the noble Lord, Lord Russell, who referred to the fact that we live in a global environment; of course, social media and the internet are very much a global world. I urge everyone who has not done so to look at a big report done by UNICEF in 2019, Global Kids Online, which, crucially, involved a huge amount of surveys, consultation and consideration by young people. Later, we will get to an amendment of mine which says that we should have the direct voice of young people overseeing the implementation of the Bill. I am talking not about the NGOs that represent them but specifically about children: we need to listen to children and young people.

The UNICEF report said that it was quite easy to defend access to information and to reputable sources, but showed that accessing entertainment activities—some of the things that perhaps some grandparents in this Chamber might have trouble with—was associated with the positive development of digital skills. Furthermore, the report says:

“When parents restrict children’s internet use”—

of course, this could also apply to the Government restricting their internet use—

“this has a negative effect on children’s information-seeking and privacy skills”.

So, if you do not give children the chance to develop the skills to navigate the internet, and at age 18 they suddenly go to it and find a whole lot of material that they have not developed any skills to deal with, you are setting yourself up for a real problem. UNICEF therefore stresses the real need for children to have access.

Interestingly, this report—which was a global report from UNICEF—said that

“fewer than one third of children had been exposed to”

something they had found uncomfortable or upsetting in the preceding year. That is on the global scale. Perhaps that is an important balance to some of the other debates we have had in your Lordships’ House on the Bill.

Other figures from this report that I think are worth noting—this is from 2019, so these figures will undoubtedly have gone up—include the finding that

“one in three children globally is … an internet user and … one in three internet users is a child”.

We have been talking about this as though the internet is “the grown-ups’ thing”, but that is not the global reality. It was co-created, established and in some cases invented by people under the age of 18. I am afraid to say that your Lordships’ House is not particularly well equipped to deal with this, but we need to understand this as best we possibly can. I note that the report also said, looking at the sustainable development goals on quality of education, good jobs and reducing inequality, that internet access for children was crucial.

I will make one final point. I apologise; I am aware that I have been speaking for a while, but I am passionate about these issues. Children and young people have agency and the ability to act and engage in politics. In several nations on these islands, 16 and 17 year-olds have the vote. I very much hope that that will soon also be the case in England, and indeed I hope that soon children even younger than that will have the vote. I was talking about that with a great audience of year nines at the Queen’s School in Bushey on Friday with Learn with the Lords. Those children would have a great opportunity—

My Lords, we have a very full order of business to get through, so I encourage the noble Baroness to remain on topic.

I think that is on topic. If 16 and 17 year-olds are voting, they have a right to access internet information about voting. I suggest that that is on topic.

My final point—for the pleasure of the noble Lord—is that historically we have seen examples where blocks and filters have denied children and young people who identify as LGBTQI+ access to information that is crucial for them. That is an example of the risk if we do not allow them a right of access. On the most basic children’s right of all, we have also seen examples of blocks and filters that have stopped access to breastfeeding information on the internet. Access is a crucial issue, and what could be a more obvious way to allow it than by writing the United Nations Convention on the Rights of the Child into the Bill?

My Lords, I welcome many of these amendments. I found reading them slightly more refreshing than the more dystopian images we have had previously. It is quite exciting, actually, because the noble Baroness, Lady Harding, sounded quite upbeat, which is in contrast to previous contributions on what the online world is like.

I want to defend the noble Baroness, Lady Bennett of Manor Castle, from the intervention that suggested that she was going off topic, because the truth is that these amendments are calling for children’s rights to be introduced into legislation via this Bill. I disagree with that, but we should at least talk about it if it is in the amendments.

While I like the spirit of the amendments, it seems to me that children's rights, which I consider to have huge constitutional implications, require a proper Bill to bring them in, rather than being latched on to this one. My concern is that children's rights can be used to undermine adult authority and are regularly cited as a way of undermining parents' rights, and that children under 18 cannot exercise political rights. Whatever their agency or capacity, they are not legally able to exercise their political rights, and therefore someone has to act on their behalf as an intermediary—as a third party—which is why this can become such a difficult, politicised area.

I say that because it would be a fascinating discussion to have. I do not think this is the Bill to have it on, but the spirit of the amendments raises issues that we should bear in mind for the rest of our discussion. During lockdown, we as a society stopped young people having any social interaction at all. They were isolated, and a lot of new reports suggest that young people’s mental health has suffered because they were on their own. They went online and, in many instances, it kept them sane. That is probably true not just of young people but of the rest of us, by the way, but I am making the point that it was not all bad.

Over recent years, as we have been concerned about children’s safety and protecting them, we have discouraged them from roaming far from home. They do not go out on their bikes or run around all the time; they are told, “Come back home, you’ll be safe”. Of course, they have gone into their room and gone online, and now we say, “That’s not safe either”.

I want to acknowledge that the online world has helped young people overcome the problems of isolation and lack of community that the adult world has sometimes denied them developing. That is important: it can be a source of support and solidarity. Children need spaces to talk, engage and interact with friends, mates, colleagues and so on where they can push boundaries, and all sorts of things, without grown-ups interfering. That is what we have always understood from child development. It is why you do not have spies wandering around all the time following them.

The main thing is that we know the difference between a four year-old and a 14 year-old. In the Bill, we call a child anyone under 18, but I was glad that the amendments acknowledge that that distinction matters in terms of appropriateness. When young people are online, or are using encrypted messaging services such as WhatsApp, that does not mean they are all planning to join county lines or are being groomed—it is not all dodgy. Appropriateness in terms of a child's age, and not always imagining that the worst is happening, are an important counter that these amendments bring to some of the pessimism that we have heard until now.

The noble Lord, Lord Russell, said that children’s rights are not mentioned in the Bill but freedom of expression has been mentioned 49 times. First, it is not a Bill about children’s rights, but when he says that freedom of expression has been mentioned 49 times, I assure him that quantity is not quality and the mention of it means nothing.

I want to challenge the noble Baroness’s assertion that the Bill is not about children’s rights. Anyone who has a teenage child knows that their right to access the internet is keenly held and fought out in every household in the country.

The quip works, but political rights are not quips. Political rights have responsibilities, and so on. If we gave children rights, they would not be dependent on adults and adult society. Therefore, it is a debate; it is a row about what our rights are. Guess what. It is a philosophical row that has been going on all around the world. I am just suggesting that this is not the place—

I am sorry, but I must point out that 16 and 17 year-olds in Scotland and Wales have the vote. That is a political right.

And it has been highly contentious whether the right to vote gives them independence. For example, you would still be accused of child exploitation if you did anything to a person under 18 in Scotland or Wales. In fact, if you were to tap someone and it was seen as slapping in Scotland and they were 17, you would be in trouble. Anyway, it should not be in this Bill. That is my point.

My Lords, perhaps I may intervene briefly, because Scotland and Wales have already been mentioned. My perception of the Bill is that we are trying to build something fit for the future, and therefore we need some broad underlying principles. I remind the Committee that the Well-being of Future Generations (Wales) Act set a tone, and that tone has run through all aspects of society even more extensively than people imagined in protecting the next generation. As I read them, these amendments set a similar tone, and I find it difficult to understand why anyone would object to them, given that, as I understood it, a core principle behind the Bill is to build in future-proofing that will protect children, among others.

My Lords, I support the amendments in the name of the noble Lord, Lord Russell, to require regulated services to have regard to the UN Convention on the Rights of the Child. As we continue to attempt to strengthen the Bill by ensuring that the UK will be the safest place for children to be online, there is a danger that platforms may take the easy way out in complying with the new legislation and simply block children entirely from their sites. Services must not shut children out of digital spaces altogether, avoiding the child safety duties rather than designing their services with safety in mind. Children have rights and, as the UN convention makes clear, they must be treated according to their evolving capacities and in their best interests, with consideration of their well-being.

Being online is now an essential right, not an option, to access education, entertainment and friendship, but we must try to ensure that it is a safe space. As the 5Rights Foundation points out, the Bill risks infringing children’s rights online, including their rights to information and participation in the digital world, by mandating that services prevent children from encountering harmful content, rather than ensuring services are made age appropriate for children and safe by design, as we discussed earlier. As risk assessments for adults have been stripped from the Bill, this has had the unintended consequence of making a child user even more costly to serve than an adult user, as services will have substantial safety duties to comply with to protect children. 5Rights Foundation warns that this will lead services to determine that it is not worth designing services with children’s safety in mind and that it could be more cost-effective to lock them out entirely.

Ofcom must have a duty to have regard to the UNCRC in its risk assessments. Amendment 196 would ensure that children’s rights are reflected in Ofcom’s assessment of risks, so that Ofcom must have regard to children’s rights in balancing their rights to be safe against their rights to access age-appropriate digital spaces. This would ensure compliance with general comment No. 25, adopted in 2021, as the noble Lord, Lord Russell, mentioned, to protect children’s rights to freedom of expression and privacy. I urge the Minister to accept these amendments to ensure that the UK will be not only the safest place for children to be online but the best place too, by respecting and protecting their rights.

My Lords, I support all the amendments in this group, and will make two very brief points. Before I do, I believe that those who are arguing for safety by design and to put harms in the Bill are not trying to restrict the freedom of children to access the internet but to give the tech sector slightly less freedom to access children and exploit them.

My first point is a point of principle, and here I must declare an interest. It was my very great privilege to chair the international group that drafted general comment No. 25 on children’s rights in relation to the digital environment. We did so on behalf of the Committee on the Rights of the Child and, as my noble friend Lord Russell said, it was adopted formally in 2021. To that end, a great deal of work has gone into balancing the sorts of issues that have been raised in this debate. I think it would interest noble Lords to know that the process took three years, with 150 submissions, many by nation states. Over 700 children in 28 countries were consulted in workshops of at least three hours. They had a good shout and, unlike many of the other general comments, this one is littered with their actual comments. I recommend it to the Committee as a very concise and forceful account of what it means to exercise children’s rights in a balanced way across all the issues that we are discussing. I cannot remember who, but somebody said that the online world is not optional for children: it is where they grow up; it is where they spend their time; it is their education; it is their friendships; it is their entertainment; it is their information. Therefore, if it is not optional, then as a signatory to the UNCRC we have a duty to respect their rights in that environment.

My second point is rather more practical. During the passage of the age-appropriate design code, of which we have heard much, the argument was made that children were covered by the amendment itself, which said they must be kept in mind and so on. I anticipate that argument being made here—that we are aligning with children’s rights—apart from the fact that those rights are indivisible and must be respected in their entirety. In that case, the Government happily accepted that it should be explicit, and it was put in the Data Protection Act. It was one of the most important things that happened in relation to the age-appropriate design code. We might hope that, when this Bill is an Act, it will all be over—our job will be done and we can move on. However, after the Data Protection Act, the most enormous influx of lobbying happened, saying, “Please take the age down from 18 to 13”. The Government, and in that case the ICO, shrugged their shoulders and said, “We can’t; it’s on the face of the Bill”, because Article 1 of the UNCRC says that a child is anyone under the age of 18.

The evolving capacities of children are central to the UNCRC, so the concerns of the noble Baroness, Lady Fox, which I very much share, that a four year-old and a 14 year-old are not the same, are embodied in that document and in the general comment, and therefore it is useful.

These amendments are asking for that same commitment here—to children and to their rights, and to their rights to protection, which is at the heart of so much of what we are debating, and their well-being. We need their participation; we need a digital world with children in it. Although I agreed very much with the noble Baroness, Lady Bennett, and her fierce defence of children’s rights, there are 1 billion children online. If two-thirds of them have not seen anything upsetting in the last year, that rather means that one-third of 1 billion children have—and that is too many.

My Lords, I did not intend to speak in this debate but I have been inspired by it.

I was here for the encryption debate last week, which I did not speak in. One of the contributions was around unintended consequences of the legislation, and I am concerned about unintended consequences here.

I absolutely agree with the comments of the noble Baroness, Lady Bennett, around the need for children to engage on the internet. Due to a confidence and supply agreement with the then Government back in 2017, I ensured that children and adults alike in Northern Ireland have the best access to the internet in the United Kingdom, and I am very proud of that. Digital literacy is covered in a later amendment, Amendment 91, which I will be strongly supporting. It is something that everybody needs to be involved in, not least our young people—and here I declare an interest as the mother of a 16 year-old.

I have two concerns. The first was raised by my noble friend Lord Weir, around private companies being legally accountable for upholding an international human rights treaty. I am much more comfortable with Amendments 187 and 196, which refer to Ofcom. I think that is where the duty should be. I have an issue not with the convention but with private companies being held responsible for it; Ofcom should be the body responsible.

Secondly, I listened very carefully to what the noble Baroness, Lady Kidron, said about general comment No. 25. If what I say is incorrect, I hope she will say so. Is general comment No. 25 a binding document on the Government? I understood that it was not.

We need to see the UNCRC included in the Bill. The convention itself is never reopened; it is kept relevant to the modern world through the general comments, which are how the Committee on the Rights of the Child interprets it.

So it is an interpretive document. My concern about unintended consequences arises because general comment No. 25 makes specific reference to children being able to seek out content. That is certainly something that I would be concerned about. I am sure that we will discuss it further in the next group of amendments, which are on pornography. If young people were able to seek out harmful content, that would concern me greatly.

I support Amendments 187 and 196, but I have some concerns about the unintended consequences of Amendment 25.

My Lords, I think this may have been a brief interlude of positivity. I am not entirely convinced, in view of some of the points that have been made, but certainly I think that it was intended to be.

I will speak first to Amendments 30 and 105. I do not know what the proprieties are, but I needed very little prompting from the LEGO Group to put forward amendments that, in the online world, seek to raise the expectation that regulated services must go beyond purely the avoidance of risk of harm and consider the positive benefits that technology has for children’s development and their rights and overall well-being. It has been extremely interesting to hear that aspect of today’s debate.

The LEGO Group recognises that, through children’s play experiences both offline and online, it has an impact on the lives of the millions of children it engages with around the world, and it recognises its responsibility to ensure that, wherever it engages with them, the impact is positive and that it protects and upholds the rights of children and fosters their well-being as part of its mission.

We have heard about UN general comment 25 on children’s rights in the digital environment. The Government’s response to the drafting process recognised the collective responsibility of all Governments and stakeholders to ensure

“that children can benefit from digital opportunities, and protecting them from online harms”.

In line with this, the Bill now offers the opportunity to require regulated services to not only mitigate and manage risk in their service design but to consider the benefits of the service to children’s rights and well-being. I am extending it rather further than some of the earlier discussions.

I agree that it is important to include reference to both rights and well-being in the Bill. An individual child may have low well-being even if all their rights are respected. For example, if a child does not feel socially connected or empowered in a positive online environment, they may experience low well-being even if their right to participate online is being respected. As drafted, the Bill instructs regulated services to have regard

“to the importance of protecting the rights of users and interested persons”

and give due consideration to benefits such as freedom of expression

“when deciding on, and implementing, safety measures and policies”

to comply with the regime.

I believe that, if the Bill is to fully deliver for children, it needs to ensure that there is consideration of the benefits of the service to children’s rights and well-being. Without this inclusion, there is a risk that the design of online services will disproportionately restrict children’s rights to participate in the online environment and the benefit it brings to their well-being. By instructing service providers to design for the benefits that technology can bring to children’s rights and well-being alongside the mitigation of risk, which we have heard so much about, we have a real opportunity in the Bill to create a blueprint for the online environment that can both protect and nurture children’s potential by supporting and empowering them, unleashing their creativity and helping them learn. We have heard many positive comments around the House on that. I hope the Minister will understand the clear intention here and take on board the positive intent of these amendments.

Briefly, many noble Lords have emphasised the importance of the UN Convention on the Rights of the Child. I am not going to add greatly to that debate, but children have a right to be safe and to privacy. They also have rights to information, to participation and to free speech, both online and offline. It was very interesting to hear, in particular from the noble Baroness, Lady Healy, and the noble Lord, Lord Russell, their view that services may shut children out of digital spaces altogether to avoid compliance with the child safety duties, rather than designing services with their safety in mind. That is because the Bill focuses on content moderation rather than system design: we are back, in a sense, in that loop.

I believe that the reference to the UNCRC general comment 25 would be very useful. I understand the points made by the noble Lord, Lord Weir, and certainly the spirit in which he made them, but I cannot see why “having regard to” the UNCRC could not be in the Bill. I do not see that that is unduly prescriptive or difficult to interpret in those circumstances, or overly vague. So, on these Benches, we support those amendments.

My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.

In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced, in respect of the importance of these things. We do want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, describing some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and of saying that there is something positive that we want to see here.

In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. That is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments around whether or not it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices for regulation. Regulation can just be about minimum standards, but it can also be about promoting something better. What we are seeking here in trying to have reference to the UN convention is for Ofcom to regulate for something more positive and better, as well as police minimum standards. On that basis, we support the amendments.

My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.

Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.

As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.

Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, such as the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and by his noble friend Lady Foster.

The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.

The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.

Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.

More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could leave children more exposed to harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.

My Lords, I thank all noble Lords for taking part in this discussion. I thank the noble Lord, Lord Weir, although I would say to him that his third point—that, in his experience, the UNCRC is open to different interpretations by different departments—is my experience of normal government. Name me something that has not been interpreted differently by different departments, as it suits them.

I entirely take that point. I was making the slightly wider point—not specifically with regard to the UNCRC—that, whenever legislative provision has been made that a particular department has to have due regard to something, while there is case law, “due regard” has tended to be treated very differently by different departments. So, if even departments within the same Government treat that differently, how much more differently would private companies treat it?

I would simply make the point that it would probably be more accurate to say that the departments treat it with “due disregard”.

This has been a wide-ranging debate and I am not going to go through all the different bits and pieces. I recommend that noble Lords read United Nations general comment No. 25, as it goes, in great detail, right to the heart of the issues we are talking about. For example—this is very pertinent to the next group of amendments—it explicitly protects children from pornography, so I absolutely recommend that it be mentioned in the next group of amendments.

As I expected, the Minister said, “We are very sympathetic but this is not really necessary”. He said that children’s rights are effectively baked into the Bill already. But what is baked into something that children—for whom this is particularly relevant—or even adults might decide to consume is not always immediately obvious. There are problems with an approach whereby one says, “It’s fine because, if you really understood this rather complicated legislation, it would become completely clear to you what it means”. That is a very accurate and compelling demonstration of exactly why some of us have concerns about this well-intentioned Bill. We fear that it will become a sort of feast, enabling company lawyers and regulators to engage in occasionally rather arcane discourse at great expense, demonstrating that what the Government claim is clearly baked in is not so clearly baked in.

A common theme in many of these amendments on children’s rights is that these rights should not merely be implicitly covered in the Bill, as they are in myriad cases, but that it should be stated explicitly in key places in the Bill that it is about helping children and protecting their rights. It should be about protecting their right to be online, but also their right not to be abused or suffer harm online. That is at the heart of what we are trying to do. I suspect there is rich room for further discussion to see if we can make some of this slightly less “baked in” and find some form of legislative icing, with hundreds and thousands, which makes it completely clear which children’s rights are being protected and how they will be protected. With that, I beg leave to withdraw the amendment.

Amendment 25 withdrawn.

Amendments 26 and 27 not moved.

Amendment 27A

Moved by

27A: Clause 11, page 11, line 19, at end insert—
“(10A) A duty to summarise in the terms of service the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”

Member’s explanatory statement

This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest children’s risk assessment. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.

Amendment 27A agreed.

Amendment 28 not moved.

Amendment 29

Moved by

29: Clause 11, page 11, line 25, at end insert—
“, except for pornographic content where age verification must always be applied, notwithstanding section 3(3)(a) of the Communications Act 2003.”

Member’s explanatory statement

This amendment would require a user-to-user service to apply age verification for pornographic content regardless of its size or capacity.

My Lords, I am very happy to move Amendment 29 and to speak to Amendments 83 and 103, which are also in my name. We have just had a debate about the protection of children online, and this clearly follows on from that.

The intention of the Bill is to set general parameters through which different content types can be regulated. The problem with that approach, as the sheer number of amendments highlights, is this: not all content and users are the same, and therefore cannot be treated in the same way. Put simply, not all content online should be legislated for in the same way. That is why the amendments in this group are needed.

Pornography is a type of content that cannot be regulated in general terms; it needs specific provisions. I realise that some of these issues were raised in the debate last Tuesday on amendments in my name, and again on Thursday when we discussed harms to children. I recognise too that, during his response to Thursday’s debate, the Minister made a welcome announcement on primary priority content which I hope will be set out in the Bill, as we have been asking for during this debate. While we wait to see the detail of what that announcement means, I think it safe to assume that pornography will be one of the harms named on the Bill, which makes discussion of these amendments that bit more straightforward.

Given that context, in Clause 11(3), user-to-user services that fall under the scope of Part 3 of the Bill have a duty to prevent children from accessing primary priority content. This duty is repeated in Clause 25(3) for search services. That duty is, however, qualified by the words,

“using proportionate systems and processes”.

It is the word “proportionate” and how that would apply to the regulation of pornography that is at the heart of the issue.

Generally speaking, acting in a proportionate way is a sensible approach to legislation and regulation. For the most part, regulation and safeguards should ensure that a duty is not onerous and does not place a disproportionate cost on the service provider that might make their business unviable. While that is the general principle, proportionality is not an appropriate consideration for all policy decisions.

In the offline world, legislation and regulation are not always proportionate. This is even more stark when regulating for children. The noble Lord, Lord Bethell, raised the issue of the corner shop last Tuesday, and that example is apt to highlight my point today. We do not take a proportionate approach to the sale of alcohol or cigarettes. We do not treat a corner shop differently from a supermarket. It would be absurd if I were to suggest that a small shop should apply different age checks for children when selling alcohol, compared to the age checks we expect a large supermarket to apply. In the same way, we already decline to apply proportionality to some online activities. For example, gambling is an activity that is age-verified to exclude children. Indeed, gambling companies are not allowed to make their product attractive to children and must advertise in a regulated way to avoid harm to children and young people. The harm caused to children by gambling is significant, so the usual policy considerations of proportionality do not apply. Clearly, both online and offline, there are some goods and services to which a proportionality test is not applied; there is no subjectivity. A child cannot buy alcohol or gamble and should not be able to access pornography.

In the UK, there is a proliferation of online gambling sites. It would be absurd to argue that the size of a gambling company or the revenue that company makes should be a consideration in whether it should utilise age verification to prevent children placing a bet. In the same way, it would be absurd to argue that the size or revenue of a pornographic website could be used as an argument to override a duty to ensure that age verification is employed to ensure that children do not access that website.

This is not a grey area. It is beyond doubt that exposing children to pornography is damaging to their health and development. The Children’s Commissioner’s report from this year has been much quoted already in Committee but it is worth reminding your Lordships what she found: that pornography was “widespread and normalised”, to the extent that children cannot opt out. The average age at which children first see pornography is 13. By age nine, 10% had seen it, 27% had seen it by age 11 and half had seen it by age 13. The report found that frequent users of pornography are more likely to engage—unfortunately and sadly—in physically aggressive sex acts.

There is nothing proportionate about the damage done by pornographic content. The size, number of visitors, financial budget or technical know-how of a service must not be considerations in whether or not to deploy age checks. If a platform is incapable, for any reason, of protecting children from harmful exposure to pornography, it must remove that content. The Bill should be clear: if there is pornography on a website, it must use age verification. We know that pornographic websites will do all they can to evade age verification. In France and Germany, which are ahead of us in passing legislation to protect minors from pornography, regulators are tangled up in court action as the pornographic sites they first targeted for enforcement action argue against the law.

We must also anticipate the response of websites that are not dedicated exclusively to pornography, especially social media—a point we touched on during Tuesday’s debate. Reuters reported last year that an internal Twitter presentation stated that 13% of tweets were pornographic. Indeed, the Children’s Commissioner has found that Twitter is the platform where young people are most likely to encounter pornographic content. I know that some of your Lordships are concerned about age-gating social media. No one is suggesting that social media should exclude children, a point that has been made already. What I am suggesting is that pornography on such platforms should be subject to age verification. The capabilities already exist to do this. New accounts on Twitter have to opt in to view pornographic content. Why can that opt-in function not be age-gated? Twitter is moving to subscription content. Why can it not make pornographic content subscription based, with the subscription being age-verified? The solutions exist.

The Minister may seek to reassure the House that the Bill as drafted would not allow any website or search facility regulated under Part 3 that hosts pornographic content to evade its duties because of size, capacity or cost. But, as we have seen in France, these terms will be subject to court action. I therefore trust that the Government will bring forward an amendment to ensure that any platform that hosts pornographic content will employ age verification, regardless of any other factors. Perhaps the Minister in his wind-up can provide us with some detail or a hint of a future amendment at Report. I look forward to hearing and considering the Minister’s response. I beg to move.

My Lords, I wish to speak in support of Amendments 29, 83 and 103 in the name of the noble Baroness, Lady Ritchie. I am extremely pleased that the Minister said last Tuesday that pornography will be within primary priority content; he then committed on Thursday to naming primary priority content in the Bill. This is good news. We also know that pornography will come within the child safety duties in Clause 11. This makes me very happy.

In the document produced for the Government in January 2021, the BBFC said that there were millions of pornographic websites—I repeat, millions—and many of these will come within Part 3 of the Bill because they allow users to upload videos, make comments on content and chat with other users. Of course, some of these millions of websites will be very large, which means by definition that we expect them to come within the scope of the Bill. Under Clause 11(3) user-to-user services have a duty to prevent children accessing primary priority content. The duty is qualified by the phrase

“using proportionate systems and processes”.

The factors for deciding what is proportionate are set out in Clause 11(11): the potential harm of the content, based on the children’s risk assessment, and the size and capacity of the provider of the service. Amendments 29, 83 and 103 tackle the issue of size and capacity.

With millions of sites on the internet, it is not unreasonable to think that some sites will argue that, despite the potential harm to children, they are not of a size to have the capacity to invest in technology. The amendment would require all user-to-user sites with pornographic content to use age verification to determine that the person accessing the content was aged 18 years or older, regardless of size and capacity. This issue was touched upon on Tuesday in the amendments tabled by the noble Baroness, Lady Ritchie, which said there should be a level playing field for websites that contain pornographic content regardless of which part of the Bill they fall within. Websites that come within the scope of Part 5 do not have any exceptions and must have age verification to meet the duty in Clause 72, and that should also apply to Part 3 services.

The Government have said there is a significant risk of harm posed by children’s access to pornography online since exposure to pornography may impact children’s perception of sex and relationships, increase the likelihood of engaging in sexual activities and harmful or aggressive behaviour, and reduce concern about consent from partners. For those reasons alone, all sites with pornographic content should have age verification.

I know that we will have further debates on age verification in due course, but I hope the Government’s announcement that pornographic content will be in the Bill means that age verification for pornography on Part 3 and Part 5 services will come into force at the same time. I urge the Government to support these amendments.

House resumed.