Committee (2nd Day)
Relevant document: 28th Report from the Delegated Powers Committee
Clause 3: “Regulated service”, “Part 3 service” etc
Amendment 2
Moved by
Baroness Kidron
2: Clause 3, page 3, line 14, at end insert—
“(d) an internet service, other than a regulated user-to-user service or search service, that meets the child user condition and enables or promotes harmful activity and content as set out in Schedule (Online harms to children).”
Member’s explanatory statement
This amendment would mean any service that meets the “child user condition” and enables or promotes harmful activity and content to children, as per a new Schedule, would be in scope of the regulation of the Bill.
My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.
The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to go. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.
There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.
Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,
“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.
It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward thinking and ensure that services that are likely to be accessed by children and that promote harm are in scope by default.
Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.
Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.
The Bill requires
“a significant number of children”
to use the service, or for the service to be
“likely to attract a significant number of users who are children”.
“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.
Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is
“likely to be accessed by children”
if
“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”
that are likely to be accessed.
Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.
Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.
When we looked at the age ratings of apps across the Google Play Store and Apple’s App Store, four things emerged. First, apps are routinely given age ratings much lower than the minimum age in their own terms and conditions: for example, Amazon Shopping’s terms say 18, but it has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.
Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated 4 on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.
Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.
Finally, in the case of Apple, using a device registered to a 15 year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.
My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.
My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.
I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare
“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”
services reaching children. Amendment 22 would mandate app stores
“to use proportionate and proactive measures, such as age assurance, to prevent children”
coming into contact with
“primary priority content that is harmful to children”.
Amendments 298 and 299 would simply define “app” and “app stores”.
Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, there may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.
Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.
Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.
A second research study, conducted by Internet Matters and TikTok, unambiguously shows that teenagers themselves would prefer having app store age assurance. Neither of those research projects suggests that app store age assurance should replace age assurance in the apps themselves. They view it as additive: an addition that would make it simpler for them and ensure that fewer children reach the point of downloading apps that they should not.
The third reason why this is necessary is that, as the noble Baroness, Lady Kidron, said, Google and Apple are already doing some of this. They are doing it differently and should be commended, to some extent, for the progress that they have made over the past five years. Google Family Link and the family functionality on the Apple store are better than they were five years ago. However, we should be troubled that this is currently not regulated. They are age-rating apps differently. Can you imagine, in the physical world, Sainsbury’s deciding that alcohol was suitable for 17 year-olds and above, Tesco deciding that it was suitable for 18 year-olds and above, and government not being able to intervene? That is the world which we are in with access to pornography today.
I am the mother of a 17 year-old girl. I went into her iPhone last night and searched the Apple App Store. Pornography apps come up as age appropriate for 17+. This is the consequence of an unregulated app store world. Today, as I said, the vast majority of the app market is with Google and Apple. On the day that the Government launch their digital competition Bill, we should hope that over time there will be further app stores. Who is to say that those app stores will do anything to protect children as they try to compete with Google and Apple?
The final reason why we should do this is that a number of app developers, particularly small ones, have expressed to me a concern that app stores might abuse their power of age-gating the internet to block apps that compete with their own. That is exactly why we should regulate this space, rather than leaving it for Google and Apple to decide what an age gate should or should not look like. Self-regulation has failed to protect children online over the past 15 years. Many of us in the Chamber today have been working in this space for at least that long. There is no reason to believe that self-regulation would be any more successful for app stores than it has been for the rest of the internet.
I have tabled these amendments and ask my noble friend the Minister to recognise that I have done so in the spirit of starting the conversation on how we regulate app stores. It is unambiguously clear that we should regulate them. The last thing that I would want to do is have my amendment slow down the progress of this Bill. The last thing that I would want is to slow down Ofcom’s implementation of the Bill. However, we keep being told that this is a framework Bill to focus on systems and processes, and it is an essential part of that framework that app stores are included.
Very briefly, I will speak in support of the amendments tabled by the noble Baroness, Lady Kidron, by telling you a story. One of my first jobs in the retail world was as the commercial director for Woolworths—we are all old enough in this Chamber to remember Woolworths—which was the leading retailer of toys. One of my first category directors for the toy category had come from outside the toy industry. I will never forget the morning when he came to tell me that an own-label Woolworths toy had caused a near-fatal accident with a child. He was new to the industry and had not worked in toys before. He said, “It’s only one child; don’t worry, it’ll be okay”. I remember saying, “That is not how health and safety with children works. This is one incident; we need to delist the product immediately; we need to treat this incredibly seriously. Imagine if that was your child”. I do not begrudge his reaction; he had never worked in that sector before.
However, the reality is that if we do not look at the impact of the digital world on every child, then we are adopting a different standard in the digital world than we do in the physical world. That is why the “likely to be accessed by children” definition that has been tried and tested, not just in this House but in legislatures around the world, should be what is used in this Bill.
My Lords, it is a pleasure to follow the two noble Baronesses. I remind the Committee of my background as a board member of the Centre for Data Ethics and Innovation. I also declare an indirect interest, as my oldest son is the founder and studio head of Mediatonic, which is now part of Epic Games and is the maker of “Fall Guys”, which I am sure is familiar to your Lordships.
I speak today in support of Amendments 2 and 92 and the consequent amendments in this group. I also support the various app store amendments proposed by the noble Baroness, Lady Harding, but I will not address them directly in these remarks.
I was remarkably encouraged on Wednesday by the Minister’s reply to the debate on the purposes of the Bill, especially by the priority that he and the Government gave to the safety of children as its primary purpose. The Minister underlined this point in three different ways:
“The main purposes of the Bill are: to give the highest levels of protection to children … The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children … Children’s safety is prioritised throughout this Bill”.—[Official Report, 19/4/23; col. 724.]
The purpose of Amendments 2 and 92 and consequent amendments is to extend and deepen the provisions in the Bill to protect children against a range of harms. This is necessary for both the present and the future. It is necessary in the present because of the harms to which children are exposed through a broad range of services, many of which are not currently in the Bill’s scope. Amendment 2 expands the scope to include any internet service that meets the child user condition and enables or promotes harmful activity and content as set out in the schedule provided. Why would the Government not take this step, given the aims and purposes of the Bill to give the highest protection to children?
Every day, the diocese of Oxford educates some 60,000 children in our primary and secondary schools. Almost all of them have or will have access to a smartphone, either late in primary, hopefully, or early in secondary school. The smartphone is a wonderful tool to access educational content, entertainment and friendship networks, but it is also a potential gateway for companies, children and individuals to access children’s inner lives, in secret, in the dead of night and without robust regulation. It therefore exposes them to harm. Sometimes that harm is deliberate and sometimes unintentional. This power for harm will only increase in the coming years without these provisions.
The Committee needs to be alert to generational changes in technology. When I was 16 in secondary school in Halifax, I did a computer course in the sixth form. We had to take a long bus ride to the computer building at Huddersfield University. The computer filled several rooms in the basement. The class learned how to program using punch cards. The answers to our questions came back days later, on long screeds of printed paper.
When my own children were teenagers and my oldest was 16, we had one family computer in the main living room of the house. The family was able to monitor usage. Access to the internet was possible, but only through a dial-up modem. The oldest of my grandchildren is now seven, and many of his friends already have smartphones. In a few years, he will certainly carry a connected device in his pocket and, potentially, have access to the entire internet 24/7.
I want him and millions of other children to have the same protection online as he enjoys offline. That means recognising that harms come in a variety of shapes and sizes. Some are easy to spot, such as pornography. We know the terrible damage that porn inflicts on young lives. Some are more insidious and gradual: addictive behaviours, the promotion of gambling, the erosion of confidence, grooming, self-harm and suicidal thoughts, encouraging eating disorders, fostering addiction through algorithms and eroding the barriers of the person.
The NSPCC describes many harms to children on social networks that we are all now familiar with, but it also highlights online chat, comments on livestream sites, voice chat in games and private messaging among the vectors for harm. According to Ofcom, nine in 10 children in the UK play video games, and they do so on devices ranging from computers to mobile phones to consoles. Internet Matters says that most children’s first interaction with someone they do not know online is now more likely to be in a video game such as “Roblox” than anywhere else. It also found that parents underestimate the frequency with which their children are contacted by strangers online.
The Gambling Commission has estimated that 25,000 children in the UK aged between 11 and 16 are problem gamblers, with many of them introduced to betting via computer games and social media. Families have been left with bills, sometimes of more than £3,000, after uncontrolled spending on loot boxes.
Online companies, we know, design their products with psychological principles of engagement firmly in view, and then refine their products by scraping data from users. According to the Information Commissioner, more than 1 million underage children could have been exposed to age-inappropriate content on TikTok alone, with the platform collecting and using their personal data.
As the noble Baroness, Lady Kidron, has said, we already have robust and tested definitions of scope in the ICO’s age-appropriate design code—definitions increasingly taken up in other jurisdictions. To give the highest protection to children, we need to build on these secure definitions in this Bill and find the courage to extend robust protection across the internet now.
We also need to future-proof this Bill. These key amendments would ensure that any development, any new kind of service not yet imagined which meets the child user condition and enables or promotes harmful activity and content, would be in scope. This would give Ofcom the power to develop new guidance and accountabilities for the applications that are certain to come in the coming years.
We have an opportunity and a responsibility, as the Minister has said, to build the highest protection into this Bill. I support the key amendments standing in my name.
My Lords, first, I beg the indulgence of the Committee to speak briefly at this juncture. I know that no one from the Lib Dem or Labour Benches has spoken yet, but I need to dash over to the Moses Room to speak to some amendments I am moving on the Bill being considered there. Secondly, I also ask the Committee that, if I do not get back in time for the wind-ups, I be forgiven on this occasion.
I simply wanted to say something briefly in support of Amendments 19, 22, 298 and 299, to which I have added my name. My noble friend Lady Harding has already spoken to them comprehensively, so there is little I want to add; I just want to emphasise a couple of points. But first, if I may, I will pick up on something the right reverend Prelate said. I think I am right in saying that the most recent Ofcom research shows that 57% of 7 year-olds such as his grandchild have their own phone, and by the time children reach the age of 12 they pretty much all have their own phone. One can only imagine that the age at which children possess their own device is going to get lower.
Turning to app stores, with which these amendments are concerned, currently it is the responsibility of parents and developers to make sure that children are prevented from accessing inappropriate content. My noble friend’s amendments do not dilute in any way the responsibility that should be held by those two very important constituent groups. All we are seeking to do is ensure that app stores, which are currently completely unregulated, take their share of responsibility for making sure that those seeking to download and then use such apps are in the age group the apps are designed for.
As has already been very powerfully explained by my noble friend and by the noble Baroness, Lady Kidron, different age ratings are being given by the two different app stores right now. It is important for us to understand, in the context of the digital markets and competition Bill, which is being introduced to Parliament today—I cannot tell noble Lords how long we have waited for that legislation and how important it is, not least because it will open up competition, particularly in app stores—that the more competition there is across app stores and the doorways through which children can go to purchase or download apps, the more important it is that there is consistency and some regulation. That is why I support my noble friend and was very happy to add my name to her amendments.
My Lords, it falls to me to inject some grit into what has so far been a very harmonious debate, as I will raise some concerns about Amendments 2 and 22.
I again declare my interest: I spent 10 years working for Facebook, doing the kind of work that we will regulate in this Bill. At this point noble Lords are probably thinking, “So it’s his fault”. I want to stress that, if I raise concerns about the way the regulation is going, it is not that I hold those views because I used to work for the industry; rather, I felt comfortable working in the industry because I always had those views, back to 2003 when we set up Ofcom. I checked the record, and I said things then that are remarkably consistent with how I feel today about how we need to strike the balance between the power of the state and the power of the citizen to use the internet.
I should also declare an interest in respect of Amendment 2, in that I run a blog called regulate.tech. I am not sure how many children are queueing up to read my thoughts about regulation of the tech industry, but they would be welcome to do so. The blog’s strapline is:
“How to regulate the internet without breaking it”.
It is very much in that spirit that I raise concerns about these two amendments.
I certainly understand the challenges for content that is outside of the user-to-user or search spaces. I understand entirely why the noble Baroness, Lady Kidron, feels that something needs to be done about that content. However, I am not sure that this Bill is the right vehicle to address that kind of content. There are principled and practical reasons why it might be a mistake to extend the remit here.
The principle is that the Bill’s fundamental purpose is to restrict access to speech by people in the United Kingdom. That is what legislation such as this does: it restricts speech. We have a framework in the Human Rights Act, which tells us that when we restrict speech we have to pass a rigorous test to show that those restrictions are necessary and proportionate to the objective we are trying to achieve. Clearly, when dealing with children, we weight very heavily in that test whether something is necessary and proportionate in favour of the interest of the welfare of the children, but we cannot do away with the test altogether.
It is clear that the Government have applied that test over the years that they have been preparing this Bill and determined that there is a rationale for intervention in the context of user-to-user services and search services. At the same time, we see in the Bill that the Government’s decision is that intervention is not justified in all sorts of other contexts. Email and SMS are excluded. First-party publisher content is excluded, so none of the media houses will be included. We have a Bill that is very tightly and specifically framed around dealing with intermediaries, whether that is user-to-user intermediaries who intermediate in user-generated content, or search as an intermediary, which scoops up content from across the internet and presents it to you.
This Bill is about regulating the regulators; it is not about regulating first-party speakers. A whole world of issues will come into play if we move into that space. It does not mean that it is not important, just that it is different. There is a common saying that people are now bandying around, which is that freedom of speech is not freedom of reach. To apply a twist to that, restrictions on reach are not the same as restrictions on speech. When we talk about restricting intermediaries, we are talking about restricting reach. If I have something I want to say and Facebook or Twitter will not let me say it, that is a problem and I will get upset, but it is not the same as being told that I cannot say it anywhere on the internet.
My concern about Amendment 2 is that it could lead us into a space where we are restricting speech across the internet. If we are going to do that—there may be a rationale for doing it—we will need to go back and look at our necessity and proportionality test. It may play out differently in that context from user-to-user or intermediary-based services.
From a practical point of view, we have a Bill that, we are told, will give Ofcom the responsibility of regulating 25,000 more or less different entities. They will all be asked to pay money to Ofcom and will all be given a bunch of guidance and duties that they have to fulfil. Again, those duties, as set out at painful length in the Bill, are very specifically about the kind of things that an intermediary should do for its users. If we were to be regulating blogs or people’s first-party speech, or publishers, or the Daily Telegraph, or whoever else, I think we would come up with a very different set of duties from the duties laid out in the Bill. I worry that, however well-motivated, Amendment 2 leads us into a space for which this Bill is not prepared.
I have a lot of sympathy with the views of the noble Baroness, Lady Harding, around the app stores. They are absolutely more like intermediaries, or search, but again the tools in the Bill are not necessarily dedicated to how one would deal with app stores. I was interested in the comments of the noble Baroness, Lady Stowell, on what will be happening to our competition authorities; a lot will be happening in that space. On app stores, I worry about what is in Amendment 22: we do not want app stores to think that it is their job to police the content of third-party services. That is Ofcom’s job. We do not want the app stores to get in the middle, not least because of these commercial considerations. We do not want Apple, for instance, thinking that, to comply with UK legislation, it might determine that WhatsApp is unsafe while iMessage is safe. We do not want Google, which operates Play Store, to think that it would have a legal rationale for determining that TikTok is unsafe while YouTube is safe. Again, I know that this is not the noble Baroness’s intention or aim, but clearly there is a risk that we open that up.
There is something to be done about app stores but I do not think that we can roll over the powers in the Bill. When we talk about intermediaries such as user-to-user services and search, we absolutely want them to block bad content. The whole thrust of the Bill is about forcing them to restrict bad content. When it comes to app stores, the noble Baroness set out some of her concerns, but I think we want something quite different. I hesitate to say this, as I know that my noble friend is supportive of it, but I think that it is important as we debate these issues that we hear some of those concerns.
Could it not be argued that the noble Lord is making a case for regulation of app stores? Let us take the example of Apple’s dispute with “Fortnite”, where Apple is deciding how it wants to police things. Perhaps if this became a more regulated space Ofcom could help make sure that there was freedom of access to some of those different products, regardless of the commercial interests of the people who own the app stores.
The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.
There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.
Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?
The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.
I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.
I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.
I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if a service is likely to be accessed by children under this test, and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.
I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.
My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.
That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.
I strongly support the amendments in the name of the noble Baroness, Lady Kidron, because I want to see this Bill implemented but strengthened in order to fulfil the admirable intention that children must be safe wherever they are online. This will not be the case unless child safety duties are applicable in all digital environments likely to be accessed by children. This is not overly ambitious or unrealistic; the platforms need clarity as to these new responsibilities and Ofcom must be properly empowered to enforce the rules without worrying about endless legal challenges. These amendments will give that much-needed clarity in this complex area.
As the Joint Committee recommended, this regulatory alignment would simplify compliance for businesses while giving greater clarity to people who use the service and greater protection for children. It would give confidence to parents and children that they need not work out whether they are on a regulated or unregulated service while online. The Government promised that the onus for keeping young people safe online would sit squarely on the tech companies’ shoulders.
Without these amendments, there is a real danger that a loophole will remain whereby some services, even those that are known to harm, are exempt, leaving thousands of children exposed to harm. They would also help to future-proof the Bill. For example, some parts of the metaverse as yet undeveloped may be out of scope, but already specialist police units have raised concerns that abuse rooms, limited to one user, are being used to practise violence and sexual violence against women and girls.
We can and must make this good Bill even better and support all the amendments in this group.
My Lords, as I listen to the words echoing around the Chamber, I try to put myself in the shoes of parents or children who, in one way or another, have suffered as a result of exposure to things happening online. Essentially, the world that we are talking about has been allowed to grow like Topsy, largely unregulated, at a global level and at a furious pace, and that is still happening as we do this. The horses have not just bolted the stable; they are out of sight and across the ocean. We are talking about controlling and understanding an environment that is moving so quickly that, however fast we move, we will be behind it. Whatever mousetraps we put in place to try to protect children, we know there are going to be loopholes, not least because children individually are probably smarter than we are collectively at knowing how to get around well-meaning safeguards.
There are ways of testing what is happening. Certain organisations have used what they term avatars. Essentially, you create mythical profiles of children, which are clearly stated as being children, and effectively let them loose in the online world in various directions on various platforms and observe what happens. The tests that have been done on this—we will go into this in more detail on Thursday when we talk about safety by design—are pretty eye-watering. The speed with which these avatars, despite being openly stated as being profiles of children, are deluged by a variety of content that should be nowhere near children is dramatic and incredibly effective.
I put it to the Minister and the Bill team that one of the challenges for Ofcom will be not to be so far behind the curve that it is always trying to catch up. It is like being a surfer: if you are going to keep going then you have to keep on the front side of the wave. The minute you fall behind it, you are never going to catch up. I fear that, however well-intentioned so much of the Bill is, unless and until His Majesty’s Government and Ofcom recognise that we are probably already slightly behind the crest of the wave, whatever we try to do and whatever safeguards we put in place are not necessarily going to work.
One way we can try to make what we do more effective is the clever, forensic use of approaches such as avatars, not least because I suspect their efficacy will be dramatically increased by the advent and use of AI.
Tim Cook, the CEO of Apple, put it very well:
“Kids are born digital, they’re digital kids now … And it is, I think, really important to set some hard rails around it”.
The truth is that in the area of app stores, Google and Apple, which, as we have heard, have a more than 95% share of the market, are just not voluntarily upholding their responsibilities in making the UK a safe place for children online. There is an air of exceptionalism about the way they behave that suggests they think the digital world is somehow different from the real world. I do not accept that, which is why I support the amendments in the name of my noble friend Lady Harding and others—Amendments 19, 22, 298, 299 and other connected amendments.
There are major holes in the app stores’ child safety measures, which mean that young teens can access adult apps that offer dating, random chats, casual sex and gambling, even when Apple and Google emphatically know that the user is a minor. I will give an example. Using an Apple ID for a simulated 14 year-old, the Tech Transparency Project looked at 80 apps in the App Store that are theoretically limited to those 17 and older. It found that underage users could very easily evade age restrictions in the vast majority of cases. There is a dating app that opens directly into pornography before ever asking the user’s age; adult chat apps filled with explicit images that never ask the user’s age; and a gambling app that lets the minor account deposit and withdraw money.
What kind of apps are we talking about here? We are talking about apps such as UberHoney; Eros, the hook-up and adult chat app; Hahanono—Chat & Get Naughty; and Cash Clash Games: Win Money. The investigation found that Apple and the apps essentially pass the buck to each other when it comes to blocking underage users, making it easy for young teens to slip through the system. My day-to-day experience as a parent of four children completely echoes that investigation, and it is clear to me that Apple and Google just do not share age data with the apps in their app stores, or else children would not be able to download those apps.
There is a wilful blindness to minors tweaking their age. Parental controls on mobile phones are, to put it politely, a joke. It takes a child a matter of minutes to circumvent them—I know from my experience—and I have wasted many hours fruitlessly trying to control these arrangements. That is just not good enough for any business. It is not good enough because so many teenagers have mobile phones, as we discussed—two-thirds of children have a smartphone by the age of 10. Moreover, it is not good enough because they are accessing huge amounts of filthy content, dodgy services and predatory adults, things that would never be allowed in the real world. The Office of the Children’s Commissioner for England revealed that one in 10 children had viewed pornography by the time they were nine years old. The impact on their lives is profound: just read the testimony on the recent Mumsnet forums about the awful impact of pornography on their children’s lives.
To prevent minors from accessing adult-only apps, the most efficient measure would be, as my noble friend Lady Harding pointed out, to check users’ ages during the distribution step, which means directly in the app store or on the web browser, prior to the app store or the internet browser initiating the app or the platform download. This can be done without the developer knowing the user’s specific age. Developing a reliable age-verification regime applied at that “distribution layer” of the internet supply chain would significantly advance the UK’s objective of creating a safer online experience and set a precedent that Governments around the world could follow. It would apply real-world principles to the internet.
This would not absolve any developer, app or platform of their responsibilities under existing legislation—not at all: it would build on that. Instead, it would simply mandate that every player in the ecosystem, right from the app store distribution layer, was legally obliged to promote a safer experience online. That is completely consistent with the principles and aims of the Online Safety Bill.
These amendments would subject two of the biggest tech corporations to the same duties regarding their app stores as we do the wider digital ecosystem and the real world. It is all about age assurance and protecting children. To the noble Lord, Lord Allan, I say that I cannot understand why my corner shop requires proof of age to buy cigarettes, pornography or booze, but Apple and Google think it is okay to sell apps with inappropriate content and services without proper age-verification measures and with systems that are wilfully unreliable.
There is a tremendous amount that is very good about Tim Cook’s commitment to privacy and his objections to the data industrial complex; but in this matter of the app stores, the big tech companies have had a blind spot to child safety for decades and a feeling of exceptionalism that is just no longer relevant. These amendments are an important step in requiring that app store owners step up to their responsibilities and that we apply the same standards to shopkeepers in the digital world as we would to shopkeepers in the real world.
My Lords, I enter this Committee debate with great trepidation. I do not have the knowledge and expertise of many of your Lordships, who I have listened to with great interest. What I do have is experience working with children, for over 40 years, and as a parent myself. I want to make what are perhaps some innocent remarks.
I was glad that the right reverend Prelate the Bishop of Oxford raised the issue of online gaming. I should perhaps declare an interest, in that I think Liverpool is the third-largest centre of online gaming in terms of developing those games. It is interesting to note that over 40% of the entertainment industry’s global revenue comes from gaming, and it is steadily growing year on year.
If I am an innocent or struggle with some of these issues, imagine how parents must feel when they try to cope every single day. I suppose that the only support they currently have, other than their own common sense of course, is age ratings or parental controls. Even the age ratings confuse them, because there are different ratings for different situations. We know that films are rated by the British Board of Film Classification, which also rates Netflix and now Amazon. But it does not rate Disney, which has its own ratings system.
We also know that the gaming industry has a different ratings system: the PEGI system, which links a number to an age. For example, a PEGI 16 rating, if a parent knew this, means that the depiction of violence or sexual activity in the game reaches a stage where it looks realistic. The PEGI system also has pictures showing this.
Thanks to the Video Recordings Act 1984, the PEGI 12, PEGI 16 and PEGI 18 ratings became legally enforceable in the UK, meaning that retailers cannot sell those video games to those below those ages. If a child or young person goes in, they could not be sold those games. However, the Video Recordings Act does not currently apply to online games, meaning that children’s safety in online gaming relies primarily on parents setting up parental controls.
I will listen with great interest to the tussles between various learned Lords, as all these issues show to me that perhaps the most important issue will come several Committee days down the path, when we talk about media literacy. That is because it is not just about enforcement, regulation or ratings; it is about making sure that parents have the understanding and the capacity. Let us not forget this about young people: noble Lords have talked about them all having a phone and wanting to go on pornographic sites, but I do not think that is the case at all. Often, young people, because of peer pressure and because of their innocence, are drawn into unwise situations. Then there are the risks that gaming can lead to: for example, gaming addiction, which was mentioned by the right reverend Prelate the Bishop of Oxford. There is also the health impact and maybe a link with violent behaviour. There are the risks arising from the interactive nature of video games, cyberbullying and the lack of a feeling of well-being. All these things can happen, which is why we need media literacy to ensure that young people know of those risks and how to cope with them.
The other thing that we perhaps need to look at is standardising some of the simple gateposts that we currently have, hence the amendment.
My Lords, it is a pleasure to follow the noble Lord, Lord Storey. I support Amendments 19, 22 and so on in the name of my noble friend Lady Harding, on app stores. She set them out so comprehensively that I am not sure there is much I can add. I simply want to thank her for her patience as she led me through the technical arguments.
I support these amendments as I accept, reluctantly, that children are becoming more and more independent on the internet. I have ummed and ahhed about where parental responsibility starts and ends. I have a seven year-old, a 10 year-old and a 12 year-old. I do not see why any seven year-old, frankly, should have a smartphone. I do not know why any parent would think that is a good idea. It might make me unpopular, but there we are. I accept that a 12 year-old, realistically, has to have a smartphone in this day and age.
I said at Second Reading that Covid escalated digital engagement. It had to, because children had to go onto “Seesaw” and various other apps to access education. As a result, their social lives changed. They became faster and more digital. It seems to be customary to stand up and say that this Bill is very complicated, but at the end, when it passes after all this time, the Government will rightly want to go to parents and say, “We’ve done it; we’ve made this the safest place in the world to be online”.
Unless we support my noble friend’s amendments and can say to parents that we have been holistic about this and recognised a degree of parental responsibility but also the world that children will go into and how it may change—we have heard about the possibility of more app stores, creating a more confusing environment for parents and young people—I do not think we can confidently, hand on heart, say that we achieved what this Bill set out to achieve. On that note, I wholeheartedly support my noble friend’s amendments.
My Lords, one of our clergy in the diocese of Guildford has been campaigning for more than a decade, as have others in this Committee, on children’s access to online pornography. With her, I support the amendments in the names of the noble Baronesses, Lady Kidron and Lady Harding.
Her concerns eventually made their way to the floor of the General Synod of the Church of England in a powerful debate in July last year. The synod voted overwhelmingly in favour of a motion, which said that we
“acknowledge that our children and young people are suffering grave harm from free access to online pornography”
and urged us to
“have in place age verification systems to prevent children from having access to those sites”.
It asked Her Majesty’s Government to use their best endeavours to secure the passage and coming into force of legislation requiring age-verification systems preventing access by people under the age of 18. It also recommended more social and educational programmes to increase awareness of the harms of pornography, including self-generated sexually explicit images.
Introducing the motion, my chaplain, Reverend Jo Winn-Smith, said that age verification
“ought to be a no-brainer … Exposure to sexualised material is more likely to lead to young people engaging in more sexualised behaviour and to feel social pressure to have sex”,
as well as normalising sexual violence against girls and women. A speech from the chaplain-general of the Prison Service towards the end of the debate highlighted just where such behaviours and pressures could lead in extreme circumstances.
One major theme that emerged during the debate is highlighted by the amendments this afternoon: that access to online pornography goes far beyond materials that fall into what the Bill defines as Part 5 services. Another is highlighted in a further group of amendments: age assurance needs to be both mandatory and effective beyond reasonable doubt.
It was also noted how this whole area has taken such an age to get on to the statute book, given David Cameron’s proposals way back in 2013 and further legislation proposed in 2018 that was never enacted. Talk of secondary legislation to define harmful content in that regard is alarming, as a further amendment indicates, given the dragging of feet that has now been perpetuated for more than a decade. That is a whole generation of children and young people.
In an imaginative speech in the synod debate, the most reverend Primate the Archbishop of York, Archbishop Stephen, reminded us that the internet is not a platform; it is a public space, where all the rights and norms you would expect in public should apply. In the 1970s, he continued, we famously put fluoride in the water supply, because we knew it would be great for dental health; now is the opportunity to put some fluoride into the internet. I add only this: let us not water down the fluoride to a point where it becomes feeble and ineffective.
My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.
As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.
The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.
While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Up until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data. It has perhaps been deemed that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House the statistics of how children are accessing pornography and the harm it causes. The Children’s Commissioner also recently highlighted the issue and concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be to ensure that children are protected online.
I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.
Children must be kept safe wherever they are online. This Bill must have the widest scope possible to keep children safe, but ensuring online/offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far reaching as possible but consistently applied across the online/offline world. These are the reasons why I support the amendments in this group.
My Lords, I will lend my support to Amendments 19 and 22. It is a pleasure to speak after the noble Baroness, Lady Benjamin. I may be one of those people in your Lordships’ House who relies significantly on the British Board of Film Classification for movie watching, as I am one of the faint-hearted.
In relation to app stores, it is not only children under 18 for whom parents need age verification. If you are the parent of a child with significant learning delay, the internet is a wonderful place where they can access material and develop in ways they might not ordinarily have done. But, of course, turning 17 or 18 is not a meaningful threshold for them. I have friends whose children have significant learning delay, and the assurance of knowing which apps are which in the app store matters to them well beyond the age of 18. Obviously it will not be a numerical equivalent for their child—now a young adult—but it is important to them to know that the content they get from a free app, or an app purchased from the app store, is suitable.
I just wanted to raise that with noble Lords, as children and some vulnerable adults—not all—would benefit from the kind of age verification we have talked about. I appreciate the points that the noble Lord, Lord Allan, raised about where the Bill has ended up conceptually and the framework that Ofcom will rely on. Like him, I am a purist sometimes but, pragmatically, I think that the third concept raised by the noble Baroness, Lady Kidron, of protection within the app store, bringing apps into parallel with things such as the classification of films and other video games, is really important.
My Lords, this has been a really fascinating debate and I need to put a stake in the ground pretty early on by saying that, although my noble friend Lord Allan has raised some important points and stimulated an important debate, I absolutely agree with the vast majority of noble Lords who have spoken in favour of the amendment so cogently put forward by the noble Baronesses, Lady Kidron and Lady Harding.
Particularly as a result of the Bill’s being the subject of a Joint Committee, it has changed considerably over time in response to comment, pressure, discussion and debate, and I believe very much that during Committee we will be able to make further changes; I hope the Minister will be flexible enough. I do not believe that the framework of the Bill is set in concrete. There are many things we can do as we go through, particularly in the field of making children safer, if we take on board some of the amendments that have been put forward. In particular, the noble Baroness, Lady Kidron, set out why the current scope of the Bill will fail to protect children if it is kept to user-to-user and search services. She talked about blogs with limited functionality and games without user-to-user functionality, and mentioned the whole immersive environment, which the noble Lord, Lord Russell, described as eye-watering. As she said, it is not fair to leave parents or children to work out whether they are on a regulated service. Children must be safe wherever they are online.
As someone who worked with the noble Baroness, Lady Kidron, on putting the age-appropriate design code in place in the original Data Protection Act, I am a fervent believer that it is perfectly appropriate to extend it in the way proposed today. I also support her second amendment, which would bring the Bill’s child user condition into line with the threshold of the age-appropriate design code. It is the expectation—and I do not think it an unfair expectation—of parents, teachers and children themselves that the Bill will apply to children wherever they are online. Regulating only certain services will mean that emerging technologies that do not fit these rather narrow categories will not be subject to safety duties.
The noble Baroness talked about thousands of children being potentially at risk of not having the protection of the Bill. That is absolutely fair comment. Our Joint Committee report said:
“We recommend that the ‘likely to be accessed by children’ test in the draft Online Safety Bill should be the same as the test underpinning the Age Appropriate Design Code”.
The Government responded:
“The government considers that the approach taken in the Bill is aligned with the Age Appropriate Design Code and will ensure consistency for businesses. In addition, the status of the legislative test in the Online Safety Bill is binding in a way that the test in the Age Appropriate Design Code is not”.
In that case, since both those statements are patently not true of the current Bill, it is incumbent on the Government to change the Bill in the direction that the noble Baroness has asked for.
My noble friend stimulated a very important debate about the amendments of the noble Baroness, Lady Harding, in particular. That is another major potential omission from the Bill. The tech giants responsible for the distribution of nearly all the apps connecting smartphone users to the internet are not currently within the scope of the Bill. She said that the online safety regime must look at this whole area much more broadly. App stores should be added to the list of service providers that will be mandated by the Bill to protect children and all users online. I am not going to go into all the arguments that have been made so well by noble Lords today, but of course Google and Apple have a duopoly on app distribution, yet they do not verify users’ ages. They have the technical ability to prevent minors accessing certain applications reserved for adults, as evidenced by the existing parental control functions on both smartphone operating systems and their corresponding app stores, and of course, as the noble Baroness, Lady Berridge, said, this applies not just to children but to vulnerable adults as well.
I thought the noble Lord, Lord Bethell, put it very well: other sectors of the economy have already implemented such control in the distribution of goods and services in the offline world; alcohol consumption provides a good example for understanding those issues. Why cannot Google and Apple have duties that a corner store can adhere to? App stores do not have age assurance systems in place and do not actually seem to wish to take any responsibility for the part they can play in permitting harms. I say to my noble friend that the word “store” is the clue: these are products being sold through the app store and there should be age-gating on those apps. The only way to improve safety is to make sure that app developers and companies that distribute these apps do more to ensure that children and vulnerable adults are appropriately kept away from adult applications and content. That is an entirely reasonable duty to place on them: it is an essential part, I think, of the framework of the Bill that we should take these sets of amendments on board.
The right reverend Prelate the Bishop of Oxford talked about the fact that harms will only increase in coming years, particularly, as he said, with ever younger children having access to mobile technology. Of course, I agree with my noble friend about the question of media literacy. This goes hand in hand with regulation, as we will discover when we talk about this later on. These amendments will not, in the words of my noble friend, break the internet: I think they will add substantially and beneficially to regulation.
I say to my noble friend Lord Storey that I support his amendments too; they are more like probing amendments. There is a genuine gap that I think many of us were not totally aware of. I assumed that, in some way, the PEGI classifications applied here, but if age ratings do not apply to online games, that is a major gap. We need to look at that very carefully, alongside these amendments, which I very much hope the Minister will accept.
My Lords, I echo the comments of the noble Lord, Lord Clement-Jones. This is an important group of amendments, and it has been a useful debate. I was slightly concerned when I heard the noble Baroness, Lady Harding, talk about using her daughter’s device to see whether it could access porn sites in terms of what that is going to do to her daughter’s algorithm and what it will now feed her. I will put that concern to one side, but any future report on that would be most welcome.
Amendments 2, 3 and 5, introduced so well by the noble Baroness, Lady Kidron, test what should be in scope to protect children. Clearly, we have a Bill that has evolved over some time, with many Ministers, to cover unambiguously social media, as user-to-user content, and search. I suspect that we will spend a lot more time discussing social media than search, but I get the rationale that those are perhaps the two main access points for a lot of the content we are concerned about. However, I would argue that apps are also main access points. I will come on to discuss the amendments in the name of the noble Baroness, Lady Harding, which I have also signed. If we are going to go with access points, it is worth probing and testing the Government’s intent in excluding some of these other things. The noble Lord, Lord Storey, raises in his amendments the issue of games, as others have done. Games are clearly a point of access for lots of children, as well as adults, and there is plenty of harm that can be created as a result of consuming them.
Along with some other noble Lords, I attended, some time ago, an all-party group meeting which looked at the problems of incel harm online and how people are breadcrumbed from mainstream sites to quite small websites to access the really problematic, most hateful and most dangerous content. Those small websites, as far as I can see, are currently excluded from the regime in the Bill, but the amendments in the name of the noble Baroness, Lady Kidron, would potentially bring them into scope. That meeting also discussed cloud services and the supply chain of technical infrastructure that such groups, including incels and others, rely on. Why are cloud services not included in some context, in terms of the harms that might be created?
Questions have been asked about large language model AIs such as ChatGPT. These are future technologies that have now arrived, which lots of people are talking about and variously freaking out about or getting excited by. There is an important need to bring those quite quickly into the scope of regulation by Ofcom. ChatGPT is a privately owned platform—a privately owned technology—that is offering up not only access to the range of knowledge that is online but, essentially, the range of human concepts that are online in interaction with that knowledge—privately owned versions of truth.
What is to stop any very rich individual deciding to start their own large language model with their own version of the truth, perhaps using their own platform? Former President Trump comes to mind as someone who could do that and I suggest that, if truth is now a privatised thing, we might want to have some regulation here.
The future-proofing issues are why we should be looking very seriously at the amendments in the name of the noble Baroness, Lady Kidron. I listened carefully to the noble Lord, Lord Allan, as always, and I have reflected a lot on his very useful car safety and plane safety regulation analogy from our previous day in Committee. The proportionality issue that he raised in his useful contribution this time is potentially addressed by the proposed new clause we discussed last time. If the Bill sets out quite clearly the aim of the legislation, that would set the frame for the regulator and for how it would regulate proportionately the range of internet services that might be brought into scope by this set of amendments.
I also support Amendment 92, on bringing in safety by design and the regime that has been so successful in respect of the age-appropriate design code, with its test of the likelihood of access by children, rather than what is set out in the Bill.
I turn to Amendments 19, 22, 298 and 299 in the names of the noble Baronesses, Lady Harding and Lady Stowell, the noble Lord, Lord Clement-Jones, and myself. Others, too, have drawn the analogy between app stores and corner shops selling alcohol, and it makes sense to think about the distribution points in the system—the pinch points that all users go through—and to see whether there is a viable way of protecting people and regulating through those pinch points. The Bill seeks to protect us via the platforms that host and promote content having regulation imposed on them, and risk assessments and so on, but it makes a lot of sense to add app stores, given how we now consume the internet.
I remember, all those years ago, having CD drives—floppy disk drives, even—in computers, and going off to buy software from a retail store and having to install it. I do not go quite as far back as the right reverend Prelate the Bishop of Oxford, but I remember those days well. Nowadays, as consumers, almost all of us access our software through app stores, be it software for our phones or for our laptops. That is the distribution point for mobiles, and essentially it is, as others have said, a duopoly that we hope will be addressed by the Digital Markets, Competition and Consumers Bill.
As others have said, 50% of children under 10 in this country use smartphones and tablets. When you get to the 12 to 15 bracket, you find that 97% of them use mobile phones and tablets. We have, as noble Lords have also said, Google Family Link and the Apple Family Sharing function. That is something we use in my family. My stepdaughter is 11—she will be 12 in June—and I appear to be in most cases the regulator who has to give her the Family Link code to go on to Google Classroom when she does her homework, and who has to allow her to download an app or add another contact—there is a whole range of things on her phone for which I provide the gatekeeper function. But you have to be relatively technically competent and confident to do all those things, and to manage her screen time, and I would like to see more protection for those who do not have that confidence—and indeed for myself as well, because maybe I would not have to be bothered quite as often.
It is worth noting that the vast majority of children in this country who have smartphones—the last time I looked at the stats, it was around 80%—have iPhones; there must be a lot of old iPhones that have been recycled down the family. To have an iCloud account, if you are under 13, you have to go through a parent or other suitable adult. However, if you are over 13, you can get on with it; that raises a whole set of issues and potential harms for children over the age of 13.
I am less familiar with the user journey and how it works on Google Play—we are more of an Apple family—but my understanding is that, for both Google Play and the Apple App Store, you need credit card billing information in order to set up an account. That provides a form of identity verification, which in turn gives the additional safeguard for children that many of us are looking for. No one is arguing that this should replace the responsibilities set out in the Bill for internet service providers—for example, that they should carry out risk assessments and be regulated. This is about having additional safeguards at the point of distribution. We are not asking Apple and Google, in this case, to police the apps. We are asking them to ensure that the publishers of applications set an age limit, and then to help ensure that that age limit is adhered to, according to everything they know about the user of the device and their age. I am grateful to the noble Baroness, Lady Harding, for her amendments on this important issue.
Finally, let me say this in anticipation of the Minister perhaps suggesting that, while this might be a good idea, we are too far down the road with the Bill, Ofcom is ready to go and we want to get on with implementing it, so we should not do this now but perhaps in another piece of legislation. Personally, I am interested in having a conversation about the sequence of implementation. It might be that we can implement the regime that Ofcom is good to go on, while putting powers in the Bill for it to cover app stores and some other wider internet services, according to a road map that it sets out and that we in Parliament can scrutinise. However, my general message, as the noble Baroness, Lady Kidron, said, is that we should get this right in this legislation and grab the opportunity, particularly with app stores, to bring other internet services in—given that we consume so much through applications—and to provide a safer environment for our children.
My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.
As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.
As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.
I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.
These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.
Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—
I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?
If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.
On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantial” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so the amendment would risk making the condition more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.
The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.
I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?
They are when they enable users to share content online and interact with each other, or when they are used in search. They apply in the context of the other duties set out in the Bill.
Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.
As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.
We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.
The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from providing material that would cause problems for the services to which they allow access. Can he confirm that?
Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.
It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?
Services already have to comply with their duties to keep children safe. If they do not, Ofcom has the enforcement powers set out in the Bill, which enable it to require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how it would work. As I say, we think this is already covered; a more general duty here would risk distracting from Ofcom’s existing priorities.
My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.
I will be very happy to set that out in more detail.
Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.
The Government work closely with the industry and with the Video Standards Council to promote and encourage the displaying of Pan-European Games Information—PEGI—age ratings online. That has contributed to almost 3 million online games being given such ratings, including Roblox, mentioned by the right reverend Prelate the Bishop of Oxford. Most major online storefronts have made it mandatory for game developers supplying digital products on their platforms to obtain and display PEGI ratings. These include Google Play, Microsoft, PlayStation, Nintendo, Amazon Luna and Epic. Apple uses its own age ratings, rather than PEGI ratings, on all the video games available on its App Store.
Online games in the UK can obtain PEGI ratings by applying directly to the Video Standards Council or via the International Age Rating Coalition system, which provides ratings based on answers to a questionnaire completed when a game is uploaded. That system ensures that, even with unprecedented volumes of online video games—and the noble Lord is right to point to the importance of our creative industries—all digital content across most major digital storefronts can carry a PEGI rating. These ratings are regularly reviewed by international regulators, including our own Video Standards Council, and adjusted within hours if found to be incorrect.
I hope that gives the noble Lord the reassurance that the points he is exploring through his amendments are covered. I invite him not to press them and, with a promise to continue discussions on the other amendments in this group, I invite their proposers to do the same.
I thank the Minister for an excellent debate; I will make three points. First, I think the Minister was perhaps answering on my original amendment, which I have narrowed considerably to services
“likely to be accessed by children”
and with proven harm on the basis of the harms described by the Bill. It is an “and”, not an “or”, allowing Ofcom to go after places that have proven to be harmful.
Secondly, I am not sure the Government can have it both ways—saying that it is the same as the age-appropriate design code but different in these ways—because it is exactly where it differs that I am suggesting the Government might improve it. We will come back to both those things.
Finally, what are we asking for here? We are asking for a risk assessment. If it finds no risk, there is no harm, no mitigation, nothing to do. This is a major principle of the conversations we will have over a number of days. I also believe in proportionality. It is basic product safety: you have a look, you have standards, and if there is nothing to do, let us not make people do silly things. I think we will return to these issues, because they are clearly deeply felt and very practical, and my own feeling is that we cannot risk thousands of children not benefiting from all the work that Ofcom is going to do. With that, I beg leave to withdraw the amendment.
Amendment 2 withdrawn.
Amendment 3 not moved.
Amendment 4
Moved by
4: Clause 3, page 3, line 17, leave out paragraphs (a) and (b) and insert “the service has at least one million monthly United Kingdom users.”
Member’s explanatory statement
This amendment replaces the two tests currently set out in subsection (5) of clause 3, relating to a service’s links with the United Kingdom, with a requirement that the service have at least a million monthly United Kingdom users.
My Lords, in moving Amendment 4, I will also speak to Amendments 6 to 8 and 12 and to the consequential Amendments 288 and 305, largely grouped under the heading of “exemptions”. In this group, I am also particularly sympathetic to Amendment 9 in the names of the noble Lords, Lord Moylan and Lord Vaizey, and I will leave them to speak to it. I look forward to hearing from the noble Lord, Lord Knight, an explanation of his Amendment 9A.
Last Wednesday we discussed the purposes of the Bill, and there was much agreement across the Chamber on one issue at least: that we need to stay focused and make sure that an already highly complex piece of legislation does not become even more unwieldy. My concern in general is that the Bill already suffers throughout from being overly broad in its aims, with the result that it restricts the online experience and expression of everyone. This series of amendments is about trying to rein in the scope, allowing us to focus on clear targets rather than a one-size-fits-all Bill that sweeps all in its wake, with perhaps unintended and damaging consequences.
The Bill creates an extraordinary set of regulatory burdens on tens of thousands of British businesses, micro-communities and tech platforms, no matter their size. The impact assessment claims that 25,000 businesses are in scope, and that is considered a conservative estimate. This implies that an extraordinary range of platforms, from Mumsnet and Wikipedia to whisky-tasting forums and Reddit, will be caught up in this Bill. Can we find a way of removing the smaller platforms from scope? It will destroy too many of them if they have to comply with a regulatory burden created with huge Silicon Valley behemoths in mind.
Let us consider some of the regulatory duties that these entities are expected to comply with. They will need to undertake extensive assessments that must be repeated whenever a product changes. They will need to proactively remove certain types of content, involving assessing the risk of users encountering each type of illegal content, the speed of dissemination and functionality, the design of the platform and the nature and severity of the risk of harms presented to individual users. This will mean assessing their user base and implementing what are effectively surveillance systems to monitor all activity on their platforms.
Let us consider what a phrase such as “prevent from encountering” would mean to a web host such as Wikipedia. It would mean that it would need to scan and proactively analyse millions of edits across 250 languages for illegality under UK-specific law and then block content in defiance of the wishes of its own user community. There is much more, of course. Rest assured, Ofcom’s guidance and risk assessment will, over time, increase the regulatory complexity and the burdens involved.
Those technological challenges do not even consider the mountain of paperwork and administrative obligations that will be hugely costly and time consuming. All that might be achievable, if onerous, for larger platforms. But for smaller ones it could prove a significant problem, with SMEs and organisations working with a public benefit remit particularly vulnerable. Platforms with the largest profits and the most staff dedicated to compliance will, as a consequence, dominate at the expense of start-ups, small companies and community-run platforms.
No doubt the Government and the Minister will assure us that the duties are not so onerous and that they are manageable and proportionate. The impact assessment estimates that implementing the Bill will cost businesses £2.5 billion over the first 10 years, but all the commentators I have read think this is likely to be a substantial underestimate, especially when we are told in the same impact assessment that the legal advice is estimated to cost £39.23 per hour. I do not know what lawyers the Government hang out with, but they appear not to have a clue about the going rate for specialist law firms.
Also, what about the internal staff time? Again, the impact assessment assumes that staff will require only 30 minutes to familiarise themselves with the requirements of the legislation and 90 minutes to read, assess and change the terms and conditions in response to the requirements. Is this remotely serious? Even working through the groups of amendments has taken me hours. It has been like doing one of those 1,000-piece jigsaws, but at least at the end of those you get to see the complete picture. Instead, I felt as though somebody had come in and thrown all the pieces into the air again. I was as confused as ever.
If dealing with groups of amendments to this Bill is complex, that is nothing on the Bill itself, which is dense and often impenetrable. Last week, the Minister helpfully kept telling us to read the Explanatory Notes. I have done that several times and I am still in a muddle, yet somehow the staff of small tech companies will conquer all this and the associated regulatory changes in an hour and a half.
Many fear that this will replicate the worst horrors of GDPR, which, according to some estimates, led to an 8% reduction in the profits of smaller firms while it had little or no effect on the profits of large tech companies. That does not even take into account the cost of the near nervous breakdowns that GDPR caused small organisations, as I know from my colleagues at the Academy of Ideas.
These amendments try to tackle this disproportionate burden on smaller platforms—companies that are, ironically, often a useful challenge and antidote to big tech’s dominance. The amendments would exempt them unless there is a good reason for a specific platform to be in scope. Of course, cutting down the number of services in scope may not appeal to everyone here. From looking at the ever-increasing amendments list, it seems that some noble Lords have an appetite for expanding the number of services the legislation will apply to; we have already heard the discussion about app stores and online gaming. But we should note that the Government have already carved out exemptions for certain services, excluding emails, SMS messages, one-to-one oral communications and so on from the new regulatory system. I am suggesting some extra exemptions, and that we remove from scope services with fewer than 1 million monthly UK users. Ofcom would have the power to issue a provider with a notice bringing it into scope, but only on reasonable grounds, having identified a safety risk and with 30 days’ notice.
If we do not tackle this, I fear there is a substantial, serious and meaningful risk that smaller platforms based outside and inside the UK will become inaccessible to British users. It is notable that over 1,000 US news websites blocked European users when the EU introduced GDPR, if noble Lords remember. Will there be a similar response to this law? What, for example, will the US search engine DuckDuckGo conclude? That search engine emphasises privacy and refuses to gather information on its users, meaning that it will be unable to fulfil the duties in the Bill to identify users’ ages or to tailor search results to users based on their age. Are we happy for it to go?
I fear that this Bill will reduce the number of tech platforms operating in the UK. This is anti-competitive. I do not say that because I have a particular commitment to competition and the free market, by the way. I do so because competition is essential and important for users’ choice and empowerment, and for free speech—something I fear the Bill is threatening. Indeed, the Lords’ Communications and Digital Committee’s extensive inquiry into the implications of giving large tech companies what is effectively a monopoly on defining which speech is free concluded:
“Increasing competition is crucial to promoting freedom of expression online. In a more competitive market, platforms would have to be more responsive to users’ concerns about freedom of expression and other rights”.
That is right. If users are concerned that a platform is failing to uphold their freedom of expression, they can join a different platform with greater ease if there is a wide choice. Conversely, users who are concerned that they do not want to view certain types of material would be more easily able to choose another platform that proscribes said material in its terms and conditions.
I beg to move the amendment as a way of defending diversity, choice and innovation—and as a feeble attempt to make the Bill proportionate.
My Lords, before I speak to my Amendment 9, which I will be able to do fairly briefly because a great deal of the material on which my case rests has already been given to the Committee by the noble Baroness, Lady Fox of Buckley, I will make the more general and reflective point that there are two different views in the Committee that somehow need to be reconciled over the next few weeks. There is a group of noble Lords who are understandably and passionately concerned about child safety. In fact, we all share that concern. There are others of us who believe that this Bill, its approach and the measures being inserted into it will have massive ramifications outside the field of child safety, for adults, of course, but also for businesses, as the noble Baroness explained. The noble Baroness and I, and others like us, believe that these are not sufficiently taken into account either by the Bill or by those pressing for measures to be harsher and more restrictive.
Some sort of balance needs to be found. At Second Reading my noble friend the Minister said that the balance had been struck in the right place. It is quite clear that nobody really agrees with that, except on the principle, which I think is always a cop-out, that if everyone disagrees with you, you must be right, which I have never logically understood in any sense at all. I hope my noble friend will not resort to claiming that he has got it right simply because everyone disagrees with him in different ways.
My amendment is motivated by the considerations set out by the noble Baroness, which I therefore do not need to repeat. It is the Government’s own assessment that between 20,000 and 25,000 businesses will be affected by the measures in this Bill. A great number of those—some four-fifths—are small businesses or micro-businesses. The Government appear to think, in their assessment, that only 120 of those are high risk. The reason they think those businesses are high risk is not that they are engaged in unpleasant activities but simply that they are engaged in livestreaming and contacting new people. That might be for nefarious purposes but equally it might not, so the number we actually need to worry about could be very small indeed. We already handle this through our own laws: all these businesses would still be subject to existing data protection law and would still have to comply with the law generally on what they are allowed to publish and broadcast. It would not be a free-for-all or a wild west, even among that very small number of businesses.
My Amendment 9 takes a slightly different approach to dealing with this. I do not in any way disagree with or denigrate the approach taken by the noble Baroness, Lady Fox, but my approach would be to add two categories to the list of exemptions in the schedules. The first of these is services provided by small and medium-sized enterprises. We do not have to define those because there is already a law that helps define them for us: Section 33 of the Small Business, Enterprise and Employment Act 2015. My proposal is that we take that definition, and that those businesses that comply with it be outside the scope of the Bill.
The second area that I would propose exempting was also referred to by the noble Baroness, Lady Fox of Buckley: community-based services. The largest of these, and the one that frequently annoys us because it gets things wrong, is Wikipedia. I am a great user of Wikipedia but I acknowledge that it does make errors. Of course, most of the errors it makes, such as saying, “Lord Moylan has a wart on the end of his nose”, would not be covered by the Bill anyway. Nothing in the Bill will force people to correct factual statements that have been got wrong—my year of birth or country of birth, or whatever. That is not covered. Those are the things they usually get wrong and that normally annoy us when we see them.
However, I do think that these services are extremely valuable. Wikipedia is an immense achievement and a tremendous source of knowledge and information for people. The fact that it has been put together in this organic, community-led way over a number of years, in so many languages, is a tremendous advantage and a great human advance. Yet, under the proposed changes, Wikipedia would not be able to operate its existing model of people posting their comments.
Currently, you can go on Wikipedia and edit it. Now, I know this would not apply to any noble Lords but, in the other place, it has been suggested that MPs have discovered how to do this: they illicitly and secretly go on and edit their own pages, usually in a flattering way. So it is possible. There is no prior restraint and no checking in advance. There are moderators at Wikipedia—I do not know whether they are employed—who review what has been done over a period, but they do not do what this Bill requires, which is checking in advance.
It is not simply about Wikipedia; there are other community sites. Is it sensible that Facebook should be responsible if a little old lady alters the information on a community Facebook page about what is happening in the local parish? Why should Facebook be held responsible for that? Why would we want it to be responsible for it—and how could it do it without effectively censoring ordinary activities that people want to carry out, using the advantages of the internet that have been so very great?
What I am asking is not dramatic. We have many laws in which we very sensibly create exemptions for small and medium-sized enterprises. I am simply asking that this law be considered under that heading as well, and similarly for Wikipedia and community-based sites. An exemption of that second kind is slightly more unusual, but it is very relevant to this Bill, and I very much hope the Government will agree to it.
The answer that I would not find satisfactory—I say this in advance for the benefit of my noble friend the Minister, in relation to this and a number of other amendments I shall be moving in Committee—is that it will all be dealt with by Ofcom. That would not be good enough. We are the legislators and we want to know how these issues will be dealt with, so that the legitimate objectives of the Bill can be achieved without causing massive disruption, cost and disbenefit to adults.
My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.
The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.
There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.
The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.
Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.
Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.
Such a provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users rather than by a single professional author. I hope the Minister will be able to indicate that the political intent is not to ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might find that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower-priority issues?
I will speak to a couple of the amendments in this group. First, small is not safe, and you cannot necessarily see these platforms in isolation. For example, there is an incel group that has only 4,000 active users, but it posts a great deal on YouTube and has 24.2 million users in that context. So we have to be clear that small and safe are not the same thing.
However, I am sympathetic to the risk-based approach. I should probably have declared an interest as someone who has given money to Wikipedia on several occasions to keep it going. I ask the Minister for some clarity on the systems and processes of the Bill, and whether the risk profile of Wikipedia—which does not entice you in and then follow you for the next six months once you have looked at something—is far lower than something very small that gets hold of you and keeps on going. I say that particularly in relation to children, but I feel it for myself also.
Finally, to the noble Lords promoting this group of amendments I say that I would be very supportive if they could find some interventions that simplify the processes companies have to go through in the early stages to establish levels of risk, so that we can then get heavy on the mitigation of harm. That is something on which we all agree: if we could find a very low bar of entry, check whether there is harm and then escalate, I believe that is something we could all work on together.
My Lords, I will speak to Amendment 4 in the name of the noble Baroness, Lady Fox of Buckley.
At Second Reading, my noble friend Lord Morrow raised the point that the Bill needs to cover all online pornography. A factsheet on the Bill, helpfully circulated to Peers last week by the Government, says:
“The Bill’s regulatory framework will cover all online sites with pornographic content, including commercial pornography sites, social media, video-sharing platforms and fora. It will also cover search engines, which play a significant role in enabling children to access pornography”.
This is a welcome commitment but I would like to explore it further.
The Government say “all”, but the definition of which services are in scope of the Bill, as set out in Clause 3(5) and Clause 71(4), requires that there are either
“a significant number of United Kingdom users, or … United Kingdom users form one of the target markets for the service (or the only target market)”.
At Second Reading, my noble friend Lord Morrow asked the Minister what will be considered as “significant”. Is it significant in terms of the total UK adult users who could use a service, or significant in terms of potential global users?
The noble Baroness, Lady Fox of Buckley, is exploring the same issue in her Amendment 4. She is proposing that the Bill’s current definition be replaced with something much easier to understand: that a site must have at least 1 million users per month in the UK to be within the scope of the Bill. That definition is certainly clear. However, I am looking forward to hearing whether it reflects the Government’s intention. For my part, I am concerned about what it might mean for clarifying which pornographic websites would fall into Part 3.
In December, the Government published an analysis carried out in January 2021 by the British Board of Film Classification of the top 200 pornographic websites. It reported that those 200 sites received 76% of total UK visits to adult sites, based on data from August 2020. Ofcom published a similar list of the top 10 sites visited in September 2020; the site at number 10 had 3.8 million visitors. We do not know how many visitors there were to websites 100 or 200, but it is not unreasonable to speculate that it could be fewer than a million, which would fall outside the definition proposed by the noble Baroness; nor is it clear whether those websites would fall within the Government’s original definition.
It is important for the Minister to tell the Committee quite clearly whether he expects the top 200 pornographic websites to be within the scope of Parts 3 and 5 of the Bill. If he does, I ask him to explain how that will be possible within the current definition in the Bill, not because I am trying to trip him up but as a genuine inquiry as to whether the Bill does what we expect it to do. If he does not expect the top 200 pornographic websites to be in scope, how many does he estimate would fall within Parts 3 and 5? Either way, it seems to me that there could be pornographic websites accessed in the United Kingdom that are not required to have age verification to protect those aged under 18 from accessing this content.
As I said, I doubt that this is what parents expect from this flagship Bill, especially as the Government set out in their factsheet that their own commissioned evidence says,
“exposure to pornography may impact children's perceptions of sex and relationships, may lead to replication of practices found in pornography, increased likelihood of engaging in sexual activities and harmful or aggressive behaviour, and reduced concern for consent from partners”.
It seems to me that “significant” should focus on the significant harm a website or content provider would cause if accessed in the UK. The number of visitors or the popularity of the site should be irrelevant when considering whether children should be allowed to access it. My view is quite simple: if a website, social media service or content provider wishes to host pornographic material, which poses potentially significant harm to children, it should be age-verified. I am therefore interested, given what the Government have said previously, to know whether the Minister agrees that all pornographic content must be age-verified if it is to be accessed in the UK. That is certainly what I believe most parents expect, and I will listen carefully to the Minister’s response.
I will speak in support of my noble friend Lord Moylan and Amendment 9. I declare an interest as an author and publisher.
Last week, we had the London Book Fair, and proposed new paragraph 10A could almost read like an executive summary of its main talking point, which was how AI will influence all aspects of the media, but particularly publishing. For the sake of future-proofing, paragraph 10A would be a particularly useful step to adopt. Proposed new paragraph 10B would be in the interests of fairness, because publishing, and much of the media, is made up of micro-businesses, often one-man or one-woman companies. This is certain to happen with AI as well, as intermediary roles are taken up by such micro-businesses. In the interests of future-proofing and fairness, I recommend this amendment.
My Lords, as my name is on Amendment 9, I rise to support these amendments and to say that they are worthy of debate. As your Lordships know, I am extremely supportive of the Bill and hope that it will be passed in short order. This much-needed and overdue legislation will provide us with a regulator that is able to hold platforms to account, protect users where it can and enhance child safety online. I can think of no better regulator for that role than Ofcom.
I have listened to the debate with great interest. Although I support the intentions of my noble friend Lord Moylan’s amendment, I am not sure I agree with him that there are two cultures in this House, as far as the Bill is concerned; I think everybody is concerned about child safety. However, these amendments are right to draw attention to the huge regulatory burden that this legislation can potentially bring, and to the inadvertent bad consequences it will bring for many of the sites that we all depend upon and use.
I have not signed many amendments that have been tabled in this Committee because I have grown increasingly concerned, as has been said by many others, that the Bill has become a bit like the proverbial Christmas tree where everyone hangs their own specific concern on to the legislation, turning it into something increasingly unwieldy and difficult to navigate. I thought the noble Baroness, Lady Fox, put it extremely well when she effectively brought to life what it would be like to run a small website and have to comply with this legislation. That is not to say that certain elements of micro-tweaking are not welcome—for example, the amendment by the noble Baroness, Lady Kidron, on giving coroners access to data—but we should be concerned about the scope of the Bill and the burden that it may well put on individual websites.
This is in effect the Wikipedia amendment, put forward and written in a sort of wiki way by this House—a probing amendment in Committee to explore how we can find the right balance between giving Ofcom the powers it needs to hold platforms to account and not unduly burdening websites that all of us agree present a very low risk and whose provenance, if you like, does not fit easily within the scope of the Bill.
I keep saying that I disagree with my noble friend Lord Moylan. I do not—I think he is one of the finest Members of this House—but, while it is our job to provide legislation to set the framework for how Ofcom regulates, we in this House should also recognise that in the real world, as I have also said before, this legislation is simply going to be the end of the beginning. Ofcom will have to find its way forward in how it exercises the powers that Parliament gives it, and I suspect it will have its own list of priorities in how it approaches these issues, who it decides to hold to account and who it decides to enforce against. A lot of its powers will rest not simply on the legislation that we give it but on the relationship that it builds with the platforms it is seeking to regulate.
For example, I have hosted a number of lunches for Google in this House with interested Peers, and it has been interesting to get that company’s insight into its working relationship with Ofcom. By the way, I am by no means suggesting that that is a cosy relationship, but it is at least a relationship where the two sides are talking to each other, and that is how the effectiveness of these powers will be explored.
I urge noble Lords to take these amendments seriously and to heed their spirit, which is to be mindful of the regulatory burden that the Bill imposes; to be aware that the Bill will not, simply by being passed, solve the kinds of issues that we are seeking to tackle in terms of the most egregious content that we find on the internet; and to recognise that, effectively, Ofcom’s task once this legislation is passed will be the language of priorities.
My Lords, this is not the first time in this Committee, and I suspect it will not be the last, when I rise to stand somewhere between my noble friend Lord Vaizey and the noble Baroness, Lady Kidron. I am very taken by her focus on risk assessments and by the passionate defences of Wikipedia that we have heard, which really are grounded in a sort of commoner’s risk assessment that we can all understand.
Although I have sympathy with the concerns of the noble Baroness, Lady Fox, about small and medium-sized businesses being overburdened by regulation, I am less taken with the amendments on that subject precisely because small tech businesses become big tech businesses extremely quickly. It is worth pointing out that TikTok did not even exist when Parliament began debating this Bill. I wonder what our social media landscape would have been like if the Bill had existed in law before social media started. We as a country should want global tech companies to be born in the UK, but we want their founders—who, sadly, even today, are predominantly young white men who do not yet have children—to think carefully about the risks inherent in the services they are creating, and we know we need to do that at the beginning of those tech companies’ journeys, not once they have reached 1 million users a month.
While I have sympathy with the desire of the noble Baroness, Lady Fox, not to overburden, just as my noble friend Lord Vaizey has said, we should take our lead from the intervention of the noble Baroness, Lady Kidron: we need a risk assessment even for small and medium-sized businesses. It just needs to be a risk assessment that is fit for their size.
My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. If one is permitted to say this in the digital age, I am on exactly the same page as she is.
There are two elements to the debate on this group. It is partly about compliance, and I absolutely understand the point about the costs of that, but I also take comfort from some of the things that the noble Lord, Lord Vaizey, said about the way that Ofcom is going to deliver the regulation, and from the very fact that, when it comes down to it, this is largely going to be a question not of interpretation of the Act but of working with the codes of practice. That will be a lot more user-friendly than simply having to go to expensive expert lawyers, as the noble Baroness, Lady Fox, said—not that I have anything against expensive expert lawyers.
I am absolutely in agreement with the noble Baroness, Lady Kidron, that small is not safe. As the noble Baroness, Lady Harding, described, small can become big. We looked at this in our Joint Committee and recommended to the Government that they should take a more nuanced approach to regulation, based not just on size and high-level functionality but on factors such as risk, reach, user base, safety performance and business model. All those are extremely relevant but risk is the key, right at the beginning. The noble Baroness, Lady Fox, also said that Reddit should potentially be outside, but Reddit has had its own problems, as we know. On that front, I am on absolutely the same page as those who have spoken about keeping us where we are.
The noble Lord, Lord Moylan, has been very cunning in the way that he has drawn up his Amendment 9. I am delighted to be on the same page as my noble friend—we are making progress—but I agree only with the first half of the amendment because, like the noble Baroness, Lady Kidron, I am a financial contributor to Wikipedia. A lot of us depend on Wikipedia; we look up the ages of various Members of this House when we see them in full flight and think, “Good heavens!” Biographies are an important part of this area. We have all had Jimmy Wales saying, as soon as we get on to Wikipedia, “You’ve already looked at Wikipedia 50 times this month. Make a contribution”, and that is irresistible. There is quite a strong case there. It is risk-based, so it is not inconsistent with the line taken by a number of noble Lords in all this. I very much hope that we can get something out of the Minister—maybe some sort of sympathetic noises for a change—at this stage so that we can work up something.
I must admit that the briefing from Wikimedia, which many of us have had, was quite alarming. If the Bill means that we do not have users in high-risk places, then we will find that adults get their information from other sources that are not as accurate as Wikipedia—maybe from ChatGPT or GPT-4, which the noble Lord, Lord Knight, is clearly very much an expert in—and that marginalised websites are shut down.
For me, one of the features of the schedule’s list of exempted sites is foreign state entities. Therefore, we could end up in the absurd situation where you could not read about the Ukraine war on Wikipedia, but you would be able to read about the Ukraine war on the Russian Government website.
My Lords, if we needed an example of something that gave us cause for concern, that would be it; but a very good case has been made, certainly for the first half of the amendment in the name of the noble Lord, Lord Moylan, and we on these Benches support it.
My Lords, it has certainly been an interesting debate, and I am grateful to noble Lords on all sides of the Committee for their contributions and considerations. I particularly thank the noble Lords who tabled the amendments which have shaped the debate today.
In general, on these Benches, we believe that the Bill offers a proportionate approach to tackling online harms. We feel that granting some of the exemptions proposed in this group would be unintentionally counterproductive and would raise some unforeseen difficulties. The key here—and it has been raised by a number of noble Lords, including the noble Baronesses, Lady Harding and Lady Kidron, and, just now, the noble Lord, Lord Clement-Jones, who talked about the wider considerations of the Joint Committee and factors that should be taken into account—is that we endorse a risk-based approach. In this debate, it is very important that we take ourselves back to that, because that is the key.
My view is that using other factors, such as funding sources or volunteer engagement in moderation, cuts right across this risk-based approach. To refer to Amendment 4, it is absolutely the case that platforms with fewer than 1 million UK monthly users have scope to create considerable harm. Indeed, noble Lords will have seen that later amendments call for certain small platforms to be categorised on the basis of the risk—and that is the important word—that they engender, rather than the size of the platform, which, unfortunately, is something of a crude measure. The point that I want to make to the noble Baroness, Lady Fox, is that it is not about the size of the businesses and how they are categorised but what they actually do. The noble Baroness, Lady Kidron, rightly said that small is not safe, for all the reasons that were explained, including by the noble Baroness, Lady Harding.
Amendment 9 would exempt small and medium-sized enterprises and certain other organisations from most of the Bill’s provisions. I am in no doubt about the well-meaning nature of this amendment, tabled by the noble Lord, Lord Moylan, and supported by the noble Lord, Lord Vaizey. Indeed, there may well be an issue about how start-ups and entrepreneur unicorns cope with the regulatory framework. We should attend to that, and I am sure that the Minister will have something to say about it. But I also expect that the Minister will outline why this would actually be unhelpful in combating many of the issues that this Bill is fundamentally designed to deal with if we were to go down the road of these exclusions.
In particular, granting exemptions simply on the basis of a service’s size could lead to a situation where user numbers are capped or perhaps even where platforms are deliberately broken up to avoid regulation. This would have an effect that none of us in this Chamber would want to see because it would embed harmful content and behaviour rather than helping to reduce them.
Referring back to the comments of the noble Lord, Lord Moylan, I agree with the noble Lord, Lord Vaizey, in his reflection. I, too, have not experienced the two sides of the Chamber that the noble Lord, Lord Moylan, described. I feel that the Chamber has always been united on the matter of child safety and in understanding the ramifications for business. It is the case that good legislation must always seek a balance, but, to go back to the point about excluding small and medium-sized enterprises, to call them a major part of the British economy is a bit of an understatement when they account for 99.9% of the business population. In respect of the exclusion of community-based services, including Wikipedia—and we will return to this in the next group—there is nothing for platforms to fear if they have appropriate systems in place. Indeed, there are many gains to be had for community-based services such as Wikipedia from being inside the system. I look forward to the further debate that we will have on that.
I turn to Amendment 9A in the name of my noble friend Lord Knight of Weymouth, who is unable to participate in this section of the debate. It probes how the Bill’s measures would apply to specialised search services. Metasearch engines such as Skyscanner have expressed concern that the legislation might impose unnecessary burdens on services that pose little risk of hosting the illegal content targeted by the Bill. Perhaps the Minister, in his response, could confirm whether or not such search engines are in scope. That would perhaps be helpful to our deliberations today.
While we on these Benches are not generally supportive of exemptions, the reality is that there are a number of online search services that return content that would not ordinarily be considered harmful. Sites such as Skyscanner and Expedia, as we all know, allow people to search for and book flights and other travel services such as car hire. Obviously, as long as appropriate due diligence is carried out on partners and travel agents, the scope for users to encounter illegal or harmful material appears to be minimal and returns us to the point of having a risk-based approach. We are not necessarily advocating for a carve-out from the Bill, but it would perhaps be helpful to our deliberations if the Minister could outline how such platforms will be expected to interact with the Ofcom-run online safety regime.
My Lords, I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, but I cannot accept the amendments tabled by the noble Baroness, Lady Fox, and others. Doing so would greatly reduce the strong protections that the Bill offers to internet users, particularly to children. I agree with the noble Baroness, Lady Merron, that that has long been the shared focus across your Lordships’ House as we seek to strike the right balance through the Bill. I hope to reassure noble Lords about the justification for the existing balance and scope, and the safeguards built in to prevent undue burdens on business.
I will start with the amendments tabled by the noble Baroness, Lady Fox of Buckley—Amendments 4, 6 to 8, 12, 288 and 305—which would significantly narrow the definition of services in scope of regulation. The current scope of the Bill reflects evidence of where harm is manifested online. There is clear evidence that smaller services can pose a significant risk of harm from illegal content, as well as to children, as the noble Baroness, Lady Kidron, rightly echoed. Moreover, harmful content and activity often range across a number of services. While illegal content or activity may originate on larger platforms, offenders often seek to move to smaller platforms with less effective systems for tackling criminal activity in order to circumvent those protections. Exempting smaller services from regulation would likely accelerate that process, resulting in illegal content being displaced on to smaller services, putting users at risk.
These amendments would create significant new loopholes in regulation. Rather than relying on platforms and search services to identify and manage risk proactively, they would require Ofcom to monitor smaller harmful services, which would further annoy my noble friend Lord Moylan. Let me reassure the noble Baroness, however, that the Bill has been designed to avoid disproportionate or unnecessary burdens on smaller services. All duties on services are proportionate to the risk of harm and the capacity of companies. This means that small, low-risk services will have minimal duties imposed on them. Ofcom’s guidance and codes of practice will set out how they can comply with their duties, in a way that I hope is even clearer than the Explanatory Notes to the Bill, but certainly allowing for companies to have a conversation and ask for areas of clarification, if that is still needed. They will ensure that low-risk services do not have to undertake unnecessary measures if they do not pose a risk of harm to their users.
In addition, the Bill includes explicit exemptions for many small and medium-sized enterprises, through the low-risk functionality exemptions in Schedule 1. This includes an exemption for any service that offers users the ability only to post comments or reviews on digital content published by it, which will exempt many online retailers, news sites and web logs. The Bill also provides the Secretary of State with a power to exempt further types of user-to-user or search services from the Bill if the risk of harm presented by a particular service is low, ensuring that other low-risk services are not subject to unnecessary regulation. There was quite a lot of talk about Wikipedia—
My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.
We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?
The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.
I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.
Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be
“working to benefit the public”.
I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.
Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.
Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to
“search for … products or services … in a particular sector”.
It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.
The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.
The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.
I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.
My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.
The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to have a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.
The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth are positive things—I do not want these businesses to fail. However, that growth will never happen: small tech businesses will never grow into big tech businesses if they face a disproportionate regulatory burden, as I have tried to describe. That is what I am worried about, and it is not something to be celebrated.
I stress that it is not only small tech and big tech. There are also community sites, based on collective moderation. Wikipedia has had a lot of discussion here. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites are telling us that they will not be able to carry on. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. There are all of these things, but then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to decide which sites to trust in quite that way, or to police people’s tastes.
I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.
The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to put in the amendment an ability for Ofcom, if there are problematic sites that are risky, to deal with them. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses are exempted, which is the majority of them. That is what I am suggesting, rather than that we assume they are all guilty and then they have to get exempted.
Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always bear in mind—this was the important point that the noble Lord, Lord Moylan, made—that in our absolute determination to protect children via this Bill we do not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.
I am sure that my amendments could be majorly improved. The approach of the noble Lord, Lord Moylan, might be better. I am happy to look at the metric and whether or not it is 1 million monthly users. However, I am insistent that the bipartisan approach to risk from the Minister and the Opposition will not help us achieve what we want from this Bill and will cause unnecessary problems. We have to avoid a recipe for risk aversion that will hold back the progressive and wonderful aspects of the online world, or at least the educational and in some instances business aspects.
I am obviously not going to push the amendments now, but I will come back to this. If it is not me, I hope somebody does, because the fact that some people said that half the points the noble Lord, Lord Moylan, made were correct was a step forward. I have no interest in noble Lords supporting my amendments, as long as we take seriously the content of my concerns and those expressed by the noble Lords, Lord Vaizey and Lord Moylan, particularly. I beg leave to withdraw my amendment.
Amendment 4 withdrawn.
Amendments 5 to 8 not moved.
Clause 3 agreed.
Schedule 1: Exempt user-to-user and search services
Amendments 9 and 9A not moved.
Schedule 1 agreed.
Schedule 2 agreed.
Clause 4: Disapplication of Act to certain parts of services
Amendment 10
Moved by
10: Clause 4, page 4, line 8, at end insert—
“(2A) This Act does not apply in relation to moderation actions taken, or not taken, by users of a Part 3 service.”Member’s explanatory statement
The drafting of some Bill provisions, such as Clauses 17(4)(c) or 65(1), leaves room for debate as to whether community moderation gives rise to liability and obligations for the provider. This amendment, along with the other amendment to Clause 4 in the name of Lord Moylan, clarifies that moderation carried out by the public, for example on Wikipedia, is not fettered by this Bill.
My Lords, I have to start with a slightly unprofessional confession. I accepted the Bill team’s suggestion on how my amendments might be grouped after I had grouped them rather differently. The result is that I am not entirely clear why some of these groupings are quite as they are. As my noble friend the Minister said, my original idea of having Amendments 9, 10 and 11 together would perhaps have been better, as it would have allowed him to give a single response on Wikipedia. Amendments 10 and 11 in this group relate to Wikipedia and services like it.
I am, I hope, going to cause the Committee some relief as I do not intend to repeat remarks made in the previous group. The extent to which my noble friend wishes to amplify his comments in response to the previous group is entirely a matter for him, since he said he was reserving matter that he would like to bring forward but did not when commenting on the previous group. If I do not speak further on Amendments 10 and 11, it is not because I am not interested in what my noble friend the Minister might have to say on the topic of Wikipedia.
To keep this fairly brief, I turn to Amendment 26 on age verification. I think we have all agreed in the Chamber that we are united in wanting to see children kept safe. On page 10 of the Bill, in Clause 11(3), it states that there will be a duty to
“prevent children of any age from encountering”
this content—“prevent” them “encountering” is extremely strong. We do not prevent children encountering the possibility of buying cigarettes or encountering the possibility of being injured crossing the road, but we are to prevent children from these encounters. It is strongly urged in the clause—it is given as an example—that age verification will be required for that purpose.
Of course, age verification works only if it applies to everybody: one does not ask just the children to prove their age; one has to ask everybody online. Unlike when I go to the bar in a pub, my grey hair cannot be seen online. So this provision will almost certainly have to extend to the entire population. In Clause 11(3)(b), we have an obligation to protect. Clearly, the Government intend a difference between “prevent” and “protect”, or they would not have used two different verbs, so can my noble friend the Minister explain what is meant by the distinction between “prevent” and “protect”?
My amendment would remove Clause 11(3) completely. But it is, in essence, a probing amendment and what I want to hear from the Government, apart from how they interpret the difference between “prevent” and “protect”, is how they expect this duty to be carried out without having astonishingly annoying and deterring features built into every user-to-user platform and website, so that every time one goes on Wikipedia—in addition to dealing with the GDPR, accepting cookies and all the other nonsense we have to go through quite pointlessly—we then have to provide age verification of some sort.
What mechanism that might be, I do not know. I am sure that there are many mechanisms available for age verification. I do not wish to get into a technical discussion about what particular techniques might be used—I accept that there will be a range and that they will respond and adapt in the light of demand and technological advance—but I would like to know what my noble friend the Minister expects and how wide he thinks the obligation will be. Will it be on the entire population, as I suspect? Focusing on that amendment—and leaving the others to my noble friend the Minister to respond to as he sees fit—and raising those questions, I think that the Committee would like to know how the Government imagine that this provision will work. I beg to move.
My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.
There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.
Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must make it absolutely clear to those volunteers that we will not create all kinds of new legal obligations and liabilities for them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.
On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.
It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.
Far be it from me to suggest that all the amendments tabled by the noble Lord, Lord Moylan, are in the wrong place, but I think that Amendment 26 might have been better debated with the other amendments on age assurance.
On community moderation, I underscore the point that Ofcom must have a risk profile as part of its operations. When we get to that subject, let us understand what Ofcom intends to do with it—maybe we should instruct Ofcom a little about what we would like it to do with it for community moderation. I have a lot of sympathy—but do not think it is a get-out clause—with seeing some spaces as less risky, or, at least, for determining what risky looks like in online spaces, which is a different question. This issue belongs in the risk profile: it is not about taking things out; we have to build it into the Bill we have.
On age assurance and AV, I do not think that today is the day to discuss it in full. I disagree with the point that, because we are checking kids, we have to check ourselves—that is not where the technology is. Without descending into technical arguments, as the noble Lord, Lord Moylan, asked us not to, we will bring some of those issues forward.
The noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford have a package of amendments which are very widely supported across the Committee. They have put forward a schedule of age assurance that says what the rules of the road are. We must stop pretending that age assurance is something that is being invented now in this Bill. If you log into a website with your Facebook login, it shares your age—and that is used by 42% of people online. However, if you use an Apple login, it does not share your age, so I recommend using Apple—but, interestingly, it is harder to find that option on websites, because websites want to know your age.
So, first, we must not treat age assurance as if it has just been invented. Secondly, we need to start to have rules of the road, and ask what is acceptable, what is proportionate, and when we will have zero tolerance. Watching faces around the Committee, I say that I will accept zero tolerance for pornography and some other major subjects, but, for the most part, age assurance is something that we need to have regulated. Currently, it is being done to us rather than in any way that is transparent or agreed, and that is very problematic.
My Lords, I hesitated to speak to the previous group of amendments, but I want to speak in support of the issue of risk that my noble friend Lady Kidron raised again in this group of amendments. I do not believe that noble Lords in the Committee want to cut down the amount of information and the ability to obtain information online. Rather, we came to the Bill wanting to avoid some of the really terrible harms promoted by some websites which hook into people’s vulnerability to becoming addicted to extremely harmful behaviours, which are harmful not only to themselves but to other people and, in particular, to children, who have no voice at all. I also have a concern about vulnerable people over the age of 18, and that may be something we will come to later in our discussions on the Bill.
It seems really important that we stick with the principle that if it is profoundly illegal in the offline world then we cannot allow it to be perpetrated in the online world. That compass needle has been behind some of the thinking of a lot of us in trying to grapple with this issue, which is very complex for those of us who are outside the world of tech and internet and coming new to it, but who have seen the results of some of those harms perpetrated. That is where the problem arises.
My Lords, I violently agree with my noble friend Lord Moylan that the grouping of this amendment is unfortunate. For that reason, I am not going to plunge into the issue in huge detail, but there are a couple of things on which I would like to reassure my noble friend, and I have a question for the Minister.
The noble Baroness, Lady Kidron, said there is a package of amendments around age verification and that we will have a lot of time to dive into this, and I think that is probably the right format for doing it. However, I reassure my noble friend Lord Moylan that he is absolutely right. The idea is not in any way to shut off the town square from everyone simply because there might be something scary there.
Clause 11(3) refers to priority content, which the noble Lord will know is to do with child abuse and fraudulent and severely violent content. This is not just any old stuff; this is hardcore porn and the rest. As in the real world, that content should be behind an age-verification barrier. At the moment we have a situation on the internet where, because it has not been well-managed for a generation, this content has found itself everywhere: on Twitter and Reddit, and all sorts of places where really it should not be because there are children there. We envisage a degree of tidying up of social media and the internet to make sure that the dangerous content is put behind age verification. What we are not seeking to do, and what would not be a benign or positive action, is to put the entire internet behind some kind of age-verification boundary. From that point of view, I completely agree with my noble friend.
My Lords, as might be expected, I will speak against Amendment 26 and will explain why.
The children’s charity Barnardo’s—here I declare an interest as vice-president—has said, as has been said several times before, that children are coming across pornographic content from as young as seven. Often they stumble across the content accidentally, unwittingly searching for terms such as “sex” or “porn”, without knowing what they mean. The impact that this is having on children is huge. It is harming their mental health and distorting their perception of healthy sexual relationships and consent. That will go with them into adulthood.
Age verification for pornography and age assurance to protect children from other harms are crucial to protect children from this content. In the offline world, children are rightly not allowed to buy pornographic DVDs in sex shops but online they can access this content at the click of a button. This is why I will be supporting the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, and am fully supportive of their age assurance and age verification schedule.
My Lords, to go back not just to the age question, the noble Lord, Lord Allan of Hallam, reminded us that community-led moderation is not just Wikipedia. What I tried to hint at earlier is that that is one of the most interesting, democratic aspects of the online world, which we should protect.
We often boast that we are a self-regulating House and that that makes us somehow superior to those up the road—we are all so mature because we self-regulate; people do behave badly, but we decide. It is a lesson in democracy that you have a self-regulating House, and there are parts of the online world that self-regulate. Unless we think that the citizens of the UK are less civilised than Members of the House of Lords, which I would refute, we should say that it is positive that there are self-moderating, self-regulating online sites. If you can say something, and people can object and have a discussion about it, and things can be taken down, to me that is the way we should deal with speech that is inappropriate or wrong. The bulk of these amendments—I cannot remember how many there are now—are right.
I was glad that the noble Lord, Lord Moylan, said he could not understand why this grouping had happened, which is what I said earlier. I had gone through a number of groupings thinking: “What is that doing there? Am I missing something? Why is that in that place?” I think we will come back to the age verification debate and discussion.
One thing to note is that one of the reasons organisations such as Wikipedia would be concerned about age verification—and they are—is anonymity. It is something we have to consider. What is going to happen to anonymity? It is so important for journalists, civil liberty activists and whistleblowers. Many Wikipedia editors are anonymised, maybe because they are politically editing sites on controversial issues. Imagine being a Wikipedia editor from Russia at the moment—you would not want to have to say who you are. We will come back to it but it is important to understand that Amendment 26, and those who are saying that we should look at the question of age verification, are not doing so because they do not care about children and are not interested in protecting them. However, the dilemmas of any age-gating or age verification for adult civil liberties have to be considered. We have to worry that, because of an emphasis on checking age, some websites will decide to sanitise what they allow to be published to make it suitable for children, just in case they come across it. Again, that will have a detrimental impact on adult access to all knowledge.
These will be controversial issues, and we will come back to them, but it is good to have started the discussion.
My Lords, this has been a very strange debate. It has been the tail end of the last session and a trailer for a much bigger debate coming down the track. It was very odd.
We do not want to see everything behind an age-gating barrier, so I agree with my noble friend. However, as the noble Baroness, Lady Kidron, reminded us, it is all about the risk profile, and that then leads to the kind of risk assessment that a platform is going to be required to carry out. There is a logic to the way that the Bill is going to operate.
When you look at Clause 11(3), you see that it is not disproportionate. It deals with “primary priority content”. This is not specified in the Bill but it is self-harm and pornography—major content that needs age-gating. Of course we need to have the principles for age assurance inserted into the Bill as well, and of course it will be subject to debate as we go forward.
There is technology to carry out age verification which is far more sophisticated than it ever was, so I very much look forward to that debate. We started that process in Part 3 of the Digital Economy Act. I was described as an internet villain for believing in age verification. I have not changed my view, but the debate will be very interesting. As regards the tail-end of the previous debate, of course we are sympathetic on these Benches to the Wikipedia case. As we said on the last group, I very much hope that we will find a way, whether it is in Schedule 1 or in another way, of making sure that Wikipedia is not affected overly by this—maybe the risk profile that is drawn up by Ofcom will make sure that Wikipedia is not unduly impacted.
Like others, I had prepared quite extensive notes to respond to what I thought the noble Lord was going to say about his amendments in this group, and I have not been able to find anything left that I can use, so I am going to have to extemporise slightly. I think it is very helpful to have a little non-focused discussion about what we are about to talk about in terms of age, because there is a snare and a delusion in quite a lot of it. I was put in mind of that in the discussions on the Digital Economy Act, which of course precedes the Minister but is certainly still alive in our thinking: in fact, we were talking about it earlier today.
The problem I see is that we have to find a way of squaring two quite different approaches. One is to prevent those who should not see certain material, because it is illegal for them to see it, from doing so. The other is to find a way of ensuring that we do not end up with an age-gated internet, which I am grateful to find that we are all, I think, agreed about: that is very good to know.
Age is very tricky, as we have heard, and it is not the only consideration we have to bear in mind in wondering whether people should be able to gain access to areas of the internet which we know will be bad and difficult for them. That leads us, of course, to the question about legal but harmful, now resolved—or is it? We are going to have this debate about age assurance and what it is. What is age verification? How do they differ? How does it matter? Is 18 a fixed and final point at which we are going to say that childhood ends and adulthood begins, and therefore one is open for everything? It is exactly the point made earlier about how to care for those who should not be exposed to material which, although legal for them by a number called age, is not appropriate for them in any of the circumstances which, clinically, we might want to bring to bear.
I do not think we are going to resolve these issues today—I hope not. We are going to talk about them for ever, but at this stage I think we still need a bit of thinking outside a box which says that age is the answer to a lot of the problems we have. I do not think it is, but whether the Bill is going to carry that forward I have my doubts. How we get that to the next stage, I do not know, but I am looking forward to hearing the Minister’s comments on it.
My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.
The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services—would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective. The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance, and it is platforms, not community moderators, who will face enforcement action if they fail to do so.
The amendments in the name of my noble friend Lord Moylan appear to intend to take services which rely only on user moderation entirely out of the scope of the Bill, so that those services are not subject to the new regulatory framework in any way. That would create a gap in the protections created by the Bill and would create incentives for services to adopt nominal forms of user moderation to avoid being subject to the illegal content and child safety duties. This would significantly undermine the efficacy of the Bill and is therefore not something we could include in it. His Amendment 26 would remove the duties on providers in Clause 11(3) to prevent children encountering primary priority content, and to protect children in age groups at risk of harm from other content that is harmful to children. This is a key duty which must be retained.
Contrary to what some have said, there is currently no requirement in the Bill for users to verify their age before accessing search engines and user-to-user services. We expect that only services which pose the highest risk to children will use age-verification technologies, but this is indeed a debate to which we will return in earnest and in detail on later groups of amendments. Amendment 26 would remove a key child safety duty, significantly weakening the Bill’s protections for children. The Bill takes a proportionate approach to regulation, which recognises the diverse range of services that are in scope of it. My noble friend’s amendments run counter to that and would undermine the protections in the Bill. I hope he will feel able not to press them and allow us to return to the debates on age verification in full on another group.
My Lords, I am grateful to all noble Lords who have contributed to this slightly disjointed debate. I fully accept that there will be further opportunities to discuss age verification and related matters, so I shall say no more about that. I am grateful, in particular, to the noble Lord, Lord Allan of Hallam, for supplying the deficiency in my opening remarks about the importance of Amendments 10 and 11, and for explaining just how important they are. I also thank the noble Lord, Lord Stevenson. It was good of him to say, in the open approach he took to the question of age, that there are issues still to be addressed. I do not think anybody feels that we have yet got this right, and I think we are going to have to be very open in that discussion when we get to it. That is also true of what the noble Lord, Lord Allan of Hallam, said: we have not yet got clarity as to where the age boundary is—I like his expression—for the public space. Where is the point at which, if checks are needed, those checks are to be applied? These are all matters to discuss, and I hope noble Lords will forgive me if I do not address each individual contribution separately.
I would like to say something, I hope not unfairly or out of scope, about what was said by the noble Baronesses, Lady Finlay of Llandaff and Lady Kidron, when they used, for the first time this afternoon, the phrase “zero tolerance”, and, at the same time, talked about a risk-based approach. I have, from my own local government experience, a lot of experience of risk-based approaches taken in relation to things—very different, of course, from the internet—such as food safety, where local authorities grade restaurants and food shops and take enforcement action and supervisory action according to their assessment of the risk that those premises present. That is partly to do with their assessment of the management and partly to do with their experience of things that have gone wrong in the past. If you have been found with mouse droppings and you have had to clean up the shop, then you will be examined a great deal more frequently until the enforcement officers are happy; whereas if you are always very well run, you will get an inspection visit maybe only once a year. That is what a risk-based assessment consists of. The important thing to say is that it does not produce zero tolerance or zero outcomes.
I just want to make the point that I was talking about zero tolerance at the end of a ladder of tolerance, just to be clear. Letting a seven-year-old child into an 18-plus dating app or pornographic website is where the zero tolerance is—everything else is a ladder up to that.
I beg the noble Baroness’s pardon; I took that for granted. There are certain things—access to pornography, material encouraging self-harm and things of that sort—where one has to have zero tolerance, but not everything. I am sorry I took that for granted, so I fully accept that I should have made that more explicit in my remarks. Not everything is to be zero-toleranced, so to speak, but certain things are. However, that does not mean that they will not happen. One has to accept that there will be leakage around all this, just as some of the best-run restaurants that have been managed superbly for years will turn out, on occasion, to be the source of food poisoning. One has to accept that this is never going to be as tight as some of the advocates wanted, but with that, I hope I will be given leave to withdraw—
May I intervene, because I have also been named in the noble Lord’s response? My concern is about the most extreme, most violent, most harmful and destructive things. There are some terrible things posted online. You would not run an open meeting on how to mutilate a child, or how to stab somebody most effectively to do the most harm. It is at this extreme end that I cannot see anyone in society in the offline world promoting classes for any of these terrible activities. Therefore, there is a sense that exposure to these things is of no benefit but promotes intense harm. People who are particularly vulnerable at a formative age in their development should not be exposed to them, because they would not be exposed to them elsewhere. I am speaking personally, not for anybody else, but I stress that this is the level at which the tolerance should be set to zero because we set it to zero in the rest of our lives.
Everything the noble Baroness has said is absolutely right, and I completely agree with her. The point I simply want to make is that no form of risk-based assessment will achieve a zero-tolerance outcome, but—
I am so sorry, but may I offer just one final thought from the health sector? While the noble Lord is right that where there are human beings there will be error, there is a concept in health of the “never event”—that when that error occurs, we should not tolerate it, and we should expect the people involved in creating that error to do a deep inspection and review to understand how it occurred, because it is considered intolerable. I think the same exists in the digital world in a risk assessment framework, and it would be a mistake to ignore it.
My Lords, I am now going to attempt for the third time to beg the House’s leave to withdraw my amendment. I hope for the sake of us all, our dinner and the dinner break business, for which I see people assembling, that I will be granted that leave.
Amendment 10 withdrawn.
Amendment 11 not moved.
Clause 4 agreed.
Amendment 12 not moved.
Clause 5 agreed.
Clause 6: Providers of user-to-user services: duties of care
Amendment 12A
Moved by
12A: Clause 6, page 5, line 11, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 9 below (because the new duty to summarise illegal content risk assessments in the terms of service is only imposed on providers of Category 1 services).
My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.
Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.
Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.
It is also important that Ofcom is fully apprised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.
These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.
My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.
The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.
I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited only to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.
My second question is to do with Amendments 64A and 88A. It seems to me—forgive me if I am wrong—that the Bill previously stipulated that all regulated search and user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but now it seems to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.
My Lords, the noble Baroness, Lady Kidron, put her finger exactly on the two questions that I wanted to ask: namely, why only category 1 and category 2A, and is there some rowing back involved here? Of course, none of this prejudices the fact that, when we come later in Committee to talk about widening the ambit of risk assessments to material other than that which is specified in the Bill, this kind of transparency would be extremely useful. But the rationale for why it is only category 1 and category 2A in particular would be very useful to hear.
My Lords, I am grateful to the Minister for introducing this group, and we certainly welcome this tranche of government amendments. We know that there are more to come both in Committee and as we proceed to Report, and we look forward to seeing them.
The amendments in this group, as other noble Lords have said, amount to a very sensible series of changes to services’ risk-assessment duties. This perhaps raises the question of why they were not included in earlier drafts of the Bill, but we are glad to see them now.
There is, of course, the issue of precisely where some of the information will appear, as well as the wider status of terms of service. I am sure those issues will be discussed in later debates. It is certainly welcome that the department is introducing stronger requirements around the information that must be made available to users; it will all help to make this a stronger and more practical Bill.
We all know that users need to be able to make informed decisions, and that will not be possible if they are required to view multiple statements and various documents. It seems that the requirements for information to be provided to Ofcom go to the very heart of the Bill, and I suggest that the proposed system will work best if there is trust and transparency between the regulator and those who are regulated. I am sure that there will be further debate on the scope of risk assessments, particularly on issues that were dropped from previous iterations of the Bill, and certainly this is a reasonable starting point today.
I will try to be as swift as possible as I raise a few key issues. One is about avoiding warnings that are at such a high level of generality that they get put on to everything. Perhaps the Minister could indicate how Ofcom will ensure that the summaries are useful and accessible to the reader. The test should be that a summary is suitable and sufficient for a prospective user to form an assessment of the likely risk they would encounter when using the service, taking into account any special vulnerabilities they might have; perhaps the Minister could confirm that.
Is the terms of service section the correct place to put a summary of the illegal content risk assessment? Research suggests, unsurprisingly, that only 3% of people read terms before signing up—although I recall that, in an earlier debate, the Minister confessed that he had read all the terms and conditions of his mobile phone contract, so he may be one of the 3%. Without doubt, any individual should be supported in their ability to make choices, and the duty should perhaps instead be to display a summary of the risks with due prominence, to ensure that anyone considering signing up to a service is genuinely able to read it.
I also ask the Minister to confirm that, despite the changes to Clause 19 in Amendment 16B, the duty to keep records of risk assessments will continue to apply to all companies, but with an enhanced responsibility for category 1 companies.
I am grateful to noble Lords for their questions on this, and particularly grateful to the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, for their chorus of welcome. Where we are able to make changes, we will of course bring them forward, and I am glad to be able to bring forward this tranche now.
As the noble Lord, Lord Allan, said, ensuring the transparency of services’ risk assessments will further ensure that the framework of the Bill delivers its core objectives relating to effective risk management and increased accountability regarding regulated services. As we have discussed, it is imperative that these providers take a thorough approach to identifying risks, including emerging risks. The Government believe that it is of the utmost importance that the public are able effectively to scrutinise the risk assessments of the largest in-scope services, so that users can be empowered to make informed decisions about whether and how to use their services.
On the questions from the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, about why it is just category 1 and category 2A services, we estimate that there will be around 25,000 UK service providers in scope of the Bill’s illegal and child safety duties. Requiring all these companies to publish full risk assessments and proactively to send them to Ofcom could undermine the Bill’s risk-based and proportionate approach, as we have discussed in previous groups on the burdens to business. A large number of these companies are likely to be low risk and it is unlikely that many people will seek out their risk assessments, so requiring all companies to publish them would be an excessive regulatory burden.
There would also be an expectation that Ofcom would proactively monitor a whole range of services, even ones that posed a minimal risk to users. That in turn could distract Ofcom from taking a risk-based approach in its regulation by overwhelming it with paperwork from thousands of low-risk services. If Ofcom wants to see records of the risk assessments of providers that are not category 1 or category 2A services, it has extensive information-gathering powers that it can use to require a provider to send it such records.
The noble Baroness, Lady Merron, was right to say that I read the terms of my broadband supply—I plead guilty to the nerdiness of doing that—but I have not read all the terms and conditions of every application and social medium I have downloaded, and I agree that many people do skim through them. They say the most commonly told lie on the planet at the moment is “I agree to the terms and conditions”, and the noble Baroness is right to point to the need for these to be intelligible, easily accessible and transparent—which of course we want to see.
In answer to her other question, the record-keeping duty will apply to all companies, but the requirement to publish is only for category 1 and category 2A companies.
The noble Baroness, Lady Kidron, asked me about Amendment 27A. If she will permit me, I will write to her with the best and fullest answer to that question.
I am grateful to noble Lords for their questions on this group of amendments.
Amendment 12A agreed.
Amendment 12B
Moved by
12B: Clause 6, page 5, line 16, at end insert “(2) to (6)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 19 below (because the new duty to supply records of risk assessments to OFCOM is only imposed on providers of Category 1 services).
Amendment 12B agreed.
House resumed.