Committee (2nd Day) (Continued)
12BA: Clause 6, page 5, line 16, at end insert—
“(g) the duties on regulated provider pornographic content set out in section 72.”
Member’s explanatory statement
This amendment requires user-to-user services to comply with duties under Part 5.
My Lords, I wish to speak to the amendments in this group, which are in my name and are also supported by the noble Lord, Lord Morrow. There are three interconnected issues raised by these amendments. First, there should be a level playing field across the Bill for regulating pornographic content. The same duties should apply to all pornographic content, whether it is found on social media or a user-to-user pornography site, which fall under Part 3, or on a commercial pornography site with provider content, which falls within Part 5 of the Bill. Secondly, these common duties in respect of pornography must come into effect at the same time. By requiring that the same duties under Clause 72 apply to both Part 3 and Part 5 services, my amendments ensure that they will be regulated for pornographic content at the same time, bringing uniformity across the Bill.
Thirdly, through Amendment 125A, I wish to probe how Part 5 will function more specifically. Will any website or platform actually be covered by Part 5 if these amendments are not made? I had the privilege of speaking earlier to the Minister on these issues, and one question I would pose at this stage is, how many sites are covered by Part 5? That is one of the questions to which your Lordships’ House requires an answer.
The issue of ensuring that pornography is named as a harm on the face of the Bill, and that all pornographic content is age-verified, is not new. Indeed, it has been raised from the outset of the Bill, including at Second Reading in your Lordships’ House. Even in pre-legislative scrutiny, the Joint Committee on the draft Bill recommended that
“key, known risks of harm to children are set out on the face of the Bill. We would expect these to include (but not be limited to) access to or promotion of age-inappropriate material such as pornography”.
To partly address this issue, the Government added Part 5 to the Bill, which sought to ensure that any platform that was not in scope of Part 3 but which included pornographic content should be subject to age-verification measures. I know that other noble Lords have put forward amendments that would add to the list of online harms on the face of the Bill, which we will be debating later in group 10.
Other amendments add to the duties that platforms hosting pornographic content need to comply with. These include Amendment 184, in the name of the noble Baroness, Lady Kidron, which proposes that consent be obtained from performers, and Amendment 185, in the name of the noble Baroness, Lady Benjamin, which seeks to ensure parity between what is permissible offline and online. The amendments I propose in this group are, I believe, complementary to those amendments. My amendments seek to ensure that duties across Part 3 and Part 5 in respect of pornography are aligned. Therefore, those additional duties contained in other amendments would be aligned across the Bill as well. When we get to that stage in Committee, I will be supporting those amendments.
The harms of pornography are well known and I do not propose to go over those again. I do, however, want to highlight one issue raised in a recent report published by the Children’s Commissioner for England. Her report states:
“Pornography is not confined to dedicated adult sites. We found that Twitter was the online platform where young people were most likely to have seen pornography. Fellow mainstream social networking platforms Instagram and Snapchat rank closely after dedicated pornography sites.”
The report found that 41% of children encountered pornography on Twitter, 33% on Instagram and 32% on Snapchat, while only 37% of children encountered pornography on main commercial websites. This means that children are more likely to encounter pornographic content on social media. That is why we need to ensure that standards across all platforms are uniform. The same standards need to apply to social media as to commercial pornography websites.
While I appreciate that the Government state it is their intention that Part 3 services will have to implement age verification, and that all platforms will have similar duties to ensure that children are protected from accessing pornographic content, it would clearly be better to remove all doubt and have age verification for the protection of children in the Bill. This would ensure a level playing field for all pornographic content, which brings me to my second point.
Not only does there need to be a level playing field but there needs to be a concurrent timing of these requirements coming into effect. These amendments would ensure that age verification will apply to all platforms across the Bill at the same time. I am sure your Lordships will agree that this is what the public will expect. I am not sure that parents, or indeed children and young people, would understand why one website has age verification while other content does not.
As the Bill is drafted, pornography would need to be named as primary priority content by the Secretary of State, alongside other online harms. I hope that the Minister, in his reply, will address that issue. Codes of practice for pornography and other harms will need to be drafted and implemented before Part 3 can come into effect. We know the harm that pornography causes: it is the only harm that is already named in the Bill. It has been given its own part within the Bill, and therefore we do not need any secondary legislation to name pornography as a harm and set down duties for pornographic websites and social media to protect children. By simply making user-to-user services subject to the duties of Part 5, children can be protected more quickly. Part 5 will be much more straightforward to implement, and extending its duties to Part 3 services with pornographic content will ensure parity across all services within the scope of the Bill.
This brings me to my third and final point. Amendment 125A, upon which the Minister and I had a discussion earlier this afternoon, probes the devil in the detail of what is defined as user-generated content. I ask the Committee to bear with me, as I am required to get into the detail of the definitions in Clause 49, which is important. This detail matters because it determines whether provider pornographic content, as defined in Clause 70, could be considered user-generated content.
Put simply, if a site is user generated it is regulated under Part 3 and if a site produces its own content it is covered by Part 5. The Government said, in their helpful factsheet circulated before the start of Committee, that Part 3 covers services
“such as social media platforms and dedicated pornography sites that enable user interaction”.
In the case of Part 5 services, the intention is that this will cover content provided only by the service. My Amendment 125A probes what happens next and what constitutes user interaction.
If users of a platform can interact, this seems to move the service into Part 3 of the Bill, as per its definition of user-generated content. The definition in Clause 49(2)(e) includes “comments and reviews”, which itself refers to Clause 49(6). However, Clause 49(6) does not bring much clarity about what
“Comments and reviews on provider content”
consist of. On a plain reading of Clause 49, it would appear that a pornography provider which currently falls under Part 5 would become a Part 3 service under the Bill if it allowed users to comment on the content and allowed user interaction. Therefore, the important question is: what is user-to-user functionality?
The British Board of Film Classification undertook an analysis of user-to-user functionality on adult sites in October 2020. It assumed that likes did not constitute user-to-user functionality, specifically saying:
“We have not included sites which offer users the chance to ‘rate’ content—for example, with a ‘thumbs up’ or ‘thumbs down’—as we were concerned that this would be too generous an interpretation of ‘user interaction’”.
Elsewhere, it said that such ratings
“would be a questionable interpretation of ‘user interaction’”.
This seems a reasonable interpretation to me. However, Clause 49 does not seem to be clear on this. It seems to allow ratings to constitute a review, thus giving room for interpretation.
My Amendment 125A would make it clear that likes or dislikes, or the use of an emoji, would not be considered a review and would therefore not constitute user-generated content. That would keep a service which allowed these, but no textual comments, in Part 5. This may seem inconsequential but it is important, as it prevents services moving from one part of the Bill to another to take advantage of different regulatory requirements, or indeed to evade regulation altogether.
I ask the Minister to set out the Government’s intentions and how the definitions in Clause 49 might move a service with provider pornographic content from Part 5 to Part 3. Furthermore, I would be grateful if the Minister could put on record how many of the top 200 pornographic websites visited in the UK he expects to be regulated by Part 3 and Part 5 respectively, and how many people in the UK he expects to visit services under each part.
My main concern is to ensure that, as soon as is practically possible after the Bill passes, children are protected. This issue was raised at Second Reading and earlier this evening in various contributions. It would not be acceptable for services such as social media and large pornography sites that fall under Part 3 to be left for three or even four years without a duty to protect children from pornographic content. It would be worse if sites were allowed to move from one part of the Bill to another by simply utilising user interaction and thereby avoiding regulation.
These amendments, in my name and that of the noble Lord, Lord Morrow, will ensure that all pornographic content is regulated in the same way, at the same time, and that Part 5 can be brought into force more quickly to ensure all content is treated in the same way. I believe that was certainly the will of your Lordships at Second Reading. I look forward to hearing the Minister’s views on how this will be achieved. I beg to move.
My Lords, first, I tender an apology from my noble friend Lord Morrow, whose name is attached to the amendments. Unfortunately, he is unable to participate in tonight’s debate, as he had to return home at very short notice. I will speak to the amendments in this group. I thank the noble Baroness, Lady Ritchie, and my noble friend Lord Morrow for tabling the amendments, allowing for a debate on how the duties of Part 5 should apply to Part 3 services and to probe what sites Part 5 will cover once it is implemented.
The Government have devised a Bill which attempts carefully to navigate regulation of several different types of service. I am sure that it will eventually become an exemplar emulated around the world, so I understand why there may be a general resistance on the part of the Government to tamper with the Bill’s architecture. However, these amendments are designed to treat pornographic content as a clear exception wherever it is found online. This can be achieved, because we already know the harm caused by pornography and Part 5 already creates a duty to ensure that rigorous age verification is in place to stop children accessing it.
The Government recognised that the original drafting of the Bill would not address the unfinished business of Part 3 of the Digital Economy Act. In 2017, as many will recall, this House and the other place expressed the clear demand that online pornography should not be accessible to children. Part 5 of the Bill is the evolution of that 2017 debate, but, regrettably, it was bolted on belatedly after pre-legislative scrutiny. That bolt-on approach has had the unfortunate consequence of creating two separate regimes to deal with pornography. Part 5 applies only to “provider pornographic content”, which is content
“published or displayed on the service by the provider … or by a person acting on behalf of the provider”.
Clause 70 makes it clear:
“Pornographic content that is user-generated content … is not to be regarded as provider pornographic content”;
in other words, if pornography is on social media or the large tube sites, it falls under Part 3, not Part 5. That means that not all content will be regulated in the same way or at the same time.
Amendment 125A addresses an issue raised by this two-tier approach to regulation. Clause 49 defines “user-generated content” as content
“generated directly on the service by a user of the service, or … uploaded to or shared on the service by a user of the service, and … that may be encountered by another user”.
Encounter is defined broadly, meaning to
“read, view, hear or otherwise experience content”,
including adding “comments and reviews”. The inclusion of reviews makes this a broad definition. Does it include a like, an upvote or an emoji? That is the important question that Amendment 125A probes. On this basis, it seems that almost all the most popular pornographic websites are user-to-user services and will therefore fall into Part 3.
I echo the question asked by the noble Baroness, Lady Ritchie: can the Minister identify what sites will be regulated by Part 5, how much United Kingdom traffic is directed to those sites and how will any site covered by Part 5 be prevented from adding functionality to allow encounters on its platform to move that site from Part 5 to Part 3 to delay implementation?
These are important questions. Ofcom could accelerate implementation of Part 5 separately, and indeed it would be disappointing if Ofcom delayed Part 5 implementation to avoid these very questions. These amendments are needed to allow Part 5 to be implemented quickly and for pornography to be regulated across the Bill swiftly. Put simply, Part 5 does not rely on the vast amount of secondary legislation which we must consider before Part 3 can be brought into operation. But to implement Part 5 without these amendments would be to abandon the sensible goal of creating a level playing field for any site which publishes pornographic content, and it would affect only a minority of the smaller sites, or indeed no websites at all.
The amendments before your Lordships today apply a far simpler logic. They place within scope of Part 5 any pornographic content wherever it is found and place a duty on Part 3 services to comply with the duties in Clause 72. I emphasise that this is not to apply age gates to an entire service, only the adult content. So, for example, on Twitter, research has found that 13% of tweets lead to pornographic images and videos. That platform would need users to prove they were 18 or over before they could see those particular tweets but not for the rest of the Twitter platform. Part 5 can very simply deal with all pornography online, so it can be introduced on a stand-alone basis allowing us to do so within, say, six months of Royal Assent. I understand that amendments seeking to achieve this are tabled for later in Committee.
We should keep in mind that in December 2018 Ministers announced that age verification would be required from the following Easter. We know the major porn sites had contracts negotiated with age-verification providers as they had accepted the inevitability of the policy and were prepared to comply. We saw in France, just over a year ago, that these sites were able to implement age checks with just 10 days’ notice.
By addressing all pornographic content under one part of the Bill, we would also remove the ambiguity Part 3 creates. User-to-user platforms are required only to act proportionately. For example, the social media site I referred to earlier may determine that only 13% of its content is pornographic. One may think that is more than enough to merit age verification. Ofcom may well agree. But this is a judgment, susceptible to judicial review. For the price of a small legal team, a site could delay enforcement for years as it argued through hearings and appeals that the demands on it were disproportionate. Part 3 suggests the use of age assurance and offers age verification as an example, not a requirement, but Part 5 leaves no room for doubt.
The unsuitability of pornography for children is not something we expect to change. We do not need to future-proof the Bill against the possibility that one day a Minister decides it is no longer something we wish to protect children from seeing. Indeed, if they do, I much prefer it if they have to return to Parliament to amend primary legislation before the law is relaxed.
I hope the Minister will see the logic of a level playing field to deliver a policy with widespread support across all ages and political parties. Indeed, without addressing pornography separately—and, in turn, quickly—we will pass a Bill with no discernible impact before the next general election. While they are walking to the polling station, parents will still fear what their children are looking at online. This is a quick win and a popular move, and I hope the Government will amend the Bill accordingly so that this House does not need to do so when we revisit this important issue on Report.
My Lords, it is a tremendous honour to follow the noble Lord, Lord Browne, who put the case extremely well; I agree with every word he just said. I thank the noble Baroness, Lady Ritchie, for bringing forward this issue, which she has done extremely well. I thank Christian Action Research and Education, which has been fundamental in thinking through some of these issues and has written an extremely good brief on the subject. There is indeed an urgent need for consistent regulation of pornographic content wherever it occurs online, whether it is in Part 3, Part 4, Part 5 or wherever. That is why, with the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, I have tabled amendments to address age verification on pornography and harms in the round.
Our amendments, which we will get to on Thursday and on later days in Committee, are different from those raised by the noble Baroness, Lady Ritchie, and others, but it is worth noting that many of the principles are the same. In particular, all pornographic content should be subject to the same duties, in the interests of consistency and transparency, wherever it is. Porn is porn, regardless of where it occurs online, and it carries the same risk of harm, particularly to children, whether it is accessed on social media or on a dedicated porn site.
We know from the Children’s Commissioner’s research that, for instance, Twitter was absolutely the online platform where young people were most likely to have seen pornography. Not Pornhub or one of the big tubes—on Twitter. We also know that children will consistently watch porn on dedicated porn sites. So why do we have inconsistent regulation of pornographic content in the Bill? This is the question I address to my noble friend the Minister. We can and we will get to the debate on how we will do this—indeed, I welcome further discussion with the Minister on how, and encourage him to have conversations across the House on this.
For today, we must look at why we have inconsistent regulation for pornographic content and what that means. As currently drafted, Part 3 services and Part 5 services are not subject to the same duties, as the noble Baroness rightly pointed out. Part 3 services, which include the biggest and most popular pornographic websites, such as Pornhub and Xvideos, as well as sites that host pornographic content, such as Twitter, will not be subject to regulation, including age verification, until secondary legislation is introduced, thereby delaying regulation of the biggest porn sites until at the very least 2025, if not 2026. This will create a massively unlevel playing field which, as others have said, will disincentivise compliance across the board, as well as leaving children with unfettered access to pornography on both social media sites and other user-to-user sites such as Pornhub.
Meanwhile, whichever commercially produced pornography websites are left in Part 5 will, as has already been suggested, simply change their functionality to become user-to-user and avoid regulation for another three years. I have a way in which this can be prevented and the noble Baroness, Lady Ritchie, has her way, but for today I stand with her in asking why the Government think this lack of consistency and fragmentation in the regulation of an industry that destroys childhoods and has implications that reverberate across society are to be accepted.
I look forward to hearing what the Minister has to say. It is clear to me that there is a consensus across the Committee and at every stage of the Bill that pornography should be regulated in a way that is consistent, clear and implemented as quickly as possible following Royal Assent—I have suggested within six months. Therefore, I would welcome discussions with the noble Baroness, Lady Ritchie, the Minister and others to ensure that this can be achieved.
My Lords, I want to inject into the debate some counterarguments, which I hope will be received in the constructive spirit in which they are intended. Primarily, I want to argue that a level playing field is not the right solution here and that there is a strong logic for a graduated response. It is often tempting to dial everything up to 11 when you have a problem, and we clearly do have an issue around child access to pornography. But from a practical point of view, the tools we are giving our regulator are better served by being able to treat different kinds of services differently.
I think there are three classes of service that we are thinking about here. The first is a service with the primary purpose and explicit intent to provide pornography and nothing else. A regime dedicated to those sites is quite appropriate. Such a service might have not just the strongest levels of age verification but a whole other set of requirements, which I know we will debate later, around content verification and all sorts of other things that kick into play. The second category is made up of services that are primarily designed for social interaction which prohibit pornography and make quite strenuous efforts to keep it off. Facebook is such a service. I worked there, and we worked hard to try to keep pornography off. We could not guarantee that it was never present, but that was our intent: we explicitly wanted to be a non-pornographic site. Then there are—as the noble Lord, Lord Bethell, pointed out—other services, such as Twitter, where the primary purpose is social but a significant proportion of adult content is allowed.
I suggest that one of the reasons for having a graduated response is that, from our point of view, we would like services to move towards porn reduction, and for those general-purpose services to prohibit porn as far as possible. That is our intent. If we have a regulatory system that says, “Look, we’re just going to treat you all the same anyway”, we may provide a perverse incentive for services not to move up the stack, as it were, towards a regime where by having less pornographic or sexualised content, they are able to see some benefit in terms of their relationship with the regulator. That is the primary concern I have around this: that by treating everybody the same, we do not create any incentive for people to deal with porn more effectively and thereby get some relief from the regulator.
From a practical point of view, the relationship that the regulator has is going to be critical to making all these things work. Look at what has been happening in continental Europe. There have been some real issues around enforcing laws that have been passed in places such as France and Germany because there has not been the kind of relationship that the regulator needs with the providers. I think we would all like to see Ofcom in a better position, and one of the ways it can do that is precisely by having different sets of rules. When it is talking to a pure pornography site, it is a different kind of conversation from the one it is going to have with a Twitter or a Facebook. Again, they need to have different rules and guidance that are applied separately.
The intent is right: we want to stop under-18s getting on to those pure porn sites, and we need one set of tools to do that. When under-18s get on to a social network that has porn on it, we want the under-18s, if they meet the age requirement, to have access—that is perfectly legitimate—but once they get there, we want them kept out of the section that is adult. For a general-purpose service that prohibits porn, I think we can be much more relaxed, at least in respect of pornography but not in respect of other forms of harmful content—but we want the regulator to be focused on that and not on imposing porn controls. That graduated response would be helpful to the regulator.
Some of the other amendments that the noble Lord, Lord Bethell, has proposed will help us to talk about those kinds of measures—what Twitter should do inside Twitter, and so on—but the amendments we have in front of us today are more about dialling it all up to 11 and not allowing for that graduation. That is the intent I heard from the amendments’ proposers. As I say, that is the bit that, respectfully, may end up being counterproductive.
Could the noble Lord advise us on how he would categorise a site such as Twitter, on which it is estimated that 13% of the page deliveries are to do with pornography? Does it qualify as a pornography site? To me, it is ambiguous. Such a large amount of its financial revenue comes from pages connected with pornography that it seems it has a very big foot in the pornography industry. How would he stop sites gaming definitions to benefit from one schedule or another? Does he think that puts great pressure on the regulator to be constantly moving the goalposts in order to capture who it thinks might be gaming the system, instead of focusing on content definition, which has a 50-year pedigree, is very well defined in law and is an altogether easier status to analyse and be sure about?
The Twitter scenario, and other scenarios of mixed sites, are some of the most challenging that we have to deal with. But I would say, straightforwardly, “Look, 13% is a big chunk, but the primary purpose of Twitter is not the delivery of pornography”. I use Twitter on a daily basis and I have never seen pornography on it. I understand that it is there and that people can go for it, and that is an issue, but I think people out there would say that for most people, most of the time, the primary purpose of Twitter is not pornography.
What we want to do—in answer to the noble Lord’s second point—is create an incentive for people to be recategorised in the right direction. There is an assumption here that it is all going to be about gaming the system. I actually think that there is an opportunity here for genuine changes. There will be a conversation with Twitter. It will be interesting, given Twitter’s current management—apparently it is run by a dog, so there will be a conversation with the dog that runs Twitter. In that conversation, the regulator, Ofcom, on our behalf, will be saying, “You could change your terms of service and get rid of pornography”. Twitter will say yes or no. If it says no, Ofcom will say, “Well, here are all the things we expect you to do in order to wall off that part of the site”.
That is a really healthy and helpful conversation to have with Twitter. I expect it is listening now and already thinking about how it will respond. But it would expect that kind of treatment and conversation to be different; and I think the public would expect that conversation to be a different and better conversation than just saying “Twitter, you’re Pornhub. We’re just going to treat you like Pornhub”.
That is the distinction. As I say, we have an opportunity to get people to be more robust about either limiting or removing pornography, and I fear that the amendments we have in front of us would actually undermine rather than enhance that effort.
At the centre of this is the question of whether we are trying to block the entire service or block at the level of porn content. It is the purpose of a set of amendments in the names of the noble Lord, Lord Bethell, myself and a number of other noble Lords to do exactly the latter. But I have to say to the noble Baroness that I am very much in sympathy with, first, putting porn behind an age gate; secondly, having a commencement clause; and, thirdly and very importantly—this has not quite come up in the conversation—saying that harms must be on the face of the Bill and that porn is not the only harm. I say, as a major supporter of the Bereaved Families for Online Safety, that “Porn is the only harm children face” would be a horrendous message to come from this House. But there is nothing in the noble Baroness’s amendments, apart from where the action happens, that I disagree with.
I also felt that the noble Baroness made an incredibly important point when she went into detail on Amendment 125A. I will have to read her speech in order to follow it, because it was so detailed, but the main point she made is salient and relates to an earlier conversation: the reason we have Part 5 is that the Government have insisted on this ridiculous thing about user-to-user and search, instead of doing it where harm is. The idea that you have Part 5, which is to stop the loophole of sites that do not have user-to-user, only to find that they can add user-to-user functionality and be another type of site, is quite ludicrous. I say to the Committee and the Minister, who I am sure does not want me to say it, “If you accept Amendment 2, you’d be out of that problem”—because, if a site was likely to be accessed by children and it had harm and we could see the harm, it would be in scope. That is the very common-sense approach. We are where we are, but let us be sensible about making sure the system cannot be gamed, because that would be ludicrous and would undermine everybody’s efforts—those of the Government and of all the campaigners here.
I just want to say one more thing because I see that the noble Lord, Lord Moylan, is back in his place. I want to put on the record that age assurance and identity are two very separate things. I hope that, when we come to debate the package of harms—unfortunately, we are not debating them all together; we are debating harms first, then AV—we get to the bottom of that issue because I am very much in the corner of the noble Lord and the noble Baroness, Lady Fox, on this. Identity and age assurance must not be considered the same thing by the House, and definitely not by the legislation.
My Lords, I add my support for all the amendments in this group. I thank the noble Baroness, Lady Ritchie, for bringing the need for the consistent regulation of pornographic content to your Lordships’ attention. Last week, I spoke about my concerns about pornography; I will not repeat them here. I said then that the Bill does not go far enough on pornography, partly because of the inconsistent regulation regimes between Part 3 services and Part 5 ones.
In February, the All-Party Parliamentary Group on Commercial Sexual Exploitation made a series of recommendations on the regulation of pornography. Its first recommendation was this:
“Make the regulation of pornography consistent across different online platforms, and between the online and offline spheres”.
It went on to say:
“The reforms currently contained in the Online Safety Bill not only fail to remedy this, they introduce further inconsistencies in how different online platforms hosting pornography are regulated”.
This is our opportunity to get it right but we are falling short. The amendments in the name of the noble Baroness, Lady Ritchie, go to the heart of the issue by ensuring that the duties that currently apply to Part 5 services will also apply to Part 3 services.
Debates about how these duties should be amended or implemented will be dealt with later on in our deliberations; I look forward to coming back to them in detail then. Today, the question is whether we are willing to have inconsistent regulation of pornographic content across the services that come into the scope of the Bill. I am quite sure that, if we asked the public in an opinion poll whether this was the outcome they expected from the Bill, they would say no.
An academic paper published in 2021 reported on the online viewing habits of 16 and 17 year-olds. It found that pornography was much more frequently viewed on social media, showing that the regulation of such sites remains important. The impact of pornography is no different whether it is seen on a social media or pornography site with user-to-user facilities that falls within Part 3 or on a site that has only provider content that would fall within Part 5. There should not be an either/or approach to different services providing the same content, which is why I think that Amendment 125A is critical. If all pornographic content is covered by Part 5, what does and does not constitute user-generated material ceases to be our concern. Amendment 125A highlights this issue; I too look forward to hearing the Minister’s response.
There is no logic to having different regulatory approaches in the same Bill. They need to be the same and come into effect at the same time. That is the simple premise of these amendments; I fully support them.
My Lords, earlier today the noble Baroness, Lady Benjamin, referred to a group of us as kindred spirits. I suggest that all of us contributing to this debate are kindred spirits in our desire to see consistent outcomes. All of us would like to see a world where our children never see pornography on any digital platform, regardless of what type of service it is. At the risk of incurring the ire of my noble friend Lord Moylan, we should have zero tolerance for children seeing and accessing pornography.
I agree with the desire to be consistent, as the noble Baroness, Lady Ritchie, and the noble Lord, Lord Browne, said, but it is consistency in outcomes that we should focus on. I am very taken with the point made by the noble Lord, Lord Allan, that we must be very careful about the unintended consequences of a consistent regulatory approach that might end up with inconsistent outcomes.
When we get to it later—I am not sure when—I want to see a regulatory regime that is more like the one reflected in the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. We need in the Bill a very clear definition of what age assurance and age verification are. We must be specific on the timing of introducing the regulatory constraints on pornography. We have all waited far too long for that to happen and that must be in the Bill.
I am nervous of these amendments that we are debating now because I fear other unintended consequences. Not only does this not incentivise general providers, as the noble Lord, Lord Allan, described them, to remove porn from their sites but I fear that it incentivises them to remove children from their sites. That is the real issue with Twitter. Twitter has very few child users; I do not want to live in a world where our children are removed from general internet services because we have not put hard age gates on the pornographic content within them but instead encouraged those services to put an age gate on the front door. Just as the noble Lord, Lord Allan, said earlier today, I fear that, with all the best intentions, the desire to have consistent outcomes and these current amendments would regulate the high street rather than the porn itself.
My Lords, there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us. It is probably about the construction of the Bill, rather than the duties that we are imposing.
It is a pleasure again to follow the noble Baroness, Lady Harding. If you take what my noble friend Lord Allan said about a graduated response and consistent outcomes, you then get effective regulation.
I thought that the noble Baroness, Lady Kidron, had it right. If we passed her amendments in the second group, and included the words “likely to be accessed”, Clause 11 would bite and we would find that there was consistency of outcomes for primary priority content and so on, and we would then find ourselves in much the same space. However, it depends on the primary purpose. The fear that we have is this. I would not want to see a Part 5 service that adds user-generated content then falling outside Part 5 and finding itself under Part 3, with a different set of duties.
I do not see a huge difference between Part 3 and Part 5, and it will be very interesting when we come to debate the later amendments tabled by the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron. Again, why do we not group these things together to have a sensible debate? We seem to be chunking things up in a different way and so will have to come back to this and repeat some of what we have said. However, I look forward to the debate on those amendments, which may be a much more effective way of dealing with this than trying to marry Part 5 and Part 3.
I understand entirely the motives of the noble Baroness, Lady Ritchie, and that we want to ensure that we capture this. However, it must be the appropriate way of regulating and the appropriate way of capturing it. I like the language about consistent outcomes without unintended consequences.
My Lords, this has been a very helpful debate and I hope it sets the Committee up for when we return to these issues. As the noble Lord, Lord Clement-Jones, just said, it is about having appropriate regulation that does the job that we want. I feel from this debate, as I have felt before, that we are in agreement about what we want; the question, as ever, is how we get there.
The noble Lord, Lord Allan of Hallam, spoke well on the practicalities of the different ways that pornography is accessible and, because they are so different, the need to respond differently. An example is Twitter, which is primarily a social network but whose content can be inappropriate when accessed by a group who should not be accessing it—children, in this case. It is important that the way this is approached does not take away the ability of Twitter, for example, to do the job that it is there to do but does protect those who need to be protected. The words that came to mind were that regulation needs to be fit for purpose, but the question is what the purpose is and how we make it fit for it.
I am grateful to all noble Lords who have spoken today. The noble Baroness, Lady Harding, spoke of consistency of outcome. That is a very good place from which to look backwards to see what is required. The noble Baroness, Lady Kidron, was right to say that we must not send out the message that pornography is, somehow, the only harm or that there is a hierarchy of harms. In my view, that is all we are debating at this stage: pornography is not the only harm, nor is it of a higher order than other harms.
I would like to say how grateful I am to my noble friend Lady Ritchie of Downpatrick, who was supported in the Chamber by the noble Lord, Lord Browne, on behalf of his noble friend the noble Lord, Lord Morrow, who put his name to some of these amendments. I am grateful because the debate in this area facilitated an early debate on the issue of regulation and online pornography, and did it thoroughly. It raised a number of questions that we will need to address when debating later amendments.
There is no denying the damage that can be caused by young people readily having access to pornographic content online. They see material that it would be illegal for them to see offline. If we have already dealt with this offline, our challenge is to protect children and young people in the same way online. However, as we will discuss later and probably at some length, this side of the House does not accept that access to illegal pornography is the only issue affecting how children can and should use the internet. Exposure to pornographic content changes young people’s perceptions of sexual activity and, in the worst cases, can contribute to sexual assault. Even in cases where there is consent, evidence is available that shows that depictions of certain high-risk activities in pornographic material mean that many more people are engaging in, for example, choking and asphyxiation, with the predictable but tragic outcome of permanent injury or even death.
Having said that, later we will be debating measures that need to be put in place to protect children under 18 from accessing sites that they are likely to encounter. We need to ensure that age-appropriate design is the keystone to the protection of children online. We are relying heavily on effective terms of service to protect vulnerable adults from accessing material which would cause them harm, and that issue definitely needs more debate.
Pornography has an influence on adult sexual behaviour and, regardless of our own personal views, we have to remember that much adult content is in fact perfectly legal and, for whatever reason, it is also very popular. While some of the most widely used user-to-user platforms have opted not to carry adult material, there are others, as we have heard in the debate, such as Twitter and Reddit, that do allow users to share explicit but legal content. There has been an explosion in the number of so-called content creators who upload their own material to sites such as OnlyFans. There has also been an explosion in user-to-user services such as Twitter, which I would presume to be the very valid motivation behind Amendment 183A.
Steps taken to restrict child access to adult content and user-to-user platforms are often easy to bypass, so the question of whether such services should be within the scope of Part 5 is indeed a valid one. There are some platforms that do take their responsibilities seriously, with OnlyFans having engaged with the topic of online safety long before it would be compelled to do so, but others have not. So, on that basis, it is clear that we cannot continue with the status quo, given the ever-increasing risk that illegal material does not get taken down by algorithms and automated moderation.
We recognise that the Government have had their own reasons for not implementing Part 3 of the Digital Economy Act. That decision was disappointing, and in fact, the disappointment was made even worse by repeated empty promises, dither and delay. However, the department clearly recognises the issue, which is a welcome first step, and it is not clear that simply rerunning the arguments from the DEA is going to bear fruit this time round. This Bill is largely apolitical, and colleagues on all sides of the House, including from these Benches, have the opportunity to come together; we have the opportunity to agree a way forward to protect children, to reduce exposure to extreme forms of pornography but ultimately to allow adults to consume pornography if they wish to do so. That is the challenge that we have.
These Benches support robust age verification for access to pornographic content, but it is vital that these systems are secure and take appropriate steps to preserve the user’s privacy. The questions raised in this group are extremely valid, and the proposals presented by other colleagues, including the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, deserve very serious consideration. We hope that the Minister can demonstrate in his response that progress is being made.
My Lords, first, I will address Amendments 12BA, 183A and 183B, tabled by the noble Baroness, Lady Ritchie of Downpatrick, who I was grateful to discuss them with earlier today, and the noble Lord, Lord Morrow, whose noble friend, the noble Lord, Lord Browne of Belmont, I am grateful to for speaking to them on his behalf.
These amendments seek to apply the duties in Part 5 of the Bill, which are focused on published pornographic content, to user-generated pornography. Amendments 183A and 183B are focused particularly on making sure that children are protected from user-to-user pornography in the same way as from published pornography, including through the use of age verification. I reassure the noble Baroness and the noble Lord that the Government share their concerns; there is clear evidence about the impact of pornography on young people and the need to protect children from it.
This is where I come to the questions posed earlier by the noble Lord, Lord McCrea of Magherafelt and Cookstown. The research we commissioned from the British Board of Film Classification assessed the functionality of and traffic to the UK’s top 200 most visited pornographic websites. The findings indicated that 128 of the top 200 most visited pornographic websites—that is just under two-thirds, or 64%—would have been captured by the proposed scope of the Bill at the time of the Government’s initial response to the online harms White Paper, and that represents 85% of the traffic to those 200 websites.
Since then, the Bill’s scope has been broadened to include search services and pornography publishers, meaning that children will be protected from pornography wherever it appears online. The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk to children, such as online pornography. Age-assurance technologies and other measures will be used to provide children with an age-appropriate experience on their service.
As noble Lords know, the Bill does not mandate that companies use specific approaches or technologies when keeping children safe online as it is important that the Bill is future-proofed: what is effective today might not be so effective in the future. Moreover, age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties under the Bill. For instance, if a user-to-user service, such as a social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. That would allow content to be better detected and removed, instead of restricting children from a service that is designed to be appropriate for their use—as my noble friend Lady Harding of Winscombe puts it, avoiding the situation where children are removed from these services altogether.
While I am sympathetic to the aims of these amendments, I assure noble Lords that the Bill already has robust, comprehensive protections in place to keep children safe from all pornographic content, wherever or however it appears online. These amendments are therefore unnecessary because they duplicate the existing provisions for user-to-user pornography in the child safety duties in Part 3.
It is important to be clear that, wherever they are regulated in the Bill, companies will need to ensure that children cannot access pornographic content online. This is made clear, for user-to-user content, in Clause 11(3); for search services, in Clause 25(3); and for published pornographic content in Clause 72(2). Moving the regulation of pornography from Part 3 to Part 5 would not be a workable or desirable option because the framework is effective only if it is designed to reflect the characteristics of the services in scope.
Part 3 has been designed to address the particular issues arising from the rapid growth in platforms that allow the sharing of user-generated content but are not the ones choosing to upload that content. The scale and speed of dissemination of user-generated content online demands a risk-based and proportionate approach, as Part 3 sets out.
It is also important that these companies understand the risks to children in the round, rather than focusing on one particular type of content. Risks to children will often be a consequence of the design of these services—for instance, through algorithms, which need to be tackled holistically.
I know that the noble Baroness is concerned about whether pornography will indeed be designated as primary priority content for the purposes of the child safety duties in Clauses 11(3) and 25(3). The Government fully intend this to be the case, which means that user-to-user services will need to have appropriate systems to prevent children accessing pornography, as defined in Clause 70(2).
The approach taken in Part 3 is very different from services captured under Part 5, which are publishing content directly, know exactly where it is located on their site and already face legal liability for the content. In this situation the service has full control over its content, so a risk-based approach is not appropriate. It is reasonable to expect that service to prevent children accessing pornography. We do not therefore consider it necessary or effective to apply the Part 5 duties to user-to-user pornographic content.
I also assure the noble Baroness and the noble Lord that, in a case where a provider of user-to-user services is directly publishing pornographic content on its own service, it will already be subject to the Part 5 duties in relation to that particular content. Those duties in relation to that published pornographic content will be separate from and in addition to their Part 3 duties in relation to user-generated pornographic content.
This means that, no matter where published pornographic content appears, the obligation to ensure that children are not normally able to encounter it will apply to all in-scope internet service providers that publish pornographic content. This is made clear in Clause 71(2) and is regardless of whether they also offer user-to-user or search services.
As set out in a recent letter to your Lordships, Ofcom will prioritise protecting children from pornography and other harmful content. In the autumn it intends to publish draft guidance for Part 5 pornography duties and draft codes of practice for Part 3 illegal content duties, including for child sexual exploitation and abuse content. Draft codes of practice for children’s safety duties will follow in summer 2024. These elements of the regime are being prioritised ahead of others, such as category 1 duties, to reflect the critical importance of protecting children.
It is right that Ofcom consult on Part 5 guidance as quickly as possible, to protect children from accessing pornography on Part 5 services. Part 5 guidance is focused entirely on the provision of pornography, whereas codes of practice under Part 3 are significantly more complex, as they deal with other forms of harmful content and so require longer to develop. This may mean there will be a limited period of time during which Part 5 protections are in place ahead of those in Part 3. It would not be right to delay the Part 5 consultation on that basis.
As the Bill makes clear, we expect companies to use technology such as age verification to prevent children accessing pornography, whether it is user-generated or published. Any technology used to comply with the Bill will need to be effective in accurately identifying the age of users. Ofcom will be able to take enforcement action if a company uses inadequate technological solutions. But, as I mentioned earlier, the Bill will not mandate specific approaches or technologies.
In her Amendment 125A, the noble Baroness, Lady Ritchie of Downpatrick, raises concerns that a provider of pornographic content could move from being a Part 5 service to a Part 3 service if they allow comments or reviews on their content. I am grateful to her for raising and discussing the issue earlier. Amendment 125A in her name intends to narrow—
I am sorry, but can the Minister just clarify that? Is he saying that it is not possible to be covered by both Part 3 and Part 5, so that where a Part 5 service has user-generated content it is also covered by Part 3? Can he clarify that you cannot just escape Part 5 by adding user-generated content?
Yes, that is correct. I was trying to address the points raised by the noble Baroness, but the noble Lord is right. On the question of whether people might try to be treated differently by allowing comments or reviews on their content, the answer is that they would be treated in the same way. That is the motivation behind the noble Baroness’s amendment to narrow the definition. There is no risk that a publisher of pornographic content could evade its Part 5 duties by enabling comments or reviews on its content. That would be the case whether or not those reviews contained words, non-verbal indications that a user liked something, emojis or any other form of user-generated content.
That is because the Bill has been designed to confer duties on different types of content. Any service with provider pornographic content will need to comply with the Part 5 duties to ensure that children cannot normally encounter such content. If they add user-generated functionality—
Twitter is covered in the Bill. I am not quite sure what my noble friend is asking me. The harms that he is worried about are covered in different ways. Twitter or another social medium that hosts such content would be hosting it, not publishing it, so would be covered by Part 3 in that instance.
Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.
I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.
My Lords, this has been a very wide-ranging debate, concentrating not only on the definition of pornography but on the views of noble Lords in relation to how it should be regulated, and whether it should be regulated, as the noble Baroness, Lady Kidron, the noble Lords, Lord Bethell and Lord Browne, and I myself believe, or whether it should be a graduated response, which seems to be the view of the noble Lords, Lord Allan and Lord Clement-Jones.
I believe that all pornography should be treated the same; there should be no graduated response. It is something that is pernicious and leads to unintended consequences for many young people, so it needs to be regulated in all its forms. I think that is the point that the noble Lord, Lord Bethell, was making. I believe that these amendments should have been debated along with those of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, because then we could have had an even wider-ranging debate, and I look forward to that in the further groups in the days to come. The focus should be on the content, not on the platform, and that content is pornography.
I agree with the noble Baroness, Lady Kidron, that porn is not the only harm, and I will be supporting her amendments. I believe that they should be in the Bill because if we are serious about dealing with these issues, they have to be in there.
I do not think my amendments are suggesting that children will be removed from social media. I agree that it is a choice to remove pornography or to age-gate. Twitter is moving to subscriber content anyway, so it can do it; the technology is already available to do that. I believe you just age-gate the porn content, not the whole site. I agree with the noble Lord, Lord Clement-Jones, as I said. These amendments should have been debated in conjunction with those of the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, as I believe that the amendments in this group are complementary to those, and I think I already said that in my original submission.
I found the Minister’s response interesting. Obviously, I would like time to read Hansard. I think certain undertakings were given, but I want to see clearly spelled out where they are and to discuss with colleagues across the House where we take these issues and what we come back with on Report.
I believe that these issues will be debated further in Committee when the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, are debated. I hope that in the intervening period the Minister will have time to reflect on the issues raised today about Parts 3 and 5 and the issue of pornography, and that he will be able to help us in further sessions in assuaging the concerns that we have raised about pornography. There is no doubt that these issues will come back. The only way that these issues can be dealt with, that pornography can be dealt with and that all our children throughout the UK can be protected is through proper regulation.
I think we all need further reflection. I will see, along with colleagues, whether it is possible to come back on Report. In the meantime, I beg leave to withdraw the amendment.
Amendment 12BA withdrawn.
Amendments 12C to 12E
12C: Clause 6, page 5, line 23, at end insert “(2) to (10)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 11 below (because the new duty to summarise children’s risk assessments in the terms of service is only imposed on providers of Category 1 services).
12D: Clause 6, page 5, line 25, at end insert—
“(za) the duty about illegal content risk assessments set out in section 9(8A),
(zb) the duty about children’s risk assessments set out in section 11(10A),”
Member’s explanatory statement
This amendment ensures that the new duties set out in the amendments in the Minister’s name to clauses 9 and 11 below (duties to summarise risk assessments in the terms of service) are imposed on providers of Category 1 services only.
12E: Clause 6, page 5, line 32, at end insert “, and
(f) the duty about record-keeping set out in section 19(8A).”
Member’s explanatory statement
This amendment ensures that the new duty set out in the amendment in the Minister’s name to clause 19 below (duty to supply records of risk assessments to OFCOM) is imposed on providers of Category 1 services only.
Amendments 12C to 12E agreed.
House adjourned at 9.55 pm.