Online Safety Bill

Volume 830: debated on Tuesday 23 May 2023

Committee (8th Day)

Relevant documents: 28th Report from the Delegated Powers Committee

Clause 38: Procedure for issuing codes of practice

Amendment 110

Moved by

110: Clause 38, page 38, line 24, leave out subsections (2) to (8) and insert—

“(2) Upon receiving the draft code of practice from OFCOM, the Secretary of State must—
(a) make a statement confirming they have received the draft code of practice, and
(b) lay the draft code of practice before Parliament.
(3) Unless the Secretary of State intends to give a direction to OFCOM under section 39(1) in relation to the draft, regulations giving effect to the code of practice may not be laid before Parliament unless the Secretary of State has—
(a) consulted each devolved authority on the content of the draft code of practice;
(b) produced an impact assessment including, but not limited to, an assessment of the impact of the proposed regulations on—
(i) human rights and equalities,
(ii) freedom of expression, and
(iii) employment and labour; and
(c) produced an assessment of the impact of the proposed regulations on children and vulnerable adults.
(4) The Secretary of State may not make regulations under this section until any select committee charged by the relevant House of Parliament with scrutinising regulations made under this section has—
(a) completed its consideration of the draft code of practice and the impact assessments referred to in subsection (3)(b) and (c), and
(b) reported on its deliberation to the relevant House; and
the report of the committee has been debated in that House, or the period of six weeks beginning on the day on which the committee reported has elapsed.
(5) The Secretary of State may not lay regulations under this section until they are satisfied that—
(a) issues raised by a devolved authority have been resolved, or
(b) if they have not been resolved, the Secretary of State has informed Parliament of the steps they intend to take in response to the issues raised.”
Member’s explanatory statement

This amendment, which replaces most of the current Clause 38, would require the Secretary of State to publish draft codes of conduct from OFCOM for consideration by relevant committees of both Houses of Parliament.

My Lords, I rise to move Amendment 110 in my name and thank the noble Lord, Lord Clement-Jones, for his support. This is a complex group of amendments but they are about very significant powers that are supposed to be granted to the Secretary of State in this Bill. We believe that this part of the Bill must be significantly amended before it leaves this House, and while we await the Government’s response to the amendments in my name and that of the noble Baroness, Lady Stowell, I want to make it clear that if we do not see some significant movement from the Government we will return to these issues on Report. As it looks as though we will be having another long hiatus before Report, there is plenty of time for discussion and agreement.

Two House of Lords committees—the Communications and Digital Committee and the Delegated Powers and Regulatory Reform Committee—have called on the Government to remove or amend a number of the clauses engaged by these amendments, and a third, the Constitution Committee, has noted the concerns raised. I think it fair to say that these issues concern all parties and all groups in the House and urgently need addressing. The noble Baroness, Lady Stowell, in her capacity as chair of the Communications and Digital Committee, has a number of amendments very similar to mine to which I and others have signed up, and which I know she will go through in detail. I support the line she and the committee are taking, although I make some additional suggestions in some areas.

The amendments from the noble Lord, Lord Moylan—who I am sad to see is not in his place and who will not therefore be able to participate in this debate—broadly support the thrust of the amendments in this group. Perhaps they do not go quite as far as ours do, but it is certainly nice to have him on our side—for a change. I do not want to delay the Committee as I know many of us will want to discuss the points which will be raised in detail by the noble Baroness, Lady Stowell, so I think the best thing is for me to talk more generally about where we think the Government need to change approach, and I hope my remarks will open up the debate.

Before I do that, I thank the Carnegie Trust—I know a number of noble Lords have received documentation from it—for its detailed work in this area in particular, but it has covered the Bill comprehensively. It has been invaluable and we have also received support from the All-Party Digital Regulation Group, which has been pushing information around as well.

We have mentioned in the past the difficulty of amending the Bill because of the structures and the different way it treats the various types of company likely to be in scope. But, in essence, my amendments would ensure that Ofcom is able to operate as an independent regulator, delivering what is required of it under the Bill, and is not subject to instruction or direction by the Secretary of State except in exceptional circumstances. We are told that these will be restricted mainly to national security issues or public safety, though precisely what those issues are going to be needs spelling out in the Bill.

The Secretary of State should not be able to give Ofcom direction. In the broadcasting regime, there are no equivalent powers. Our press is not regulated in that way. We believe that the right approach is that the Secretary of State should, if he or she wishes, write to Ofcom with non-binding observations when it is thought necessary to do so. It would be for Ofcom to have regard to such letters, but there should be no requirement to act, provided that it operates within its powers as set out in the Bill. It follows that the powers taken by the Secretary of State in Clause 156 to issue directions to Ofcom in special circumstances, in Clause 157 to issue detailed tactical guidance to Ofcom in the exercise of its functions, and in Clause 153, which allows the Secretary of State to make a statement of strategic priorities relating to online safety, are significant threats to the independence of Ofcom, and we believe that they should be deleted. In addition, Clauses 38 and 39 need to be revised.

The independence of media regulators is important and must be preserved as it is at present. That is the norm in most developed democracies. The UK has signed many international statements in this vein, including, as recently as in April 2022 at the Council of Europe, a statement saying that

“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power”.

I hope that when he comes to respond to the debate, the Minister will confirm that he stands by that international agreement that his Government have signed up to.

My second point deals with the other powers given to the Secretary of State in the Online Safety Bill—for example, to specify in regulations the primary priority content harmful to children and priority content harmful to children in Clause 54; to amend the duties on fraudulent advertising in Clause 191; to change the exemption to the regime in Clause 192; and to amend the list of terrorism offences, CSEA offences and other priority offences in Clause 194. Appropriate procedures for the exercise of these powers—ensuring that they are in line with the approach of this group of amendments—need to be set out in the Bill, because the present drafting is, in our view, inadequate. The reliance on conventional secondary legislation approval mechanisms will not be sufficient given the scale and impact of what is in contemplation.

At Second Reading, the Minister said,

“we remain committed to ensuring that Ofcom maintains its regulatory independence, which is vital to the success of this framework … We intend to bring forward two changes to the existing power: first, replacing the ‘public policy’ wording with a defined list of reasons that a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances … the framework ensures that Parliament will always have the final say on codes of practice, and that strong safeguards are in place to ensure that the use of this power is transparent and proportionate”.—[Official Report, 1/2/23; cols. 691-2.]

Those are fine words but, unfortunately, we have not yet seen the draft amendments that would give credence to that statement. Can the Minister give us any hint on the timetable?

My third point is that we are also not convinced that the processes currently specified for the approval of the high volume of secondary legislation pursuant to the Bill, including the codes of practice, engage sufficiently with Parliament. As my noble friend Lady Merron said at Second Reading, in our view the Bill suffers from an imbalance around what role Parliament should have in scrutinising the new regime and how changes to the statutory functions will be accommodated in future years. We can all agree that there will certainly be many more such occasions and more legislation in this area in future years.

This is, of course, a skeleton Bill, requiring significant amounts of secondary legislation before it begins to bite. How should Parliament be involved, both in the necessary scrutiny of those codes of practice, which put the regime into practice and define the way in which the regulated companies are to operate, and in anticipating changes that will be required as technology develops? It is to answer this question that I have put down a number of amendments aimed at carving out a role for the Select Committees of the two Houses—or perhaps a new Joint Committee, if that were to be the decision of Parliament. Indeed, that was a recommendation of the pre-legislative scrutiny committee and the Communications and Digital Committee in previous reports.

My Amendment 290, after Clause 197, tries to gather together the instances of powers exercisable by the Secretary of State and provide an additional parliamentary stage each time those powers are exercised. This would require that:

“The Secretary of State may not exercise the powers”

granted under the Bill unless and until

“any select committee charged by the relevant House of Parliament with scrutinising such regulations has … completed its consideration of the draft regulations and … reported on their deliberation to the relevant House”.

I appreciate that this is a major step. Introducing parliamentary scrutiny of this type may mean it takes more time to achieve results in what is already a complex process. Maybe this should be introduced in stages so as not to delay further the measures in the Bill.

The idea of engaging the Select Committees of Parliament is not unprecedented. It was introduced in a similar form as the Grimstone rule, using an agreed statement from the Dispatch Box setting out a commitment by the Government for the procedure for Select Committee consideration of trade agreements in both Houses under the international trade Bill. Similar issues have been raised recently in other Bills this Session. Do we like the sound of a parallel Parkinson rule? The noble Lord smiled—he must be pleased.

At heart, I recognise that this is in principle no more than ensuring that the expertise and knowledge of those who have served in an appropriate parliamentary Select Committee are grafted on to the normal affirmative or negative approval mechanisms for secondary legislation, but I also think it opens up a substantial new way of doing what has, on many occasions, been merely a rubber-stamping of what can be rather significant policy changes. It also gives a good opportunity to bring Parliament and parliamentarians into the policy delivery mechanism in what seems to me to be a satisfying way. It makes sense to do this for a complex new regime in a fast-changing technological environment such as the one that the Bill is ushering in, but it might have other applications, particularly consideration of other legislation that is currently in the pipeline. I beg to move.

My Lords, it is a great pleasure to follow the noble Lord, Lord Stevenson. I am grateful to him, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, for their support for my amendments, which I will come to in a moment. Before I do, I know that my noble friend Lord Moylan will be very disappointed not to be here for the start of this debate. From the conversation I had with him last week when we were deliberating the Bill, I know that he is detained on committee business away from the House. That is what is keeping him today; I hope he may join us a bit later.

Before I get into the detail of my amendments, I want to take a step back and look at the bigger picture. I remind noble Lords that on the first day in Committee, when we discussed the purpose of the Bill, one of the points I made was that, in my view, the Bill is about increasing big tech’s accountability to the public. For too long, and I am not saying anything that is new or novel here, it has enjoyed power beyond anything that other media organisations have enjoyed—including the broadcasters, which, as we know, have been subject to regulation for a long time now. I say that because, in my mind, the fundamental problem this legislation seeks to address is the lack of accountability of social media and tech platforms to citizens and users for the power and influence they have over our lives and society, as well as their economic impact. The latter will be addressed via the Digital Markets, Competition and Consumers Bill.

I emphasise “if that is the problem”, because when we talk about this bit of the Bill and the amendments we have tabled, we have started—and I am as guilty of this as anyone else—to frame it very much as if the problem is around the powers for the Secretary of State. In my view, we need to think about why they are not, in the way they are currently proposed, the right solution to the problem that I have outlined.

I do not think what we should be doing, as some of what is proposed in the Bill tends to do, is shift the democratic deficit from big tech to the regulator, although, of course, like all regulators, Ofcom must serve the public interest as a whole, which means taking everyone’s expectations seriously in the way in which it goes about its work.

That kind of analysis of the problem is probably behind some of what the Government are proposing by way of greater powers for the Secretary of State for oversight and direction of the regulator in what is, as we have heard, a novel regulatory space. I think that the problem with some, although not all, of the new powers proposed for the Secretary of State is that they would undermine the independence of Ofcom and therefore dilute the regulator’s authority over the social media and tech platforms, and that is in addition to what the noble Lord, Lord Stevenson, has already said, which is that there is a fundamental principle about the independence of media regulators in the western world that we also need to uphold and to which the Government have already subscribed.

If that is the bigger picture, my amendments would redress the balance between the regulator and the Executive, but there remains the vital role of Parliament, which I will come back to in a moment and which the noble Lord, Lord Stevenson, has already touched on, because that is where we need to beef up oversight of regulators.

Before I get into the detail, I should also add that my amendments have the full authority of your Lordships’ Communications and Digital Select Committee, which I have the great honour of chairing. In January, we took evidence from my noble friend the Minister and his colleague, Paul Scully, and our amendments are the result of their evidence. I have to say that my noble friend on the Front Bench is someone for whom I have huge respect and admiration, but on that day when the Ministers were before us, we found as a committee that the Government’s evidence in respect of the powers that they were proposing for the Secretary of State was not that convincing.

I shall outline the amendments, starting with Amendments 113, 114, and 115. I am grateful to other noble Lords who have signed them, which demonstrates support from around the House. The Bill allows the Secretary of State to direct Ofcom to change its codes of practice on regulating social media firms for reasons of public policy. While it is legitimate for the Government to set strategic direction, this goes further and amounts to direct and unnecessary interference. The Government have suggested clarifying this clause, as we have heard, with a list of issues such as security, foreign policy, economic policy and burden to business, but it is our view as a committee that the list of items is so vague and expansive that almost anything could be included in it. Nor does it recognise the fact that the Government should respect the separation of powers between Executive and regulator in the first place, as I have already described. These amendments would therefore remove the Secretary of State’s power to direct Ofcom for reasons of public policy. Instead, the Secretary of State may write to Ofcom with non-binding observations on issues of security and child safety to which it must have regard. It is worth noting that under Clause 156 the Secretary of State still has powers to direct Ofcom in special circumstances to address threats to public health, safety and security, so the Government will not be left toothless, although I note that the noble Lord, Lord Stevenson, is proposing to remove Clause 156. Just to be clear, the committee is not proposing removing Clause 156; that is a place where the noble Lord and I propose different remedies.

Amendments 117 and 118 are about limiting the risk of infinite ping-pong. As part of its implementation work, Ofcom will have to develop codes of practice, but the Government can reject those proposals infinitely if they disagree with them. At the moment that would all happen behind closed doors. In theory, this process could go on for ever, with no parliamentary oversight. The Select Committee and I struggle to understand why the Government see this power as necessary, so our amendments would remove the Secretary of State’s power to issue unlimited directions to Ofcom on a draft code of practice, replacing it with a maximum of two exchanges of letters.

Amendment 120, also supported by the noble Lords I referred to earlier, is closely related to previous amendments. It is designed to improve parliamentary oversight of Ofcom’s draft codes of practice. Given the novel nature of the proposals to regulate the online world, we need to ensure that the Government and Ofcom have the space and flexibility to develop and adapt their proposals accordingly, but there needs to be a role for Parliament in scrutinising that work and being able to hold the Executive and regulator to account where needed. The amendment would ensure that the affirmative procedure, and not the negative procedure currently proposed in the Bill, was used to approve Ofcom’s codes of practice if they had been subject to attempts by the Secretary of State to introduce changes. This amendment is also supported by the Delegated Powers and Regulatory Reform Committee in its report.

Finally, Amendment 257 would remove paragraph (a) from Clause 157(1). This is closely related to previous amendments regarding the Secretary of State’s powers. The clause currently provides powers to provide wide-ranging guidance to Ofcom about how it carries out its work. This is expansive and poorly defined, and the committee again struggled to see the necessity for it. The Secretary of State already has extensive powers to set strategic priorities for Ofcom, establish expert advisory committees, direct action in special circumstances, direct Ofcom about its codes or just write to it if my amendments are accepted, give guidance to Ofcom about its media literacy work, change definitions, and require Ofcom to review its codes and undertake a comprehensive review of the entire online safety regime. Including yet another power to give unlimited guidance to Ofcom about how it should carry out its work seems unnecessary and intrusive, so this amendment would remove it, by removing paragraph (a) of Clause 157(1).

I hope noble Lords can see that, even after taking account of the amendments that the committee is proposing, the Secretary of State would be left with substantial and suitable powers to discharge their responsibilities properly.

Perhaps I may comment on some of the amendments to which I have not added my name. Amendment 110 from the noble Lords, Lord Stevenson and Lord Clement-Jones, and Amendment 290 from the noble Lord, Lord Stevenson, are about parliamentary oversight by Select Committees. I do not support the detail of these amendments nor the procedures proposed, because I believe they are potentially too cumbersome and could cause too much delay to various processes. As I have already said, and as the noble Lord, Lord Stevenson, said in opening, the Select Committee and I are concerned to ensure that there is adequate parliamentary oversight of Ofcom as it implements this legislation over the next few years. My committee clearly has a role in this, alongside the new DSIT Select Committee in the House of Commons and perhaps others, but we need to guard against duplication and fragmentation.

There are bigger questions about the parliamentary oversight of regulators in the wider digital field, as none of this is sector-specific and it also affects the CMA—we will see that magnified when we get to the digital markets Bill—so I do not think parliamentary oversight is something that we can just ignore. It is an issue of growing importance when it comes to regulators, particularly those that are regulating in areas that are new and different and require a different kind of approach by those regulators. As I said at the start, we need to get the distribution of power right between big tech, the regulators, the Government and Parliament if we are to achieve what I think is our ultimate aim and purpose: greater accountability to the public at large for the technology that has so much power at so many levels of our individual and national life.

In reply to my letter to the Secretary of State in January—with my committee hat on—which is available on the committee’s website, the Minister here and Paul Scully, the Minister in the other place, indicated a willingness to discuss my amendments after Committee. I hope my noble friend and his colleagues will honour that commitment and the Government will accept my amendments. While there are other amendments in this group that would provide interesting solutions—as the noble Lord, Lord Stevenson, has said, some of them go a bit further than those that I am proposing—what the committee is proposing represents a measured and appropriate approach, and I hope the Government take it seriously. I look forward to discussing that further with the Minister.

I also hope that the Government support Parliament in enhancing its oversight of the regulators in which so much power is being vested. However expert, independent and professional they may be—I note that my noble friend Lord Grade is not in the Chamber today, as I believe he is overseas this week, but no one respects and admires my noble friend more than I do, and I am not concerned in any way about the expertise and professionalism of Ofcom—none the less we are in a situation where they are being vested with a huge amount of power and we need to make sure that the oversight of them is right. Even if I do not support that which is specifically put forward by the noble Lord, Lord Stevenson, this is an area where we need to move forward but we need the Government to support us in doing so if we are going to make it happen. I look forward to what my noble friend has to say in response to this group.

My Lords, I have put my name to Amendments 113, 114, 117, 118, 120 and 257. As the noble Baroness, Lady Stowell, has said, it is crucial that Ofcom both has and is seen to have complete independence from political interference when exercising its duty as a regulator.

On Ofcom’s website there is an article titled “Why Independence Matters in Regulating TV and Radio”—for the purposes of the Bill, I suggest that we add “Online”. It states:

“We investigate following our published procedures which contain clear, transparent and fair processes. It’s vital that our decisions are always reached independently and impartially”.

I am sure there are few Members of the Committee who would disagree with that statement. That sentiment is supported by a recent UNESCO conference to create global guidance for online safety regulation, whose concluding statement said that

“an independent authority is better placed to act impartially in the public interest and to avoid undue influence from political or industry interests”.

As the noble Baroness, Lady Stowell, has said, that is what successive Governments have striven to do with Ofcom’s regulation of broadcast and radio. Now the Government and Parliament must succeed in doing the same by setting up this Bill to ensure absolute independence for Ofcom in regulating the digital space.

The codes of practice drawn up by Ofcom will be central to the guidance for the parameters set out by the media regulator for the tech companies, so it is essential that the regulator, when setting them up, can act independently from political interference. In my view and that of many noble Lords, Clause 39 does not provide that level of independence from political interference. No impartial observer can think that the clause as drafted allows Ofcom the independence that it needs to shape the limits of the tech platforms’ content. In my view, this is a danger to freedom of expression in our country, because it gives permission for the Secretary of State to interfere continually and persistently in Ofcom’s work.

Amendments 114 and 115 would ensure a badly needed reinforcement of the regulator’s independence. I see why the Minister would want a Secretary of State to have the right to direct the regulator, but I ask him to bear in mind that it will not always be a Minister he supports who is doing the directing. In those circumstances, surely he would prefer a Secretary of State to observe or have regard to the views on the draft codes of practice. Likewise, the endless ping-pong envisaged by Clause 39(7) and (9) allows huge political pressure and interference to be placed on the regulator. This would not be allowed in broadcast regulation, so why is it allowed for online regulation, which is already the dominant medium and can get only more dominant and more important?

Amendment 114 is crucial. Clause 39(1)(a), allowing the Minister’s direction to cover public policy, covers almost everything and is impossibly broad and vague. If the Government want an independent regulator, can the Minister explain how this power would facilitate that goal? I am unsure of how the Government will approach this issue, but I am told that they want to recognise the concerns about an overmighty Secretary of State by bringing forward their own amendment, limiting the powers of direction to specific policy areas. Can the Minister confirm that he is looking at using the same areas as in the Communications Act 2003, which are

“national security … relations with the government of a country … compliance with international obligations of the United Kingdom … the safety of the public or of public health”?

I worry about any government amendment which might go further and cover economic policy and burden to business. I understand that the Government would want to respond to the concerns that this Bill might create a burden on business and therefore could direct Ofcom to ease regulations in these areas. However, if this area is to be included, surely it will create a lobbyists’ charter. We all know how effective the big tech companies have been at lobbying the Government and slowing down the process of shaping this Bill. The Minister has only to talk to some of the Members who have helped to shape the Bill to know the determination and influence of those lobbying companies.

To allow the DCMS Secretary of State to direct Ofcom continuously to modify the codes of practice until they are no longer a burden to business would dramatically dilute the power and independence of the UK’s world-respected media regulator. Surely this is not what the people of Britain would want; the Minister should not want it either. The words “vague” and “broad” are used repeatedly by freedom of speech campaigners when looking at the powers of political interference in the Bill.

When the draft Bill came out, I was appalled by the extraordinary powers that it gave the Secretary of State to modify the content covered by “legal but harmful”, and I am grateful to the Government for responding to the Joint Committee and many other people’s concerns about this potentially authoritarian power. Clause 39 is not in the same league, but for all of us who want to ensure that Ministers do not have the power to interfere in the independence of Ofcom, I ask the Minister to accept the well-thought-through solutions represented by these amendments and supported by all Benches. I also support the request made by the noble Baroness, Lady Stowell, that Parliament should be involved in the oversight of Ofcom. I ask the Minister to respond to these widely supported amendments, either by accepting them or by tabling amendments of his own which guarantee the independence of the regulator.

My Lords, I broadly support all these amendments in spirit, since, as we have heard, they tackle excessive levels of influence that any Secretary of State is awarding themselves to shape the strategic priorities and codes of conduct of Ofcom. I will speak to Amendments 254 and 260, tabled by the noble Lord, Lord Moylan, who I am glad to see in his place. He will see in Hansard that he was about to be much missed. I cannot do him credit, but I will carry on regardless because I support his amendments.

The main difference between Amendment 254 and other similar amendments is that it requires that any guidance issued to Ofcom—under Clause 157, for example—is

“approved by resolution of each House of Parliament”

rather than by committees. However, the spirit of it, which is to remove the Secretary of State’s power to give wide-ranging guidance or instructions about Ofcom’s functions, and the concerns that we have about that, is broadly in line with everything else we have heard.

It is important to ask whether it is appropriate for our right to freedom of expression to be curtailed by secondary legislation, which cannot be amended, which has little parliamentary oversight and on which challenges are very often reduced to nothing more than rhetorical whinges in this House. That would mean that the power exercised by the Secretary of State would bypass the full democratic process.

In fact, it also weakens our capacity to hold Ofcom to account. One thing that has become apparent throughout our Committee deliberations is that—as I think the noble Baroness, Lady Stowell, indicated—Ofcom will be an uber-regulator. It is itself very powerful, as I have raised, with respect to potentially policing and controlling what UK citizens see, read and have access to online, what they are allowed to search, private messaging and so on. In some ways, I want Ofcom to have much more scrutiny and be accountable to Parliament, elected politicians and the public realm. But that is not the same as saying that Ofcom should be accountable and answerable to the Secretary of State; that would be a whole different ball game. It could be said that the Secretary of State will be elected, but we know that that is a sleight of hand.

I want more accountability and scrutiny of Ofcom by individual users of online services; we even talked about that, the other day, in relation to complaints. I want more democratic scrutiny, but the Bill does the opposite of that by allowing the Government a huge amount of executive power to shape the proposed system of online speech moderation and even influence political discourse in the public square.

I want to move on to that issue. Under the Bill, the Secretary of State will have the power to set Ofcom’s strategic priorities, direct Ofcom to modify its code of practice through secondary legislation, set criteria for platform categorisation and designate priority illegal offences. They will be able to change codes of practice for “reasons of public policy”, which is as vague a phrase as you will ever get. I fear that, frankly, that level of discretion is likely to lead to a highly politicised and—my dread—censorship-heavy approach to regulation.

The Secretary of State could come under extreme pressure to respond to each individual concerning case of digital content—whatever happens to be in the news this week—with an ever-expanding list of areas to be dealt with. I dread that this will inevitably be exploited by highly political lobbyists and organisations, who will say, “You must act on this. This is hate speech. You’ve got to do something about this”. That is a completely arbitrary way to behave.

According to the Bill, Ofcom has no choice but to comply, and that obviously leads to the dangers of politicisation. I do not think it is scaremongering to say that this is politicisation and could compromise the independence of Ofcom. The Secretary of State’s power of direction could mean that the Government are given the ability to shape the permissibility of categories of online content, based on the political mood of the day, and the political whims of a specific Secretary of State to satisfy a short-term moral panic on a particular issue.

One question for the Minister, and the Government, is: should you ever create powers that you would not want to see your political opponents exercising? The Secretary of State today will not always be the Secretary of State tomorrow; they will not always be in the same image. Awarding such overwhelming powers, and the potential politicising of policing speech, might feel comfortable for the Government today; it might be less comfortable when you look at the way that some people view, for example, the tenets of Conservatism.

In recent weeks, since a “National Conservatism” conference was held up the road, I have heard members of opposition parties describe the contents of Conservatism as “Trumpist”, “far-right” and “fascist” hate speech. I am worried—on behalf of the Government—that some of those people might end up as a Secretary of State and it could all blow up in their face, as it were, metaphorically.

In all seriousness, because I am really not interested in the fate of either the Opposition or the Government in terms of their parties, I am trying to say that it is too arbitrary. In a situation where we have such weak commitments to freedom of conscience, thought or speech in this Bill, I really do not want to give the Secretary of State the power to broaden out the targets that might be victim to it.

Finally—and I apologise to the noble Lord, Lord Moylan, who would have been much more professional, specific and hard-hitting on his amendment—from what I have heard, I hope that all the tablers of the amendments from all parties might well have got together by Report and come up with satisfactory amendments that will deal with this. I think we all agree, for once, that something needs to be done to curtail power. I look forward to supporting that later in the process.

My Lords, I rise very briefly to support the amendments in the name of the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson. Like other speakers, I put on record my support for the regulator being offered independence and Parliament having a role.

However, I want to say one very brief and minor thing about timing—I feel somewhat embarrassed after the big vision of the noble Baroness, Lady Stowell. Having had quite a lot of experience of code making over the last three years, I experienced the amount of time that the department was able to take in responding to the regulator as being a point of power, a point of lobbying, as others have said, and a point of huge distraction. For those of us who have followed the Bill for five years and as many Secretaries of State, we should be concerned that none of the amendments has quite tackled the question of time.

The idea of acting within a timeframe is not without precedent; the National Security and Investment Act 2021 is just one recent example. What was interesting about that Act was that the reason given for the Secretary of State’s powers being necessary was as a matter of national security—that is, they were okay and what we all agree should happen—but the reason for the time restriction was for business stability. I put it to the Committee that the real prospect of children and other users being harmed requires the same consideration as business stability. Without a time limit, it is possible that inaction can be used to control, or simply to fritter time away.

My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.

My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, and brings them into an office. That itself often becomes a major TV moment, where you are brought in, browbeaten and sent out again with your tail between your legs, and the Minister has instructed you to do something. That entire process takes place in the political rather than the regulatory domain.

I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are talking about the process for what we should do when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.

I also note that this is certainly not UK-specific, and it would happen in many countries with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed. He brought me into his office, sat me down, pointed to his desk and said “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He made a very clear and explicit political direction about content that was on the platform.

One big advantage of this Bill is that it has the potential to move beyond that world. It could move from individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—to changing the accountability model away from either platforms being entirely accountable themselves or platforms and others, including Ministers, somehow doing deals that will have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.

We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.

With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?

The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.

It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as they are still leave room for arm-twisting. I can imagine a future scenario in which future employees of these platforms are summoned to the Secretary of State. But now the Secretary of State would have a draft order sitting there. The draft order is Mussolini’s desk. They say to the people from the platforms, “Look, you can do what I say, or I am going to send an order to Ofcom”. That takes us back to this world in which the public are not seeing the kind of instructions being given.

I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.

My Lords, on day eight of Committee, I feel that we have all found our role. Each of us has spoken in a similar vein on a number of amendments, so I will try to be brief. As the noble Lord, Lord Allan, has spoken from his experience, I will once again reference my experience as the chief executive, for seven years, of a business regulated by Ofcom; as the chair of a regulator; and as someone who sat on the court of, arguably, the most independent of independent regulators, the Bank of England, for eight years.

I speak in support of the amendments in the name of my noble friend Lady Stowell, because, as a member of the Communications and Digital Committee, my experience, both of being regulated and as a regulator, is that independent regulators might be independent in name—they might even be independent in statute—but they exist in the political soup. It is tempting to think that they are a sort of granite island, completely immovable in the political soup, but they are more like a boat bobbing along in the turbulence of politics.

As the noble Lord, Lord Allan, has just described, they are influenced both overtly and subtly by the regulated companies themselves—I am sure we have both played that game—by politicians on all sides, and by the Government. We have played these roles a number of times in the last eight days; however, this is one of the most important groups of amendments, if we are to send the Bill back in a shape that will really make the difference that we want it to. This group of amendments challenges whether we have the right assignment of responsibility between Parliament, the regulator, government, the regulated and citizens.

It is interesting that we—every speaker so far—are all united that the Bill, as it currently stands, does not get that right. To explain why I think that, I will dwell on Amendment 114 in the name of my noble friend Lady Stowell. The amendment would remove the Secretary of State’s ability to direct Ofcom to modify a draft of the code of practice “for reasons of public policy”. It leaves open the ability to direct in the cases of terrorism, child sexual abuse, national security or public safety, but it stops the Secretary of State directing with regard to public policy. The reason I think that is so important is that, while tech companies are not wicked and evil, they have singularly failed to put internet safety, particularly child internet safety, high enough up their pecking order compared with delivering for their customers and shareholders. I do not see how a Secretary of State will be any better at that.

Arguably, the pressures on a Secretary of State are much greater than the pressures on the chief executives of tech companies. Secretaries of State will feel those pressures from the tech companies and their constituents lobbying them, and they will want to intervene and feel that they should. They will then push that bobbing boat of the independent regulator towards whichever shore they feel they need to in the moment—but that is not the way you protect people. That is not the way that we treat health and safety in the physical world. We do not say, “Well, maybe economics is more important than building a building that’s not going to fall down if we have a hurricane”. We say that we need to build safe buildings. Some 200 years ago, we were having the same debates about the physical world in this place; we were debating whether you needed to protect children working in factories, and the consequences for the economics. Well, how awful it is to say that today. That is the reality of what we are saying in the Bill now: that we are giving the Secretary of State the power to claim that the economic priority is greater than protecting children online.

I am starting to sound very emotional because at the heart of this is the suggestion that we are not taking the harms seriously enough. If we really think that we should be giving the Secretary of State the freedom to direct the regulator in such a broad way, we are diminishing the seriousness of the Bill. That is why I wholeheartedly welcome the remark from the noble Lord, Lord Stevenson, that he intends to bring this back with the full force of all of us across all sides of the Committee, if we do not hear some encouraging words from my noble friend the Minister.

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding, whose very powerful speech took us to the heart of the principles behind these amendments. I will add my voice, very briefly, to support the amendments for all the key reasons given. The regulator needs to be independent of the Secretary of State and seen to be so. That is the understandable view of the regulator itself, Ofcom; it was the view of the scrutiny committee; and it appears to be the view of all sides and all speakers in this debate. I am also very supportive of the various points made in favour of the principle of proper parliamentary scrutiny of the regulator going forward.

One of the key hopes for the Bill, which I think we all share, is that it will help set the tone for the future global conversation about the regulation of social media and other channels. The Government’s own impact assessment on the Bill details parallel laws under consideration in the EU, France, Australia, Germany and Ireland, and the noble Viscount, Lord Colville, referred to standards set by UNESCO. The standards set in the OSB at this point will therefore be a benchmark across the world. I urge the Government to set that benchmark at the highest possible level for the independence and parliamentary oversight of the regulator.

My Lords, I speak to support my noble friend Lady Stowell and the noble Lord, Lord Stevenson. I would like to share two insights: one a piece of experience from my role as a junior Minister, and one as it bears down on the Bill.

As a junior Health Minister responsible for innovation and life sciences, it was my responsibility to look after 22 arm’s-length bodies, including the MHRA—an incredibly powerful regulator, possibly as powerful and important as Ofcom is, and certainly will be under this new Bill. As the junior Minister, you are under huge pressure from civil society, from the pharma industry and from noble Lords—some of whom I see in the Chamber today—who all have extremely strong opinions about the regulation of medicines. They also have, at times, very important insights about patients and what might be able to be done if certain innovative medicines could be accelerated. The great thing about being the Life Sciences Minister is that there is nothing you can do about it whatever. Your hands are tied. The MHRA obeys science and the regulation of science and not, I am pleased to say, Ministers, because Ministers are not good people to judge the efficacy and safety of medicines.

My advice to the Minister is to embrace the Bethell principle: that it is a huge relief not to be able to interfere in the day-to-day operations of your regulator. I remember speaking at a G7 meeting of Health Ministers to one of my compadres, who expressed huge envy for the British system because he had demonstrators and political donors on his back night and day, trying to get him to fix the regulations one way or the other. That is my point about the day-to-day management and implementation of policy.

When it comes to the objectives of the regulator, the Bill maybe leaves scope for some improvement. I thought my noble friend put it extremely well: it is where Parliament needs to have a voice. We have seen that on the subject of age verification for porn—a subject I feel very strongly about—where, at the moment, Parliament is leaving it to the regulator to consult industry, users of the internet and wider civic society to determine what the thresholds for age verification should be. That is a mistake; it is not the right way round to do things. It is where Parliament should have a voice, because these are mandatory population-wide impositions. We are imposing them on the population, and that is best done by Parliament, not the regulator. It needs the heft of Parliament when it comes to imposing and enforcing those regulations. If you do not have that parliamentary heft, the regulator may be on a granite island but it would be a very lonely island without the support it needs when taking on extremely powerful vested interests. That is why Parliament needs a reach into the system when it comes to objective setting.

My Lords, it is a pleasure to follow the noble Lord, Lord Bethell, who is clearly passionate about this aspect. As the noble Baroness, Lady Harding, said, this is one of the most important groups of amendments that we have to debate on the Bill, even though we are on day eight of Committee. As she said, it is about the right assignment of responsibilities, so it is fundamental to the way that the Bill will operate.

My noble friend Lord Allan brilliantly summed up many of the arguments, and he has graphically described the problem of ministerial overreach, as did the noble Baroness, Lady Harding. We on these Benches strongly support the amendments put forward by the noble Lord, Lord Stevenson, and those put forward by the noble Baroness, Lady Stowell. Obviously, there is some difference of emphasis. They each follow the trail of the different committees of which their proposers were members, which is entirely understandable. I recall that the noble Lord, Lord Gilbert, was the hinge between the two committees—and brilliantly he did that. I very much hope that, when we come back at the next stage, if the Minister has not moved very far, we will find a way to combine those two strands. I think they are extremely close—many noble Lords have set out where we are on accountability and oversight.

Strangely, we are not trying to get out of the frying pan of the Secretary of State being overbearing and move to where we have no parliamentary oversight. Both the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson, are clearly in favour of greater oversight of Ofcom. The question is whether it is oversight of the codes and regulation or of Ofcom itself. I think we can find a way to combine those two strands. In that respect, I entirely agree with the noble Baroness, Lady Fox: it is all about making sure that we have the right kind of oversight.

I add my thanks to Carnegie UK. The noble Lord, Lord Stevenson, and the noble Baroness, Lady Stowell, set out the arguments, and we have the benefit of the noble Baroness’s letter to the Secretary of State of 30 January, which she mentioned in her speech. They have set out very clearly where speakers in this debate unanimously want to go.

The Government have suggested some compromise on Clause 39. As the noble Lord, Lord Stevenson, said, we have not seen any wording for that, but I think it is highly unlikely that that, by itself, will satisfy the House when we come to Report.

There are many amendments here which deal with the Secretary of State’s powers, but I believe that the key ones are the product of both committees, which is about the Joint Committee. If noble Lords read the Government’s response to our Joint Committee on the draft Bill, they will see that the arguments given by the Government are extremely weak. I think it was the noble Baroness, Lady Stowell, who used the phrase “democratic deficit”. That is exactly what we are not seeking: we are trying to open this out and make sure we have better oversight and accountability. That is the goal of the amendments today. We have heard from the noble Viscount, Lord Colville, about the power of lobbying by companies. Equally, we have heard about how the Secretary of State can be overbearing. That is the risk we are trying to avoid. I very much hope that the Minister sees his way to taking on board at least some of whichever set of amendments he prefers.

My Lords, the amendments concern the independence of Ofcom and the role of parliamentary scrutiny. They are therefore indeed an important group, as those things will be vital to the success of the regime that the Bill sets up. Introducing a new, ground-breaking regime means balancing the need for regulatory independence with a transparent system of checks and balances. The Bill therefore gives powers to the Secretary of State comprising a power to direct Ofcom to modify a code of practice, a power to issue a statement of strategic priorities and a power to issue non-binding guidance to the regulator.

These powers are important but not novel; they have precedent in the Communications Act 2003, which allows the Secretary of State to direct Ofcom in respect of its network and spectrum functions, and the Housing and Regeneration Act 2008, which allows the Secretary of State to make directions to the Regulator of Social Housing to amend its standards. At the same time, I agree that it is important that we have proportionate safeguards in place for the use of these powers, and I am very happy to continue to have discussions with noble Lords to make sure that we do.

Amendment 110, from the noble Lord, Lord Stevenson, seeks to introduce a lengthier process regarding parliamentary approval of codes of practice, requiring a number of additional steps before they are laid in Parliament. It proposes that each code may not come into force unless accompanied by an impact assessment covering a range of factors. Let me reassure noble Lords that Ofcom is already required to consider these factors; it is bound by the public sector equality duty under the Equality Act 2010 and the Human Rights Act 1998 and must ensure that the regime and the codes of practice are compliant with rights under the European Convention on Human Rights. It must also consult experts on matters of equality and human rights when producing its codes.

Amendment 110 also proposes that any designated Select Committee in either House has to report on each code and impact assessment before they can be made. Under the existing process, all codes must already undergo scrutiny by both Houses before coming into effect. The amendment would also introduce a new role for the devolved Administrations. Let me reassure noble Lords that the Government are working closely with them already and will continue to do so over the coming months. As set out in Schedule 5 to the Scotland Act 1998, however, telecommunications and thereby internet law and regulation is a reserved policy area, so input from the devolved Administrations may be more appropriately sought through other means.

Amendments 111, 113, 114, 115, and 117 to 120 seek to restrict or remove the ability of the Secretary of State to issue directions to Ofcom to modify draft codes of practice. Ofcom has great expertise as a regulator, as noble Lords noted in this debate, but there may be situations where a topic outside its remit needs to be reflected in a code of practice. In those situations, it is right for the Government to be able to direct Ofcom to modify a draft code. This could, for example, be to ensure that a code reflects advice from the security services, to which Ofcom does not have access. Indeed, it is particularly important that the Secretary of State be able to direct Ofcom on matters of national security and public safety, where the Government will have access to information which Ofcom will not.

I have, however, heard the concerns raised by many in your Lordships’ House, both today and on previous occasions, that these powers could allow for too much executive control. I can assure your Lordships that His Majesty’s Government are committed to protecting the regulatory independence of Ofcom, which is vital to the success of the framework. With this in mind, we have built a number of safeguards into the use of the powers, to ensure that they do not impinge on regulatory independence and are used only in limited circumstances and for the appropriate reasons.

I have heard the strong feelings expressed that this power must not unduly restrict regulatory independence, and indeed share that feeling. In July, as noble Lords noted, the Government announced our intention to make substantive changes to the power; these changes will make it clear that the power is for use only in exceptional circumstances and will replace the “public policy” wording in Clause 39 with a defined list of reasons for which a direction can be made. I am happy to reiterate that commitment today, and to say that we will be making these changes on Report when, as the noble Lord, Lord Clement-Jones, rightly said, noble Lords will be able to see the wording and interrogate it properly.

Additionally, in light of the debate we have just had today—

Can my noble friend the Minister clarify what he has just said? When he appeared in front of the Communications and Digital Committee, I think he might have been road-testing some of that language. In the specific words used, he would still have allowed the Secretary of State to direct Ofcom for economic reasons. Is that likely to remain the case? If it is, I feel it will not actually meet what I have heard is the will of the Committee.

When we publish the wording, we will rightly have an opportunity to discuss it before the debate on Report. I will be happy to discuss it with noble Lords then. On the broader points about economic policy, that is a competency of His Majesty’s Government, not an area of focus for Ofcom. If the Government had access to additional information that led them to believe that a code of practice as drafted could have a significant, disproportionate and adverse effect on the livelihoods of the British people or to the broader economy, and if it met the test for exceptional circumstances, taking action via a direction from the Secretary of State could be warranted. I will happily discuss that when my noble friend and others see the wording of the changes we will bring on Report. I am sure we will scrutinise that properly, as we should.

I was about to say that, in addition to the commitment we have already made, in the light of the debate today we will also consider whether transparency about the use of this power could be increased further, while retaining the important need for government oversight of issues that are genuinely beyond Ofcom’s remit. I am conscious that, as my noble friend Lady Stowell politely said, I did not convince her or your Lordships’ committee when I appeared before it with my honourable friend Paul Scully. I am happy to continue our discussions and I hope that we may reach some understanding on this important area.

I am sorry to interrupt, but may I clarify what my noble friend just said? I think he said that, although he is open to increasing the transparency of the procedure, he does not concede a change from a direction to a letter of guidance which Ofcom should take account of. Is he willing to consider that as well?

I am happy to continue to discuss it, and I will say a bit more about the other amendments in this group, but I am not able to say much more at this point. I will happily follow this up in discussion with my noble friend, as I know it is an issue of interest to her and other members of your Lordships’ committee.

The noble Lord, Lord Stevenson, asked about our international obligations. As noble Lords noted, the Government have recognised the importance of regulatory independence in our work with international partners, such as the Council of Europe’s declaration on the independence of regulators. That is why we are bringing forward the amendments previously announced in another place. Ensuring that powers of direction can be issued only in exceptional circumstances and for a set of reasons defined in the Bill will ensure that the operational independence of Ofcom is not put at risk. That said, we must strike a balance between parliamentary oversight and being able to act quickly where necessary.

Regarding the amendment tabled by my noble friend Lady Stowell, which calls for all codes which have been altered by a direction to go through the affirmative procedure: under the Bill as drafted, the negative procedure is used only if a direction is made to a code of practice relating to terrorism or child sexual exploitation or abuse, for reasons of national security or public safety. It is important that the parliamentary process be proportionate, particularly in cases involving national security or public safety, where a code might need to be amended quickly to protect people from harm. We therefore think that, in these cases, the negative procedure is more appropriate.

On timing, the Government are committed to ensuring that the framework is implemented quickly, and this includes ensuring that the codes of practice are in force. The threshold of exceptional circumstances for the power to direct can lead to a delay only in situations where there would otherwise be significant consequences for national security or public safety, or for the other reasons outlined today.

My noble friend Lord Moylan was not able to be here for the beginning of the debate on this group, but he is here now. Let me say a little about his Amendment 254. Under Clause 153, the Secretary of State can set out a statement of the Government’s strategic priorities in relation to matters of online safety. This power is necessary, as future technological changes are likely to shape online harms, and the Government must be able to state their strategic priorities in relation to them. My noble friend’s amendment would go beyond the existing precedent for the statement of strategic priorities in relation to telecommunications, management of the radio spectrum, and postal services outlined in the Communications Act. The Secretary of State must consult Ofcom and other appropriate persons when preparing this statement. This provides the opportunity for widespread scrutiny of a draft statement before it can be designated through a negative parliamentary procedure. We consider that the negative procedure is appropriate, in line with comparable existing arrangements.

Amendment 257 from the noble Lord, Lord Stevenson, seeks to remove the Secretary of State’s power to issue guidance to Ofcom about the exercise of its online safety functions. Issuing guidance of this kind, with appropriate safeguards, including consultation and limitations on its frequency, is an important part of future-proofing the regime. New information—for example, resulting from parliamentary scrutiny or technological developments—may require the Government to clarify the intent of the legislation.

Amendments 258 to 260 would require the guidance to be subject to the affirmative procedure in Parliament. Currently, Ofcom must be consulted, and any guidance must be laid before Parliament. The Bill does not subject the guidance to a parliamentary procedure because the guidance does not create any statutory requirements, and Ofcom is required only to have had regard to it. We think that remains the right approach.

The noble Lord, Lord Stevenson, has made clear his intention to question Clause 156, which grants the Secretary of State the power to direct Ofcom’s media literacy activity only in special circumstances. This ensures that the regulatory framework is equipped to respond to significant future threats—for example, to the health or safety of the public, or to national security. I have already set out, in relation to other amendments, why we think it is right that the Secretary of State can direct Ofcom in these circumstances.

The delegated powers in the Bill are crucial to ensuring that the regulatory regime keeps pace with changes in this area. Amendment 290 from the noble Lord, Lord Stevenson, would go beyond the existing legislative process for these powers, by potentially providing for additional committees to be, in effect, inserted into the secondary legislative process. Established committees themselves are able to decide whether to scrutinise parts of a regime in more detail, so I do not think they need a Parkinson rule to do that.

Noble Lords have expressed a common desire to see this legislation implemented as swiftly as possible, so I hope they share our wariness of any amendments which could slow that process down. The process as envisaged in this amendment is an open-ended one, which could delay implementation. Of course, however, it is important that Parliament is able to scrutinise the work of the regulator. Like most other regulators, Ofcom is accountable to Parliament on how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from Scotland, Wales and Northern Ireland must also lay a copy of the report before their respective Parliament or Assembly. Moreover, the officers of Ofcom can be required to appear before Select Committees to answer questions about its operations on an annual basis. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both the primary and secondary legislation. This will include the priority categories for harms and Ofcom’s codes of practice.

More broadly, we want to ensure that this ground-breaking legislation has the impact we intend. Ongoing parliamentary scrutiny of it will be crucial to help to ensure that. There is so much expertise in both Houses, and it has already helped to improve this legislation, through the Joint Committee on the draft Bill, the DCMS Select Committee in another place and, of course, your Lordships’ Communications and Digital Committee.

As my noble friend Lady Stowell said, we must guard against fragmentation and duplication, which we are very mindful of. Although we do not intend to legislate for a new committee—as I set out on previous occasions, including at Second Reading and before the Communications and Digital Committee—we remain happy to discuss possible mechanisms for oversight to ensure that we make best use of the expertise in both Houses of Parliament so that the Bill delivers what we want. With that, I hope that Members of the Committee will be happy to continue the discussions in this area and not press their amendments.

I am grateful to the noble Lord for his comprehensive response and for the welcome change in tone and the openness to further debate and discussions. I thank all those who spoke in the debate. The noble Baroness, Lady Harding, was right: we are getting into a routine where we know roughly where our places are and, if we have contributions to make, we make them in the right order and make them comprehensive. We did our bit quite well, but I am afraid that the Minister’s response made me a bit confused. As I said, I welcome the change of tone, the sense of engagement with some of the issues and the ability to meet to discuss ways forward in some of those areas. But he then systematically and rather depressingly shut off just about everything that I thought we were going to discuss. I may be overstating that, so I will read Hansard carefully to make sure that there are still chinks of light in his hitherto impenetrable armour. I really must stop using these metaphors—I thought that the noble Baroness, Lady Harding, had managed to get me off the hook with her question about whether we were an island of concrete rock, and about whether the boat was going to end up in the stormy sea that we were creating. I decided that I could not follow that, so I will not.

We ought to take forward and address three things, which I will briefly go through in the response. One that we did not nail down was the good point made by the noble Baroness, Lady Kidron, that we had focused on regulatory structures in the form of set bodies relating—or not relating—to parliamentary procedures and to Ministers and their operations. She pointed out that, actually, the whole system has a possible drag effect that we also need to think about. I note that good point because we probably need a bit of time to think about how that would work in the structures that come forward.

The noble Lord, Lord Allan, said that we are trying to look at the changing of the accountability model. I disagree with the word “changing” because we are not trying to change anything; we have a model that works, but the new factor that we are trying to accommodate is the intensity of interaction and, as we said, the amplification that comes from the internet. I worry that this was not being picked up enough in the Minister’s response, but we will pick it up later and see if we can get through it.

The three points I want to make are as follows. Following the line taken by the noble Baroness, Lady Stowell, one point is on trying to find a proper balance between the independence of the regulator; the Secretary of State’s right, as an elected leader of this aspect of the Government, to make recommendations and proposals to that regulator on how the system can be better; and Parliament’s ability to find a place in that structure, which is still eluding us a little, so we will need to spend more time on it. There is enough there to be reassured that we will find a way of balancing the independence of the regulator and the role of the Secretary of State. It does not need as many mentions in the legislation as it currently has. There is clearly a need for the Secretary of State to be able to issue directions in cases of national security et cetera—but it is the “et cetera” that I worry about: what are these instances? Until they are nailed down and in the Bill, there has to be a question about that.

How Parliament relates to the process is not yet defined. There is a willingness to debate about that. We do not have to go down any of the models that have been discussed today, but there are interesting ways of trying to do that. I think there is a general sense around the Committee that we want Parliament to have a bigger role complementary to what is in the Bill but different from it in a way that will be helpful and supportive of what we are trying to do.

The noble Lord, Lord Bethell, spoke very well about his experiences as a Minister working with a strong regulator. We should listen to him—as we must do on many other issues, of course. But on this one he makes the point that there is a strength to be gained by isolating the ways in which Parliament, Ministers and the regulator each operate. I hope very much we will get there. Are we a rock or are we a boat? I am not sure, but we want to have a good journey and I would like to look forward to having that in future. I beg leave to withdraw the amendment.

Amendment 110 withdrawn.

Amendments 111 and 112 not moved.

Clause 38 agreed.

Clause 39: Secretary of State’s powers of direction

Amendments 113 to 119 not moved.

Clause 39 agreed.

Clause 40: Procedure for issuing codes of practice following direction under section 39

Amendment 120 not moved.

Clause 40 agreed.

Clauses 41 and 42 agreed.

Clause 43: Minor amendments of codes of practice

Amendment 121 not moved.

Clause 43 agreed.

Clause 44: Relationship between duties and codes of practice

Amendments 122 to 122ZB not moved.

Clause 44 agreed.

Clause 45 agreed.

Clause 46: Duties and the first codes of practice

Amendment 122ZC not moved.

Clause 46 agreed.

Clause 47: OFCOM’s guidance about certain duties in Part 3

Amendment 122A

Moved by

122A: Clause 47, page 46, line 10, after “29” insert “, except the duty set out in subsection (8A) of those sections”

Member’s explanatory statement

This amendment ensures that OFCOM need not produce guidance about the new duties in clauses 19 and 29 to supply records of risk assessments to OFCOM.

Amendment 122A agreed.

Clause 47, as amended, agreed.

Clause 48: OFCOM’s guidance: content that is harmful to children and user empowerment

Amendment 123 not moved.

Amendment 123A

Moved by

123A: Clause 48, page 46, line 22, at end insert—

“(c) pornographic material that must be put behind age assurance as set out in section (Ofcom’s guidance about age assurance).”

Member’s explanatory statement

This amendment ensures parity of age assurance between pornographic material on Part 3 and Part 5 services.

My Lords, it is a privilege to introduce Amendments 123A, 142, 161 and 184 in my name and those of the noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. These amendments represent the very best of your Lordships’ House and, indeed, the very best of Parliament and the third sector because they represent an extraordinary effort to reach consensus between colleagues across the House including both opposition parties, many of the Government’s own Benches, a 40-plus group of Back-Bench Conservatives and the Opposition Front Bench in the other place. Importantly, they also enjoy the support of the commercial age check sector and a vast array of children’s charities and, in that regard, I must mention the work of Barnardo’s, CEASE and 5Rights, which have really led the charge.

I will spend the bulk of my time setting out in detail the amendments themselves, and I will leave my co-signatories and others to make the arguments for them. Before I do, I once again acknowledge the work of the noble Baroness, Lady Benjamin, who has been fighting this fight for many years, and the noble Baroness, Lady Harding, whose characteristic pragmatism was midwife to the drafting process. I also acknowledge the time spent talking about this issue with the Secretary of State, the noble Lord the Minister and officials at DSIT. I thank them for their time and their level of engagement.

Let me first say a few words about age assurance and age verification. Age assurance is the collective term for all forms and levels of age verification, which establishes an exact age, and age estimation, which gives an approximate or probable age. Age assurance is not a technology; it is any system that seeks to achieve a level of certainty about the age or age range of a person. Some services with restricted products and services have no choice but to have the very highest level of assurance or certainty—others less so.

To be clear at the outset, checking someone’s age, whether by verification or estimation, is not the same as establishing identity. While it is absolutely the case that you can establish age as a subset of establishing someone’s identity, the reverse is not necessarily true. Checking someone’s age does not need to establish their identity.

Age assurance strategies are multifaceted. As the ICO’s guidance in the age-appropriate design code explains, online services can deploy a range of methods to achieve the necessary level of certainty about age or age range. For example, self-verification, parental authentication, AI estimation and/or the use of passports and other hard identifiers may all play a role in a single age assurance strategy, or any one of them may be a mechanism in itself in other circumstances. This means that the service must consider its product and make sure that the level of age assurance meets the level of risk.

Since we first started debating these issues in the context of the Digital Economy Act 2017, the technology has been transformed. Today, age assurance might just as effectively be achieved by assessing the fluidity of movement of a child dancing in a virtual reality game as by collecting their passport. The former is over 94% accurate within five seconds and is specific to that particular child, while a passport may be absolute but less reliable in associating the check with a particular child. So, in the specific context of that dancing child, it is likely that the former gives the greater assurance. When a service’s risk profile requires absolute or near absolute certainty—for example, any of the risks that are considered primary priority harms, including, but not limited to, pornography—having the highest possible level of assurance must be a precondition of access.

Age assurance can also be used to ensure that children who are old enough to use a service have an age-appropriate experience. This might mean disabling high-risk features such as hosting, livestreaming or private messaging for younger children, or targeting child users or certain age groups with additional safety, privacy and well-being interventions and information. These amendments, which I will get to shortly, are designed to ensure both. To achieve the levels of certainty and privacy which are widely and rightly demanded, the Bill must both reflect the current state of play and anticipate nascent and emerging technology that will soon be considered standard.

That was a long explanation, for which I apologise, but I hope it makes it clear that there is no single approach, but, rather, a need to clearly dictate a high bar of certainty for high-risk services. A mixed economy of approaches, all geared towards providing good outcomes for children, is what we should be promoting. Today we have the technology, the political will and the legislative mechanism to make good on our adult responsibilities to protect children online. While age assurance is eminently achievable, those responsible for implementing it and, even more importantly, those subject to it need clarity on standards; that is to say, rules of the road. In an era when data is a global currency, services have shown themselves unable to resist the temptation to repurpose information gleaned about the age of their users, or to facilitate children’s access to industrial amounts of harmful material for commercial gain. As with so many of tech’s practices, this has eroded trust and heightened the need for absolute clarity on how services build their age-assurance systems and what they do—and do not do—with the information they gather, and the efficacy and security of the judgments they make.

Amendment 123A simply underlines the point made frequently in Committee by the noble Baroness, Lady Ritchie of Downpatrick, that the Bill should make it clear that pornography should not be judged by where it is found but by the nature of the material itself. It would allow Ofcom to provide guidance on pornographic material that should be behind an age gate, either in Part 3 or Part 5.

Amendment 142 seeks to insert a new clause setting out matters that Ofcom must reflect in its guidance for effective age assurance; these are the rules of the road. Age assurance must be secure and maintain the highest levels of privacy; this is paramount. I do not believe I need to give examples of the numerous data leaks but I note the excessive data harvesting undertaken by some of the major platforms. Age assurance must not be an excuse to collect users’ personal and sensitive information unnecessarily, and it should not be sold, stored or used for other purposes, such as advertising, or offered to third parties.

Age assurance must be proportionate to the risk, as per the results of the child risk assessment, and let me say clearly that proportionality is not a route to allow a little bit of porn or a medium amount of self-harm, or indeed a lot of both, to a small number of children. In the proposed new clause, proportionality means that if a service is high-risk, it must have the highest levels of age assurance. Equally, if a service is low-risk or no-risk, it may be that no age assurance is necessary, or it should be unobtrusive in order to be proportionate. Age-assurance systems must provide mechanisms to challenge or change decisions to ensure that everyone can have confidence in their use, and that they do not keep individuals—adults or children—out of spaces they have the right to be in. It must be inclusive and accessible so that children with specific accessibility needs are considered at the point of its design, and it must provide meaningful information so that users can understand the mode of operation. I note that the point about accessibility is of specific concern to the 5Rights young advisers. Systems must be effective. It sounds foolish to say so, but look at where we are now, when law in the US, Europe, the UK and beyond stipulates age restrictions and they are ignored to the tune of tens of millions of children.

Age assurance is not to rely solely on the user to provide information; a tick box confirming “I am 18” is not sufficient for any service that carries a modicum of risk. It must be compatible with the following laws: the Data Protection Act, the Human Rights Act, the Equality Act and the UNCRC. It must have regard to the risks and opportunities of interoperable age assurance, which, in the future, will see these systems seamlessly integrated into our services, just as opening your phone with your face, or using two-factor authentication when transferring funds, are already normalised. In preparing the guidance, Ofcom must consult the Information Commissioner and other persons with relevant technological expertise and an understanding of child development.

On that point, I am in full support of the proposal from the noble Lord, Lord Allan, to require Ofcom to produce regular reports on age-assurance technology, and see his amendment as a necessary companion piece to these amendments. Importantly, the amendment stipulates that the guidance should come forward in six months and that all systems of age assurance, whether based on estimation or verification, whether operated in-house or by third-party providers, and all technologies must adhere to the same principles. It allows Ofcom to point in its guidance to technical standards, which I know the ISO and the IEEE are currently drafting with this very set of principles in mind.

Amendment 161, which I promise I will get through a little more quickly, simply sets out the need for any regulated service to have an appropriate level of confidence in the age or age range of child users relative to risk. It makes clear under what circumstances an age-assurance strategy is required and that any methodology is permitted provided it is adequate to the risk inherent in the service and meets Ofcom’s guidance, which I have already spoken to. Paragraph 3 of the proposed new schedule specifies that the highest standard of age assurance is required for pornographic services covered by Part 5; that is, they must confirm beyond reasonable doubt that the user is not a child. It also makes provision for auditing systems that age-check children. Paragraph 4 deals expressly with pornography accessed via Part 3 services and, crucially, it requires the same high bar of age assurance to access pornography.

I pause for a moment to underline the fact that the impact on children from pornography, which I know other noble Lords will talk to, is not lessened by the route by which they access it. Arguably, pornography that a child sees in the context of a Part 3 service of news, chatter and shopping is normalised by that context and, therefore, worse. So while we are clear that a Part 3 service must put material that meets the definition of porn in Clause 70(2) behind an age gate, we are not, as some would suggest, age-gating the internet.

Paragraph 5 of the proposed new schedule makes it clear that a company must consider all parts of the service separately; for example, those that are high-risk may require a higher level of age assurance than those that are not. Paragraph 6 makes it clear that existing users, as well as new users, should be given the benefit of the schedule. Paragraph 7 refers to the definition, and Paragraph 8 is a commencement clause that requires this to come into effect within 12 months of the Act receiving Royal Assent, a subject to which I will return in a moment.

Amendments 142 and 161, together and separately, give the services, Ofcom and children the very best chance of introducing effective, privacy-preserving age verification and estimation. There will be no more avoiding, no more excuses, no more using age checking as a data point for commercial purposes. While the Bill requires age assurance under certain circumstances, age checking as a concept is not brought in by this Bill. It is already widely demanded by US, EU and UK laws, but it is poorly done and largely unregulated, so we continue to see children in their millions accessing platforms they are too young to be on and children who are 13, but do not yet have adult capacity, being offered services designed for adults which do not account for their vulnerabilities.

That brings me to the idea of a commencement clause in both amendments. The failure to implement Part 3 of the DEA means there is simply no trust left in the community that the Government will do as they say or even that the Online Safety Act will do as it says. That is not helped by the repealing of Part 3 last week, hidden in a group of government amendments on devolution. Putting a time limit in the Bill will, if history repeats itself, allow campaigners to go to the courts.

I am encouraged by indications that the Government will bring forward their own amendments and by their willingness to put some of this in the Bill; however, I must express a certain level of frustration at the tendency to reject what we have written in favour of something that clearly does less. I struggle to see why we need to argue that age assurance systems should be secure, that they should not use data for other purposes, that users should have a way of challenging age assurance decisions, or that they should be inclusive or accessible or take account of a child’s need to access certain information. These are not aspirational; they are a critical intervention needed to make age assurance workable. They have had a great deal of expert input over many years, so much so that they have been adopted voluntarily by the commercial age check sector. More importantly, without a transparent and trusted system of age assurance, all the provisions in the Bill aimed at ensuring that children have heightened protections will fall down.

The bereaved parents of Olly, Molly, Breck, Frankie and Sophie have come to your Lordships’ House to urge noble Lords to back these amendments. As they said when meeting Minister Scully, each of them had parental controls, each of them reported problems to companies, schools and/or the police—and still their children are dead. We need this regime for self-harm and pro-suicide material as much as we need it for pornography. If it were not against parliamentary rules, I would get down on my knees and beg the Minister to go back to the department and say that these amendments must be passed with the full force of their meaning. This is robust, practical and much needed to make the Bill acceptable to adults and safe for children.

Amendment 184, also in my name, which seeks to establish the age of porn performers, will be spoken to by others at greater length, but I take the opportunity to tell your Lordships’ House that this is already the case in the United States and it works very well. I suggest we follow suit. Finally, this group should be seen as a companion piece to the harms schedule put forward by the same group of noble Lords, which has the full support of all those groups and people I mentioned at the outset. The scale and range of expertise that has gone into this package of amendments is astonishing and, between them, they would go a very long way to ensure that companies do not profit from purveying harms to children. I beg to move.

My Lords, it is a tremendous honour to speak after the noble Baroness, Lady Kidron. I think it fair to say that we would just not be here today if not for the advocacy she has performed on this issue and on the parallel issue of harms, over a great many years. I say to the Minister and to any in the Chamber who are thinking about being in the regulatory space in the years to come that they are going to have the noble Baroness on their back on these issues and it is very well worth listening to her words.

I echo the thanks and tributes the noble Baroness, Lady Kidron, has already paid, but I also want to single out the Minister and thank him for the leadership he has shown on these issues. I know he is very passionate about transforming our relationship in the digital world and the role the Bill can play, and we all recognise his commitment to making sure that the Bill gets over the line. I also thank those from another place who have passed us the Bill and are now closely watching our proceedings in this Chamber. We know that the Bill will be going back there and it is worth bearing in mind that these provisions have a lot of scrutiny and interest from Members of Parliament.

I am extremely concerned that for all the Bill’s strengths—and it has a great many—the measures on age verification are ill-defined and require tightening up in five particular ways. As the noble Baroness, Lady Kidron, alluded to, I shall talk a little bit about the rationale for the amendments in my name and those of the noble Lord, Lord Stevenson, the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Kidron, and also speak in support of amendments in the same group from the noble Baronesses, Lady Ritchie and Lady Benjamin, and the noble Lord, Lord Allan.

To summarise, I am concerned about a mindset that is in awe of, and deeply worried about, a colossal tsunami of judicial reviews coming our way as a result of measures to put pornography behind age verification, and that assumes the route out of that is to go into the battlefield with a very loose set of arrangements defined in the Bill and to perform regulation by consultation. Consultations are invaluable when it comes to implementation but are not appropriate for setting objectives. I fear that too often in the Bill at the moment, it is the objectives that are going out to the public, to vested interests and to the industry to consult on, and that is why the amendment seeks to put some of those objectives in the Bill. We want to give a clear definition for age verification, to put in a clear timetable and to ensure that those consultations, which will be contested, can be clear and deliver clear value.

I want to talk about our effective age-assurance schedule; the introduction of a threshold of “beyond reasonable doubt”; the introduction of independent auditing of the performance of age-verification measures; and our wish to erase proportionality provisions for websites that carry pornographic content, to introduce a clear timetable and to protect underage performers. At present, the Bill gives the example of age verification as just one possible measure for protecting children from accessing pornography. However, nowhere in the Bill is a clear definition or standard set out that age verification should meet. The amendments seek to address that.

At the moment the Bill defines age assurance as

“measures designed to estimate or verify the age or age-range of users of a service”.

That is just far too vague. There is no output-based performance standard. It leaves regulated services free to apply age-verification systems that at present have no clear oversight or quality control. We already know that some major platforms have a public position whereby, apparently, they welcome child safety measures, but we also know that a great many pornography operators will resist implementing age verification to gain a competitive advantage or because they believe it will be detrimental to their business model. Leaving this wide-open gap for them to negotiate the operational efficiency of their age-verification measures is a big mistake.

That is why we have tabled Amendment 161, which would introduce an effective age-assurance schedule. That is an essential building block for the other amendments in this group, including the important Amendment 142. The effective age-assurance schedule would set out in the Bill the requirement for age checking for pornographic content to be of the absolute strongest kind. The amendment gives an indication of where the benchmark should be:

“‘age verification’ … beyond reasonable doubt”.

The amendment also sets out the underlying principles of age-verification regimes: that they must be independently audited, effective and privacy-preserving. The amendment also includes a commencement date to ensure that other provisions for age-checking porn can happen as quickly as possible after the Bill receives Royal Assent.

We all agree that verifying age should be based not on any particular technology but on an outcome threshold. That is why we have pushed so hard for the “beyond reasonable doubt” threshold. It is an effort to set a clear and high bar for access to pornography. If that does not happen, I fear that, under the cover of proportionality, Ofcom will accept that Part 3 services, the regular websites, will need only to apply estimation techniques rather than demanding an increased level of assurance to ensure that minors cannot access pornography wherever it is found.

We have had feedback from the Bill team that the phrase “beyond reasonable doubt” is more usually found in criminal proceedings than civil legislation. I am very open to a discussion about whether there is a better or alternative phrase. If the Minister would like to address that point from the Dispatch Box, that would be very welcome.

On independent auditing, a number of noble Lords in this Committee have noted their concerns about internet companies marking their own homework. My noble friend Lady Wyld mentioned that and made a comparison with her own daughter’s homework on an earlier day in Committee.

The fear is that many of the large pornography companies are establishing their own age-verification companies. I admire their enthusiasm but doubt their intentions. It is incumbent on the Government to ensure that the tools they produce and implement meet the requirements of the legislation. For Ofcom to verify the effectiveness of a particular app or piece of software is neither intrusive nor a business restraint; it is simply about giving confidence to parents that the technology is robust and trustworthy. Therefore, we would look for this aspect of the amendment to be built into any age-verification system.

We need to be very clear about what we mean by “proportionate”. We need to make it crystal clear that any pornographic content that meets the Bill’s definition of pornography must always require verification at the outcome standard—we suggest “beyond reasonable doubt”. The Gambling Commission does not apply any proportionality test to age restrictions, and nor do Soho nightclubs or the sellers of pornographic magazines. We have made it clear that the worst content should always meet this bar.

As drafted, the Bill creates a substantial loophole for social media, which my noble friend Lady Harding alluded to. While I appreciate the Minister’s attempts to answer these questions during previous debates, it would be extremely helpful if he could address this point in his comments.

The timetable for these measures is a very significant issue. The noble Baroness, Lady Kidron, made this point extremely well. The Bill leaves the timetable for implementation completely open-ended. There is no clear road map from Ofcom for implementation and there is the opportunity for repeated delays. These measures have an enormous amount of public support and have the attention of both Houses. It is not reasonable to put forward legislation with an uncapped timetable attached.

Age verification is already in place in other countries. In the UK, video-on-demand platforms are already subject to age verification. The pornography industry was already preparing to implement AV in 2019 and therefore is ready to start implementing the changes quickly after measures are enacted. When the regulator and the legislature moved in Louisiana, the pornographic industry was able to bring in an implementation programme extremely quickly indeed.

On this issue, the Minister’s words on timing at Second Reading did not provide any hard dates, sequences or an implementation programme. The Minister said that the Government intend

“to have the regime operational as soon as possible after Royal Assent”.—[Official Report, 1/2/23; col. 774.]

As with Part 3 of the Digital Economy Act, which did not have a statutory deadline, there will always be a tendency to let the best be the enemy of the good, or to fear some kind of backlash from the industry that might delay commencement. That is why Amendment 142, in my name and those of the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford, would direct Ofcom to prepare and issue a code of practice within six months of Royal Assent.

The issue of age verification and consent for performers is one I feel particularly strongly about. It is absolutely unconscionable that we do not have age verification and consent checks for anyone featured in pornographic content. We heard extremely powerful testimony at Second Reading from those who found that videos of themselves when they were underage were repeatedly featured on pornographic channels. It is impossible for them to take down that material or to reverse any decision they made to take part in one of these productions. It is absolutely our responsibility to change that with this Bill.

This was addressed in the House of Commons by Amendment 33 from Dame Diana Johnson. At that time, the former Secretary of State Jeremy Wright claimed that the proposals were unworkable. I do not accept those points. There are many industries where the publishers establish the provenance of material, intellectual property ownership and the identity of participants. The music, film, gaming and software industries all provide good examples of such practice.

Many pornography sites already have measures in place to achieve these standards; for instance, OnlyFans, one of the largest players, has taken steps in that direction. The pornography industry, often the originator of technology innovation, can surely find mechanisms to address this point—that is my view, and that of over 80% of the UK public. I very much hope that my noble friend the Minister will address this point, which is in Amendment 184, based on the Commons amendment from Dame Diana Johnson.

The measures in this group are not meant to stifle innovation or to hold back the industry—quite the opposite. My noble friend Lady Harding alluded to the Industrial Revolution; taking children out of the pits led to a great investment in, and the growth of, the coal mining industry. Setting clear tracks for progress and putting in place humane provisions create the conditions under which industries can flourish. I fear that, if we do not get this one right, we will be tripping over ourselves; the pornographic industry will become grit in the gears of industry for years to come. By being clearer and more emphatic in these measures, the Bill can be an agent for innovation and encourage a great flourishing of these very important industries.

My Lords, I support everything that was said by the intrepid noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. I will speak to Amendment 185, which is in my name and is supported by the noble Lord, Lord Farmer. My amendment seeks to bring the regime for online pornography content in line with what exists offline.

The Video Recordings Act 1984 makes it a criminal offence to have prohibited content offline or to supply any unclassified work. Under that Act, the BBFC will not classify any pornographic content that is illegal or material that is potentially harmful. That includes material that depicts or promotes child sex abuse, incest, trafficking, torture and harmful sexual acts. This content would not be considered R18, and so would be prohibited for DVD and Blu-ray. This also applies, under the Communications Act 2003, to a wide range of services that are regulated by Ofcom, from large providers such as ITVX or Disney+, to smaller providers including those that produce or provide pornographic content.

However, in the wild west of the online world, there is no equivalent regulation. Online pornography so far has been left to evolve without boundaries or limitations. Devastatingly, this has had a disastrous impact on child protection. Content that would be prohibited offline is widely available on mainstream pornographic websites. This includes material that promotes violent sexual activity, including strangulation; pornography that depicts incest, including that between father and daughters or brothers and sisters; and content that depicts sexual activity with adult actors made to look like children. This content uses petite, young-looking adult performers, who are made to look underage through props such as stuffed toys, lollipops and children’s clothing. This blurring of the depiction of sexual activity with adult actors who are pretending to be underage makes it so much harder to spot illegal child sex abuse material.

According to research by Dr Vera-Gray and Professor McGlynn, incest pornography is rife. Online, all of this can be accessed at the click of a button; offline, it would not be sold in sex shops. Surely this Bill should bring an end to such disparities. This content is extremely harmful: promoting violence against girls and women, sexualising children and driving the demand for real child sex abuse material, which of course is illegal.

Depictions of sexual activity with the title “teen” are particularly violent. A study analysing the content of the three most accessed pornographic websites in the UK found that the three most common words in videos containing exploitation were “schoolgirl”, “girl” and “teen”. It is clear that underage sexual activity is implied. How have we as a society arrived at a point where one of the most commonly consumed pornographic genres is sexual violence directed at children?

Our security services can confirm this too. Retired Chief Constable Simon Bailey, the former child protection lead at the National Police Chiefs’ Council, told the Independent Inquiry into Child Sex Abuse that the availability of pornography was

“creating a group of men who will look at pornography”

so much that they reach

“the point where they are simply getting no sexual stimulation from it … so the next click is child abuse imagery”.

We know that the way pornography affects the brain means that users need more and more extreme content to fulfil themselves. It is like a drug. Pornography sites know this and exploit it. They design their sites to keep users for as long as possible, so as to increase exposure to adverts and therefore their revenue. They do this by presenting a user with ever-more extreme content. In 2021, Dr Vera-Gray and Professor McGlynn found that one in every eight titles advertised to a new user described acts of sexual violence.

I recently hosted a screening of the harrowing documentary “Barely Legal” here in the House of Lords. The documentary demonstrated just how far the pornography industry will go to make a profit, using extremely young-looking adult actors in content that suggests sexual activity with underage girls. Believe it or not, the pornography industry is worth much more than Hollywood; it makes thousands and thousands of dollars per second. Its quest for money comes at the expense of child protection and of society as a whole. This cannot be allowed to continue without regulation. Enough is enough.

Interviews with offenders who view illegal child sex abuse material in the UK indicate that most had not intentionally sought out child sex abuse materials. Nine out of 10 offenders said that they first encountered child sex abuse material through online pop-ups and linked material while looking at pornography sites.

I visited Rye Hill prison in Rugby, which houses over 600 sex offenders. Many said that they were affected by viewing porn, with devastating, life-changing outcomes. The largest ever survey of offenders who watch child sex abuse material online found significant evidence that those who watch illegal material are at high risk of going on to contact or abuse a child directly. Almost half said that they sought direct contact with children through online platforms after viewing child sexual abuse material.

This is an urgent and immediate child protection issue affecting our children. These concerns were shared earlier this year by the Children’s Commissioner for England, whose research found that 79% of children had encountered violent pornography

“depicting … degrading or pain-inducing sex acts”

before they reached the age of 18. The impact that this is having on our children is immeasurable.

I declare an interest as vice-president of the children’s charity, Barnardo’s. It has shared with many of us how its front-line services are working every day with children who have accessed pornography. It supports children who have taken part in acts that they have seen in pornographic videos, despite feeling uncomfortable and scared. Children see these acts as expected parts of relationships. I will never forget being told about a 10 year-old boy who said to a four year-old girl, “I’m going to rape you and you’re going to like it”. This is why I have campaigned to protect children from pornographic content for over a decade. We are creating a conveyor belt of child sex abusers who will inflict pain and suffering on others.

That is why I am supportive of the package of amendments tabled by the noble Baroness, Lady Kidron—for whom I have the highest regard—to ensure, among other vitally important things, that children are protected from accessing pornographic content. I believe that pornography is a gateway to so many other harms that are included in this Bill. Given the distinct proven harm of pornography, only the highest level of age verification is acceptable, where the age of the user is proved to be 18 or above, beyond reasonable doubt.

I am supportive of the amendment tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, which would place a duty on pornography companies to verify the age and consent of all performers in pornographic content. Pornography companies currently moderate user-uploaded pornographic content at great volume and speed, meaning that it is almost impossible for the moderators to spot content in which performers are underage or do not consent. Again, we cannot allow profit to come before protection, so this duty should be applied.

Without this package of amendments working in unison, the public health emergency of pornography—this devastating plague—will be allowed to create generations of children who are experiencing harm and generations of adults who create harm in their intimate relationships.

To sum up, I support a clearer definition of age assurance and age verification for pornography, and a six-month implementation deadline—we all know what happened with the repeated delays in implementing Part 3 of the Digital Economy Act 2017. We need performer age checks and quicker enforcement, without the need to go to court, and, most of all, to cover all porn with the same regulation, whether on social media or on dedicated sites. I urge the Government to accept all these amendments. I look forward to receiving the Minister’s assurances that this regulation of online pornographic content will be included within the scope of the Online Safety Bill. We need to show our children that we truly care.

My Lords, it is such a privilege to follow the noble Baroness, Lady Benjamin. I pay tribute to her years of campaigning on this issue and the passion with which she spoke today. It is also a privilege to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, in supporting all the amendments in this group. They are vital to this Bill, as all sides of this Committee agree. They all have my full support.

When I was a child, my grandparents’ home, like most homes, was heated by a coal fire. One of the most vital pieces of furniture in any house where there were children in those days was the fireguard. It was there to prevent children getting too near to the flame and the smoke, either by accident or by design. It needed to be robust, well secured and always in position, to prevent serious physical harm. You might have had to cut corners on various pieces of equipment for your house, but no sensible family would live without the best possible fireguard they could find.

We lack any kind of fireguard at present and the Bill currently proposes an inadequate fireguard for children. A really important point to grasp on this group of amendments is that children cannot be afforded the protections that the Bill gives them unless they are identified as children. Without that identification, the other protections fail. That is why age assurance is so foundational to the safety duties and mechanisms in the Bill. Surely, I hope, the Minister will acknowledge both that we have a problem and that the present proposals offer limited protection. We have a faulty fireguard.

These are some of the consequences. Three out of five 11 to 13 year-olds have unintentionally viewed pornography online. That is most of them. Four out of five 12 to 15 year-olds say they have had a potentially harmful experience online. That is almost universal. Children as young as seven are accessing pornographic content and three out of five eight to 11 year-olds—you might want to picture a nine year-old you know—have a social media profile, when they should not access those sites before the age of 13. That profile enables them to view adult content. The nation’s children are too close to the fire and are being harmed.

There is much confusion about what age assurance is. As the noble Baroness, Lady Kidron, has said, put simply it is the ability to estimate or verify an individual’s age. There are many different types of age assurance, from facial recognition to age verification, which all require different levels of information and can give varying levels of assurance. At its core, age assurance is a tool which allows services to offer age-appropriate experiences to their users. The principle is important, as what might be appropriate for a 16 year-old might be inappropriate for a 13 year-old. That age assurance is absolutely necessary to give children the protections they deserve.

Ofcom’s research shows that more than seven out of 10 parents of children aged 13 to 17 were concerned about their children seeing age-inappropriate content or seeing adult or sexual content online. Every group I have spoken to about the Bill in recent months has shared this concern. Age assurance would enable services to create age-appropriate experiences for children online and can help prevent children’s exposure to this content. The best possible fireguard would be in place.

Different levels of age assurance are appropriate in different circumstances. Amendments 161 and 142 establish that services which use age assurance must do so in line with the basic rules of the road. They set out that age assurance must be proportionate to the level of risk of a service. For high-risk services, such as pornography, sites must establish the age of their users beyond reasonable doubt. Equally, a service which poses no risk may not need to use age assurance or may use a less robust form of age assurance to engage with children in an age-appropriate manner—for example, serving them the terms and conditions in a video format.

As has been said, age assurance must be privacy-preserving. It must not be used as an excuse for services to use the most intrusive technology for data-extractive purposes. These are such common-sense amendments, but vital. They will ensure that children are prevented from accessing the most high-risk sites, enable services to serve their users age-appropriate experiences, and ensure that age assurance is not used inappropriately in a way that contravenes a user’s right to privacy.

As has also been said, there is massive support for this more robust fireguard in the country at large, across this House and, I believe, in the other place. I have not yet been able to understand, or begin to understand, the Government’s reasons for not providing the best protection for our children, given the aim of the Bill. Better safeguards are technically possible and eminently achievable. I would be grateful if the Minister could attempt to explain what exactly he and the Government intend to do, given the arguments put forward today and the ongoing risks to children if these amendments are not adopted.

My Lords, it is a pleasure to follow the right reverend Prelate the Bishop of Oxford. He used an interesting analogy of the fireguard; what we want in this legislation is a strong fireguard to protect children.

Amendments 183ZA and 306 are in my name, but Amendment 306 also has the name of the noble Lord, Lord Morrow, on it. I want to speak in support of the general principles raised by the amendments in this group, which deal with five specific areas, namely: the definition of pornography; age verification; the consent of those participating in pornographic content; ensuring that content which is prohibited offline is also prohibited online; and the commencement of age verification. I will deal with each of these broad topics in turn, recognising that we have already dealt with many of the issues raised in this group during Committee.

As your Lordships are aware, the fight for age verification has been a long one. I will not relive that history but I remind the Committee that when the Government announced in 2019 that they would not implement age verification, the Minister said:

“I believe we can protect children better and more comprehensively through the online harms agenda”.—[Official Report, Commons, 17/10/19; col. 453.]

Four years later, the only definition for pornography in the Bill is found in Clause 70(2). It defines pornographic content as

“produced solely or principally for the purpose of sexual arousal”.

I remain to be convinced that this definition is more comprehensive than that in the Digital Economy Act 2017.

Amendment 183ZA is a shortened version of the 2017 definition. I know that the Digital Economy Act is out of vogue but it behoves us to have a debate about the definition, since what will be considered pornography is paramount. If we get that wrong, age verification will be meaningless. Everything else about the protections we want to put in place relies on a common understanding of when age verification will be required. Put simply, we need to know what it is we are subjecting to age verification and it needs to be clear. The Minister stated at Second Reading that he believed the current definition is adequate. He suggested that it ensured alignment across different pieces of legislation and other regulatory frameworks. In reviewing other legislation, the only clear thing is this: there is no standard definition of pornography across the legislative framework.

For example, Section 63 of the Criminal Justice and Immigration Act 2008 uses the definition in the Bill, but it requires a further test to be applied: meeting the definition of “extreme” material. Section 368E of the Communications Act 2003 regulates online video on demand services. That definition uses the objective tests of “prohibited material”, meaning material too extreme to be classified by the British Board of Film Classification, and “specially restricted material”, covering R18 material, while also using a subjective test that covers material that

“might impair the physical, mental or moral development”

of under-18s.

This is wholly different from the definition in Clause 70(2). The Bill’s definition is therefore not somehow aligned with other legislation. It primarily examines intention: why was a particular piece of content created? This is a subjective test that raises the question of whether content providers could argue that the intent of the material is not sexual arousal but, for example, art or education, and therefore that it falls outside the scope of the Bill. Subjectivity must be removed as far as possible. Amendment 183ZA proposes adding an objective test: would the content receive, or has it been given, an R18 or 18 certificate, or would it be considered too extreme for either? The subjective test remains, but the amendment makes it clear that the intention is for the Bill to cover material which, were it physical media, would receive a classification from the British Board of Film Classification that puts it firmly in the adult category. This definition is in line with offline platforms such as cinemas and DVDs and is broadly consistent with the definition already in place for other online platforms, such as video on demand services. Ensuring parity of regulation across platforms should be a key principle.

The Government have stated many times that what is illegal offline should be illegal online. It is illegal for a person to supply age-restricted physical media to someone below the age set by the British Board of Film Classification. This is exactly the same principle the Bill seeks to apply online by using age verification. I know your Lordships will agree that parents will expect the same standards that apply offline to apply online. Content deemed unsuitable for under-18s offline is unsuitable for them online. That is what the public expect this Bill to deliver, and my Amendment 183ZA seeks to do just that. Can the Minister be sure that the current definition will cover all material—18, R18 and unclassified—that would be age-gated for children offline?

That brings me to Amendment 185 in the name of the noble Baroness, Lady Benjamin, which also seeks parity between offline and online content by referencing the video on demand services standards. There is no barrier to finding material online which is illegal or so extreme that it would not receive a BBFC classification offline. Online, such material cannot be included in video on demand services, many of which are pornography providers. A couple of weeks ago, the noble Baroness, Lady Benjamin, hosted a film for us all to watch, and I must say I found it a deeply disturbing illustration of what can actually happen. In her contribution this afternoon, the noble Baroness referred to much of what appeared that day. It was most instructive, and it made clear to me that such material should be banned both online and offline.

It seems absurd that if this Bill is passed as it stands, online content delivered by user-to-user services or commercial pornography sites will be held to a lower standard than other online content or content sold offline in shops. Amendment 185 makes it clear that the regulation of pornography should be consistent across all platforms.

I will also make some comments about Amendment 184. Large pornography websites are filled with material that has been uploaded without consent. Material that features children is extremely concerning. Uploaded material also portrays adults filmed with consent but without their consent for it to be shared with the world. We also need to remember that pornography production can involve individuals trafficked, coerced, forced or threatened. There should be a way to verify an individual’s age and their consent. Large porn companies need to be held to account. This Bill can deliver that, and I urge the Government to ensure that Amendment 184 is delivered on Report.

Regrettably, Amendments 184 and 185 would apply only to provider content, not user-to-user content—an issue I raised previously in Committee. That cannot be right and makes me wonder whether a service such as Pornhub might argue that third-party, studio-produced content that it uploads is not provider content, which could be another loophole in the Bill. I do not wish to reopen a previous debate in Committee, but there needs to be parity across Parts 3 and 5 of the Bill. I note that the noble Baroness, Lady Kidron, who is a leading light on this issue, has tabled Amendment 123A in an attempt to bridge the gap between Parts 3 and 5. However, I do not think the amendment gets to the heart of the issue. Between now and Report, we need to work together across the House to ensure that this issue of parity between Parts 3 and 5 is dealt with.

I turn to the other amendments in the name of the noble Baroness, Lady Kidron, and others. The need for robust age verification for pornography is undisputed. The Bill clearly creates different regulatory frameworks for different types of content, based on who uploads it rather than its impact. I repeat that pornography, wherever it is found, should be regulated to the highest standards, with consistency across the Bill. That is why I welcome these amendments, which seek to ensure that anyone seeking to access pornographic content is verified as being aged 18 or older and that the standard of proof for their age is deemed beyond reasonable doubt. There needs to be consistent age verification, rather than age assurance, for pornographic content.

I have some questions for the noble Baroness about the detail. I am concerned that the amendments do not deal with different regulatory frameworks for pornography in Parts 3 and 5, creating potential loopholes which could be exploited by large pornography providers. I am particularly concerned about the approach taken in paragraphs 3 and 4 of the proposed new schedule in Amendment 161, and subsection (3)(c) of the proposed new clause in Amendment 142. My Amendments 183A and 183B, which we debated previously, would ensure that duties are consistent across the Bill, and I urge the noble Baroness to reflect on the need for clarity and consistency before Report.

Finally, I turn to my Amendment 306, also in the name of the noble Lord, Lord Morrow, which would ensure that age verification is brought into force within six months of the Bill receiving Royal Assent. A whole generation of children have become adults since the Digital Economy Act was passed but never commenced. That cannot be allowed to happen again and is why we need a clear commencement date in the Bill. Parents expect age verification—and swiftly. Given the recent welcome announcement that primary priority content, including pornography, will be named in the Bill, that should be an achievable goal.

I hope the Minister will commit to bringing this back on Report. I was concerned on 25 April when he said that the Government intend to commence Parts 3 and 5 at different times. I ask how this would work in practice, as a pornography provider may have its own provider content and user-to-user content on the same service, which implies that one part of the service would be regulated while the other would not. In theory, the content could be the same images. Logically, that content should be regulated from the same moment.

Thousands of children over the years have been let down. We know the harm that is caused by pornography. We legislated in 2017 to stop that harm, yet if the Ofcom road map is accurate it could be two or three more years before pornography is regulated. That is unacceptable. We must ensure that we do all we can to bring in age verification as quickly as possible.

My Lords, I too add my support to Amendments 123A, 142, 161, 183, 184, 185, 297, 300 and 306. I am grateful to my noble friend Lord Bethell and the noble Baroness, Lady Kidron, for putting before us such a comprehensive list of amendments seeking to protect children from a host of online harms, including online pornography. I am also grateful to the noble Baroness, Lady Benjamin, who, through her Amendment 185, draws our attention to the horrifying material that is prohibited in the offline world but is inexplicably legal in the online world. I also lend my support to Amendment 306 in the name of the noble Baroness, Lady Ritchie, and the noble Lord, Lord Morrow, in relation to the swift implementation of age verification for pornography. I am sorry to have jumped the queue.

I spoke at Second Reading on the harms of pornography to children, but so much more evidence has come to my notice since then. I recently wrote an article for the Daily Telegraph about age verification, which resulted in my inbox being absolutely flooded by parents saying, “Please keep going”. There are probably noble Lords here who feel that we have spoken enough about pornography over the last few weeks, but anybody who has watched any of this would, I am afraid, beg to differ. I hope noble Lords will forgive me for quoting from another email I received in response to that article, which is relevant to today’s debate. A young man wrote:

“When I first visited online porn, I was about 12”.

Incidentally, that is the average age of first exposure. He said:

“I can remember feeling that this was ‘wrong’ but also that it was something that all boys do. I had no idea about masturbation, but that soon followed, and I was able to shake off the incredibly depressive sensation of having done something wrong after finishing by finding many online resources informing me that the practice was not bad, and actually quite healthy. Only over the past 3 years have I been able to tackle this addiction and I am now 31.

I will try and keep this letter as succinct as possible, but I believe the issue of pornography is at the root of so many issues in society that nobody, no man at least, seems willing to speak about it openly. If you research what happens in the brain of a person viewing pornography”,

especially when so young,

“you see that the dopamine receptors get so fried it’s almost as bad as a heroin experience and far more addictive. Far more addictive, in that I can just log on to my phone and open Pandora’s box at any time, anywhere, and it’s all free.

I’ll tell you that I became alienated from women, in that I became afraid of them. Perhaps out of guilt for looking at pornography. Instead of having the confidence to ask a girl out and experience an innocent teenage romance, I would be in my room looking at all sorts of images.

The human brain requires novelty, mine does at least, so soon you find yourself veering off from the boring vanilla porn into much darker territories.

The internet gives you access to literally everything you could possibly imagine, and the more you get sucked down the rabbit hole, the more alienated you become from your peers. You are like an addict searching for your next hit, your whole world revolves around your libido and you can’t actually look at a woman without fantasising about sex.

Then if you do manage to enter into a relationship, the damage this causes is beyond comprehension. Instead of living each moment with your partner, you end up in a dual relationship with your phone, masturbating behind their back. In fact, your partner can’t keep up with the porn, and you end up with issues with your erections and finding her attractive.

Whenever you would watch information about porn on TV or the internet, you would be told that it should be encouraged and is healthy. You end up trying to watch porn with your partner, and all the weird psychological ramifications that has. You go further down the rabbit-hole, but for some reason nothing feels right and you have this massive crippling depression following you wherever you go in life”.

I hope noble Lords will forgive me for reading that fairly fully. It is a tiny illustration, and it is typical of how pornography steals men’s childhoods and their lives. I discussed this with young men recently, and one told me that, because he had been in Dubai—where there is no access to it—for a month, he feels much better and plans to keep away from this addictive habit. When young men reach out to Peers because they have nowhere else to go, we must surely concede that we have failed them. We have failed generations of boys and girls—girls who are afraid to become women because of what they see—and, if we do not do something now, we will fail future generations.

Porn addiction is very real and it is growing. As I just read, it triggers the dopamine processes in the brain and, just like addictive products such as tobacco and alcohol, it can create pathways within the brain that lead to cravings, which push consumers to search longer and continually for the same level of high. What is worse is that the amount of dopamine that floods the brain increases only with repeated consumption. Porn can trigger this process endlessly because it is endlessly available.

No one doubts that we have a serious problem on our hands. The Children’s Commissioner report published in January this year made clear that the volume of pornography accessed by children is rising and that pornography exposure is widespread and normalised, to the extent that children cannot opt out. We know that it is shaping sexual scripts and, in the absence of good relationships and sex education, it is teaching children that violent sex is normal, that girls and children like to be subjugated, and that boys and men need to be dominating and violent. We cannot let this continue, which is why I support the package of amendments outlined today by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell.

I strongly appreciate that the noble Baroness, Lady Ritchie, and the noble Lord, Lord Morrow, raised similar issues on the timeline for age verification. I am aware that the Bill makes provision for age verification for pornography, and I was relieved to hear that primary priority content and priority content would be in the Bill. But there is still a massive problem with the intended timeline for age verification: the Bill fails to outline a timeline for the implementation of it—there is no clear road map in the Bill—which allows for repeated delays. This is on top of what is already a repeated delay; as we heard, it could have been introduced in 2017. I and many others in both Houses fear that, without the timetable, it could be several more years before it is put into place. Parents expect age verification to start protecting children soon after Royal Assent, and a long delay will be unpopular and baffling to the public—it is baffling to me. Either it can be introduced now, in which case it should be, or it cannot, in which case I would be grateful if the Minister could explain why, in words that the public can understand.

I am also bewildered by the issue that the noble Baroness, Lady Benjamin, raised. I seek to support her by asking my noble friend the Minister: why is content depicting child sexual abuse allowed to be freely accessed online when it would be prohibited or illegal in the offline world? I watched with horror the documentary “Barely Legal”, screened here last week. It outlined this violent and horrific material, with young women dressed up to look like children, told to look as young as possible and having sex—and worse—with older men. The instructions from the interviewed porn directors and producers who produce this material were that the younger these women could be made to look, the better. This material contributes to the number of porn consumers—mostly men—who, as a result of watching this material, seek out real child sexual abuse material. So I fully support the amendment from the noble Baroness, Lady Benjamin, which would make this content illegal and prohibited online, as it is offline. I ask my noble friend the Minister to explain why this is not already the case.

My Lords, I wish to direct my comments to Amendments 123A, 142, 161, 183, 297 and 306, and I want to focus mainly on age verification.

The starting point for effective age verification is being able to define what it is that is being regulated. That is why I support Amendment 183ZA in the name of the noble Baroness, Lady Ritchie. Without that clarity on what it is that needs to be age verified, the expectation of parents will not be met, and children who deserve the strongest and fullest protection will still be subject to harm. The noble Baroness made a convincing case for parity of regulation across different media, and that indeed was the principle behind the definition in the Digital Economy Act 2017. I hope that the Minister will set out either today in the Committee or via a letter afterwards how content which would fall under the different British Board of Film Classification ratings of 18, R18 and unclassified would be treated under the definition in this Bill. I cannot stress this enough: a proper definition of pornography is the foundation of good age-verification legislation. I agree with the noble Baroness, Lady Ritchie, that the definition in the Bill is just not robust.

I stood here on 28 January 2022 at the Second Reading of my Private Member’s Bill encouraging the Government to bring into effect Part 3 of the Digital Economy Act 2017, which would have brought in age verification for commercial pornographic websites. Looking back to 2017, that legislation seems pioneering. Other countries have implemented similar measures since, but our children are still waiting despite the Government’s assurances, when they postponed implementation of Part 3 in October 2019 in favour of the Bill we are debating today, that preventing children’s access to pornography is a critically urgent issue. It still is. On 9 May, the Children’s Commissioner again reiterated the importance of age verification. She said,

“I am categorically clear: no child should be able to access or watch pornography. Protecting children from seeing inappropriate material is critical”.

I congratulate the noble Baroness, Lady Kidron, on yet again trying to persuade the Government to protect children in a robust manner. I am also pleased to support the noble Baroness, Lady Ritchie, as a co-signatory to her Amendment 306. She has learned the lessons of 2017 and is seeking to ensure that this policy cannot be abandoned again. A six-month implementation clause is the very least that we should accept.

It is now well documented that early exposure to pornography carries with it a host of harms—we have just heard something about that from the previous speaker—for the children and young people exposed and for society more widely. The Children’s Commissioner has plainly spelled out those harms in the reports she has published this year. These dangers are ones which parents alone are unable to prevent. With pornography now merely one click away for children across the UK, it is no wonder that the majority of children have already been exposed to pornography before they become teenagers. It is heartbreaking that most children report that they have been exposed to it accidentally. The reality is that robust age verification is an effective antidote to this pervasive problem. Shockingly, only 4.5% of the top 200 pornographic websites have any mechanism to prevent or detect children accessing their sites, and it is unlikely that they would meet the bar of robust age verification.

It is clear that there is almost unanimous support for age verification across your Lordships’ House. However, the question before us is whether the Bill as it stands enables a robust enough level of protection. I welcome the duties in Clauses 11 and 22 on age verification. At present, the Bill enables pornographic sites to apply light-touch and potentially entirely inadequate age verification. Without a coherent, consistent approach, we are leaving the door open to those wishing to circumvent the prevention of harm we are putting in place.

Considering what we know and have heard about this industry, while the inclusion of age verification checks is welcome, is it really appropriate to leave the critical task of designing and implementing them to the pornography industry? I feel not. Should they be the ones charged with safeguarding our children and young people? Really? Would many in the pornographic industry prioritise their own web traffic over the welfare of children and young people? I think the answers to those questions are very clear.

Amendments 142 and 161 set higher standards, requiring a “beyond reasonable doubt” age verification approach that applies to Part 5 services or to material that meets the definition of pornography in Clause 70(2). This will ensure that children and young people are proactively protected from the deeply detrimental impact of online pornography.

While I welcome the new schedule introduced by Amendment 161, one of my key concerns is that there should be a consistent approach to enforcement of age verification when it comes to online pornography. Regardless of which websites it is found on, all forms of pornography should be held to the highest possible regulatory standards. I recognise that the noble Baroness has sought to go some way to addressing this with Amendment 123A, but this covers only Ofcom guidance on what constitutes pornographic material, rather than the requirements of age verification per se and the duty under Clause 11. Indeed, the amendment refers to age assurance, not age verification.

I know that we have already debated these points under earlier amendments tabled by the noble Baroness, Lady Ritchie, but the importance of the point cannot be overstated. Indeed, the Children’s Commissioner said the same in her report of 9 May, saying that the requirement for robust age verification must be

“consistent across all types of regulated services – both user to user sites and pornography providers”.

I must ask the noble Baroness, Lady Kidron—who I have admired on what she has been doing here—why paragraphs 3 and 4 of the proposed new schedule in Amendment 161 create different barriers under separate regulatory regimes for Part 3 services and Part 5 services. Why does the new clause proposed in Amendment 142 on the Ofcom guidance on age assurance refer only to Part 5 in subsection (3)(c)? Why is it that regulatory regimes for content produced by providers will differ from those where content is uploaded by users?

As it stands, those two regulatory regimes will be treated differently and, as I have said, I am not reassured by Amendment 123A. My concern is that, without change, we will not see all digital pornography treated with parity, serving only to create ambiguity and potential loopholes. I hope your Lordships will take note of the advice of the Children’s Commissioner and prevent that happening.

I would also like to ask the noble Baroness about her plans for bringing this schedule into effect. Paragraph 8 says the schedule should be in effect within 12 months, but there is no obligation elsewhere to bring in Parts 3 and 5 on that same timetable, so age verification may be required but could not be implemented until other parts of the Bill are commenced.

I urge the noble Baroness, Lady Kidron, to take note of Amendment 306 in the name of the noble Baroness, Lady Ritchie, which puts the commencement elements of age verification into the commencement clause with a timetable of three months for the Ofcom guidance and the rest of Part 5 and relevant enforcement powers within six months. Of course, the noble Baroness, Lady Ritchie, in her earlier amendments, intended for Part 3 to have the same duties as Part 5 and I certainly hope we will come back to that on Report. I was concerned that, at the end of that debate, the Minister said about commencement:

“This may mean there will be a limited period of time during which Part 5 protections are in place ahead of those in Part 3”. —[Official Report, 25/4/23; col. 1201.]

This again reiterates a dual approach to Part 3 and Part 5 services which host pornography.

I hope the Minister will assure us that it is now the Government’s plan that pornography will be on the face of the Bill as primary priority content, and that they are making it clear to service providers now that they need to plan to prevent children accessing this material, so it will be possible to commence the duties at the same time. Indeed, this would be in line with the comments made by the Minister in Committee in the other place, although I regret that he was not as ambitious about the timetable as Amendment 306.

I conclude by saying that we are well aware of the dangers that online pornography poses and, while age verification on pornographic websites is an important step forward, we must utilise this opportunity to ensure that the age verification processes are robust and able to function in such a way as to prevent children and young people being exposed to pornography. This change should be brought in as soon as possible and consistently across the Bill.

My Lords, I support the noble Baroness, Lady Benjamin, in bringing the need for consistent regulation of pornographic content to your Lordships’ attention and have added my name in support of Amendment 185. I also support Amendments 123A, 142, 161, 183, 184 and 306 in this group.

There should not be separate regimes for how pornographic content is regulated in this country. I remember discussions about this on Report of the Digital Economy Bill around six years ago. The argument for not making rules for the online world consistent with those for the offline world was that the CPS was no longer enforcing laws on offline use anyway. Then as now, this seems simply to be geared towards letting adults continue to have unrestricted access to an internet awash with pornographic material that depicts and/or promotes child sexual abuse, incest, trafficking, torture, and violent or otherwise harmful sexual acts: adult freedoms trumping all else, including the integrity of the legal process. In the offline world, this material is illegal or prohibited for very good reason.

The reason I am back here, arguing again for parity, is that, since 2017, an even deeper seam of academic research has developed which fatally undermines the case for untrammelled cyber-libertarianism. It has laid bare the far-reaching negative impacts that online pornography has had on individuals and relationships. One obvious area is the sharp rise in mental ill-health, especially among teenagers. Research from CEASE, the Centre to End All Sexual Exploitation, found that over 80% of the public would support new laws to limit free and easy access.

Before they get ensnared—and some patients of the Laurel Centre, a private pornography addiction clinic, watch up to 14 hours of pornography a day—few would have been aware that sexual arousal chained to pornography can make intimate physical sex impossible to achieve. Many experience pornography-induced erectile dysfunction and Psychology Today reports that

“anywhere from 17% to 58% of men who self-identify as heavy/compulsive/addicted users of porn struggle with some form of sexual dysfunction”.

As vice-chair of the APPG on Issues Affecting Men and Boys, I am profoundly concerned that very many men and boys are brutalised by depictions of rape, incest, violence and coercion, which are not niche footage on the dark web but mainstream content freely available on every pornography platform that can be accessed online with just a few clicks.

The harms to their growing sons, which include an inability to relate respectfully to girls, should concern all parents enough to dial down drastically their own appetite for porn. There is enormous peer pressure on teenage boys and young men to consume it, and its addictive nature means that children and young people, with their developing brains, are particularly susceptible. One survey of 14 to 18 year-olds found almost a third of boys who used porn said it had become a habit or addiction and a third had enacted it. Another found that the more boys watched porn and were sexually coercive, the less respect they had for girls.

Today’s headlines exposed the neurotoxins in some vaping products used by underage young people. There are neurotoxins in all the porn that would be caught by Section 368E(2) of the Communications Act 2003 if it were offline—hence the need for parity. Just as with the vapes, children as well as adults will continue to be exposed. Trustworthy age verification will stop children stumbling across it or finding it in searches, but adults who are negligent, or determined to despoil children’s innocence, will facilitate their viewing it if it remains available online. This Bill will not make the UK the safest place in the world for children online if we continue to allow content that should be prohibited, for good reason, to flood into our homes.

Helen Rumbelow, writing in the Times earlier this month, said the public debate—the backdrop to our own discussions in this Bill—is “spectacularly ill-informed” because we only talk about porn’s side-effects and not what is enacted. So here goes. Looking at the most popular pages of the day on Pornhub, she found that 12 out of 32 showed men physically abusing women. One-third of these showed what is known as “facial abuse”, where a woman’s airway is blocked by a penis: a porn version of waterboarding torture. She described how

“in one a woman is immobilised and bound by four straps and a collar tightened around her neck. She ends up looking like a dead body found in the boot of a car. In another a young girl, dressed to look even younger in a pair of bunny ears and pastel socks, is held down by an enormous man pushing his hand on her neck while she is penetrated. The sounds that came from my computer were those you might expect from a battle hospital: cries of pain, suction and “no, no, no”. I won’t tell you the worst video I saw as you may want to stop reading now. I started to have to take breaks to go outside and look at the sky and remember kindness”.

Turning briefly to the other amendments, I thank my noble friend Lord Bethell for his persistence in raising the need for the highest standard of age verification for pornography. I also commend the noble Baroness, Lady Kidron, for her continued commitment to protecting children from harmful online content and for representing so well the parents who have lost children, in the most awful of circumstances, because of online harms. I therefore fully support the package of amendments in this group tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell.

This Bill should be an inflection point in history and future generations will judge us on the decisions we make now. It is highly likely they will say “Shame on them”. To argue that we cannot put the genie back in the bottle is defeatist and condemns many of our children and grandchildren to the certainty of a dystopic relational future. I say “certain” because it is the current reality of so many addicted adults who wish they could turn back the clock. Therefore, it is humane and responsible, not quaint or retrogressive, to insist that this Government act decisively to make online and offline laws consistent and reset the dial.

My Lords, I will speak to my Amendment 232, as well as addressing issues raised more broadly by this group of amendments. I want to indicate support from these Benches for the broader package of amendments spoken to so ably by the noble Baroness, Lady Kidron. I see my noble friend Lord Clement-Jones has returned to check that I am following instructions during my temporary occupation of the Front Bench.

The comments I will make are going to focus on an aspect which I think we have not talked about so much in the debate, which is age assurance in the context of general purpose, user-to-user and search services, so-called Part 3, because we like to use confusing language in this Bill, rather than the dedicated pornography sites about which other noble Lords have spoken so powerfully. We have heard a number of contributions on that, and we have real expertise in this House, not least from my noble friend Lady Benjamin.

In the context of age assurance more generally, I start with a pair of propositions that I hope will be agreed to by all participants in the debate and build on what I thought was a very balanced and highly informative introduction from the noble Baroness, Lady Kidron. The first proposition is that knowledge about the age of users can help all online platforms develop safer services than they could absent that information—a point made by the right reverend Prelate the Bishop of Oxford earlier. The second is that there are always some costs to establishing age, including to the privacy of users and through some of the friction they encounter when they wish to use a service. The task before us is to create mechanisms for establishing age that maximise the safety benefits to users while minimising the privacy and other costs. That is what I see laid out in the amendment that the noble Baroness, Lady Kidron, has put before us.

My proposed new clause seeks to inform the way that we construct that balance by tasking Ofcom with carrying out regular studies into a broad range of approaches to age assurance. This is exactly the type of thinking that is complementary to that in Amendment 142; it is not an alternative but complementary to it. We may end up with varying views on exactly where that balance should be struck. Again, I am talking about general purpose services, many of which seek to prohibit pornography—whether they do so 100%, it is a different set of arguments from those that apply to services which are explicitly dedicated to pornography. We may come to different views about where we eventually strike the balance but I think we probably have a good, shared understanding of the factors that should be in play. I certainly appreciate the conversations I have had with the noble Baroness, Lady Kidron, and others about that, and think we have a common understanding of what we should be considering.

If we can get this formulation right, age assurance may be one of the most significant measures in the Bill in advancing online safety, but if we get it wrong, I fear we may create a cookie banner scenario, such as the one I warned about at Second Reading. This is my shorthand for a regulatory measure that brings significant costs without delivering its intended benefits. However keen we are to press ahead, we must always keep in mind that we do not want to create legislation that is well-intended but does not have the beneficial effect that we all in this Committee want.

Earlier, the noble Baroness, Lady Harding, talked about the different roles that we play. I think mine is to try to think about what will actually work, and whether the Bill will work as intended, and to try to tease out any grit in it that may get in the way. I want in these remarks to flag what I think are four key considerations that may help us to deliver something that is actually useful and avoid that cookie banner outcome, in the context of these general purpose, Part 3 services.

First, we need to recognise that age assurance is useful for enabling as well as disabling access to content—a point that the noble Baroness, Lady Kidron, rightly made. We rightly focus on blocking access to bad content, but other things are also really important. For example, knowing that a user is very young might mean that the protocol for the reporting system gets to that user report within one hour, rather than 24 hours for a regular report. Knowing that a user is young and is being contacted by an older user may trigger what is known as a grooming protocol. Certainly at Facebook we had that: if we understood that an older user was regularly contacting younger users, that enabled us to trigger a review of those accounts to understand whether something problematic was happening—something that the then child exploitation and online protection unit in the UK encouraged us to implement. A range of different things can then be enabled. The provision of information in terms that a 13 year-old would understand can be triggered if you know the age of that user.

Equally, perfectly legitimate businesses, such as alcohol and online gambling businesses, can use age assurance to make sure that they exclude people who should not be part of that. We in this House are considering measures such as junk food advertising restrictions, which again depend on age being known to ensure that junk food which can be legitimately marketed to older people is not marketed to young people. In a sense, that enables those businesses to be online because, absent the age-gating, they would struggle to meet their regulatory obligations.

Secondly, we need to focus on outcomes, using the risk assessment and transparency measures that the Bill creates for the first time. We should not lose sight of those. User-to-user and search services will have to do risk assessments and share them with Ofcom, and Ofcom now has incredible powers to demand information from them. Rather than asking, “Have you put in an age assurance system?”, we can ask, “Can you tell us how many 11 year-olds or 15 year-olds you estimate access the wrong kind of content?”, and, “How much pornography do you think there is on your service despite the fact that you have banned it?” If the executives of those companies mislead Ofcom or refuse to answer, there are criminal sanctions in the Bill.

The package for user-to-user and search services enables us to really focus on those outcomes and drill down. In many cases, that will be more effective. I do not care whether they have age-assurance type A or type B; I care whether they are stopping 99.9% of 11 year-olds accessing the wrong kind of content. Now, using the framework in the Bill, Ofcom will be able to ask those questions and demand the answers, for the first time ever. I think that a focus on outcomes rather than inputs—the tools that they put in place—is going to be incredibly powerful.

The third consideration is quite a difficult one. We need to plan for actual behaviour, and how it will change over time and how we adapt to that, rather than relying on assumptions that we make today. My experience is that actual behaviour often differs. Cookie banners are an example. The assumption of the regulator was, “We’ll put these cookie banners in place, and it will be so off-putting that people will stop using cookies”. The reality is completely different: everyone has just carried on using cookies and people click through. The behaviour has not matched up to expectations.

You can put a lot of these tools in place, such as age assurance and age-restricted services. If you build an age-restricted version of your service—there is a YouTube for kids along with many other kids’ services—then you can see whether or not they are going to be acceptable. If people are rejecting them, you need to adapt. There is no point saying, “Well, you should go and use YouTube Kids”. If people are signing up for it but finding it too restrictive and going elsewhere, we need to be able to think about how we can adapt to that and work with it.

The reality today, as the right reverend Prelate the Bishop of Oxford referred to, is that 60% of kids are on social media, and in many cases their parents have bought the phone and enabled that access. How do we deal with that? We cannot just bury our heads in the sand and ignore it; we have to be able to adapt to that behaviour and think about what tools work in that environment.

My last point on adaptation is that we found, working in the industry, that sometimes the law incentivised ignorance, which is the worst possible outcome. We had age-estimation tools that allowed us to understand that children were 11 or 12. They may have passed an age check by providing ID that showed that they were overage, and their parents may have helped them, but we knew they were not. But that knowledge itself created legal risk, so we would bury the knowledge. If kids are going on these online platforms, my view is that I would much rather the platforms use all the tools available—we should not discourage them from understanding that these are 11 year-olds—and we find a way to work with that and make the service as safe as possible. These are hard questions because, while we want the law to work in an absolute way, in practice people are very creative and will work around and through systems.

The final point about making all this work is understanding the key role of usability. I was struck by the compelling vision of noble Baroness, Lady Kidron, of low-friction age assurance. There are issues of principle and practice when it comes to making sure that age assurance is usable. The argument around principle is that we as a free society do not want to create unnecessary friction for people to access information online. We can put measures in place and impinge on freedom of expression in a way that is necessary and proportionate. With regard to all the arguments that we have heard about access to pornography sites, the case is of course clear and absolute—there is a strong case for putting in place very restrictive measures—but for other general purpose services that younger people may want to use to connect with family and friends, we need to tread quite carefully if we are putting in place things that might infringe on their rights. The UN convention also talks about the right to express oneself. We need to be careful about how we think that through, and usability is critical to that.

I share the noble Baroness’s vision, in a sense. A lot of it could happen at the phone level, when a parent sets a phone up for their 11 year-old. The phone knows that you are an 11 year-old, and there is a system in place for the phone to tell the apps that you install that you are 11. That works in a smooth way and is potentially low friction but very effective. There are exceptions, but, by and large, teenagers have their own phone and the phone is tied to them; if you know the age of the phone then you have a highly reliable understanding of the age of the user. There is a lot in there. It is those kinds of low-friction measures that we should be looking for.

At the other end of the spectrum, if every app that you use asks you to upload your passport, that is not going to work. People will just bypass it, or they will stick to the apps they know already and never install new ones. We would end up concentrating the market, and our friends at the Competition and Markets Authority would not be very pleased with that potential outcome. It is about making something usable for both a principled reason—not blocking access to legitimate services—and a pragmatic one: making sure we do not create an incentive to work around it. One of the phrases the team working on age verification at Facebook would use was, “Our competition is lying”. If people do not like what you are doing, they will simply lie and find their way round it. We need to bear that in mind, even under the new regime.

If we are mindful of these practical considerations, we should be able to deliver very useful age-assurance tools. I hope the Minister will agree with that. I look forward to hearing the Government’s description of how they think all this will work, because that bit is missing from the debate. I know other Lords will have heard from Ofcom, which started putting out information about how it thinks it will work. The age assurance community has started putting out information. The gap in all this is that it is not clear what the Government think this future world will look like. I hope they can at least start to fill that gap today by trying to explain to us, in response to these amendments, their vision for the new world of age assurance, not just for the pornography sites but, critically, for all these other sites that millions of us use that certainly aim to be gateways not to pornography but rather to other forms of communication and information.

My Lords, I am still getting used to the rules in Committee. I did not get up quickly enough before the noble Lord, Lord Allan, so I hope I am able to add my voice to the amendments.

I support the amendments tabled by the noble Baroness, Lady Kidron, and supported by the noble Lord, Lord Bethell, to bring about robust age verification for pornography wherever it is found online and the need to prove beyond reasonable doubt that the user is above the age of 18. I also support the amendment tabled by the noble Baroness, Lady Benjamin, which would mean that content that is prohibited and illegal offline would also be prohibited online, and the other amendments in this group, tabled by the noble Baroness, Lady Ritchie, and the noble Lord, Lord Allan.

The impact that pornography has on violence against women and girls is well documented, yet the Bill does not address it. What is considered mainstream pornography today would have once been seen as extreme. In fact, swathes of content that is readily available online for all to view in just a few clicks would be illegal and prohibited by the British Board of Film Classification to possess or supply offline, under the powers given to it by the Video Recordings Act 1984. But because online pornography has been allowed to evolve without any oversight, pornographic content that includes overt sexual violence, such as choking, gagging and forceful penetration, is prevalent. This is alongside content that sexualises children, as mentioned by the noble Baroness, Lady Benjamin, which includes petite, young-looking adult actors being made to look like children, and pornography which depicts incest.

This content is not just available in niche corners of the internet or the dark web; it is presented to users on mainstream websites. Research by the academics Clare McGlynn and Fiona Vera-Gray into the titles of videos that were available on the landing page of three of the UK’s most popular pornography websites revealed that one in eight titles used descriptions such as “pain”, “destroy”, “brutal”, “torture”, “violate”, “hurt”, and many others that are too unpleasant to mention in this Chamber. To reiterate, these videos were available on the landing pages and presented to first-time visitors to the site without any further searching necessary. We have to acknowledge that the vast majority of online pornographic content, viewed by millions across the globe, is directly promoting violence against women and girls.

A very real example of this impact is Wayne Couzens, the murderer of Sarah Everard. In court, a former colleague set out how Couzens was attracted to “brutal sexual pornography”. Indeed, it is not hard to find an addiction to violent pornography in the background of many notorious rapists and killers of women. While not all men will jump from violent pornography to real-life harm, we know that for some it acts as a gateway to fulfilling a need for more extreme stimulation.

A study published in 2019 in the National Library of Medicine by Chelly Maes involving 568 adolescents revealed that exposure to pornographic content was related to individuals’ resistance towards the #MeToo movement and increased acceptance of rape myths. Even the Government’s own research found substantial evidence of an association between the use of pornography and harmful attitudes and behaviours towards women and girls, yet the Bill does nothing to bring parity between the way that pornography is regulated online and offline. That is why I support the amendment in the name of the noble Baroness, Lady Benjamin, to address this inconsistency.

We must remember that while this pornography has an effect on the viewer, it also has an effect on the performers taking part. Speaking to the APPG on Commercial Sexual Exploitation’s recent inquiry into pornography, Linda Thompson, the national co-ordinator at the Women’s Support Project, said:

“We know pornography is a form of violence against women. It is not just fantasy; it is the reality for the women involved. It is not just a representation of sex, it is actual sexual violence that is occurring to the women”.

There is currently no obligation for pornography companies to verify that a performer in pornographic content is over the age of 18 and that they consent. Once the videos are uploaded on to platforms, they go through little moderation, meaning that only the most overtly extreme and obviously illegal and non-consensual content is readily identified and reported. This means that for many girls and women, their sexual abuse and rape is readily available and is viewed for pleasure. The Bill is an opportunity to rectify this and to put this duty on pornography companies. For that reason, I am supportive of amendments tabled by the noble Baroness, Lady Kidron, to put this duty on them.

Finally, we must remember the impact that having access to this content is having on our children right now. A recent report by the Children’s Commissioner for England about pornography and subsequent harmful sexual behaviour and abuse demonstrates this. Using data from children’s own testimonies about cases of child sexual abuse committed against them by another child, references to specific acts of sexual violence commonly found in pornography were present in 50% of the cases examined.

Pornography is how children are receiving a sex education. We know the impact that this type of pornographic content is having on adults, yet we are allowing children to continue having unfettered access as their attitudes towards sex and relationships are forming. In 2021, the singer Billie Eilish spoke out about her experiences of watching pornography from the age of 11, saying:

“The first few times I … had sex, I was not saying no to things that were not good. It was because I thought that’s what I was supposed to be attracted to”.

Children are seeing violent sexual acts as normal and expected parts of relationships. This cannot continue. That is why I am supporting the amendments tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, to bring in robust age verification for pornography wherever it is found online. I thank them and the noble Baroness, Lady Benjamin, for their hard work in raising awareness of the online risks posed to children. I also express gratitude to all the children’s charities such as Barnardo’s which work tirelessly day and night to keep children safe. The Government also have a duty to keep children safe.

My Lords, I too should have spoken before the noble Lord, Lord Allan; I should have known, given his position on the Front Bench, that he was speaking on behalf of the Liberal Democrats. I was a little reticent to follow him, knowing his expertise in the technical area, but I am very pleased to do so now. I support this very important group of amendments and thank noble Lords for placing them before us. I echo the thanks to all the children’s NGOs that have been working in this area for so long.

For legislators, ambiguity is rarely a friend, and this is particularly true in legislation dealing with digital communications, where, as we all acknowledge, the law struggles to keep pace with technical innovation. Where there is ambiguity, sites will be creative and will evade what they see as barriers—of that I have no doubt. Therefore, I strongly believe that there is a need to have clarity where it can be achieved. That is why it is important to have in the Bill a clear definition of age verification for pornography.

As we have heard this evening, we know that pornography is having a devastating impact on our young people and children: it is impacting their mental health and distorting their views of healthy sexual relationships. It is very upsetting for me that evidence shows that children are replicating the acts they see in pornographic content, thinking that it is normal. It is very upsetting that, in particular, young boys who watch porn think that violence during intimacy is a normal thing to do. The NSPCC has told us that four in 10 boys aged 11 to 16 who regularly view porn say they do so to get ideas about the type of sex they want to try. That is chilling. Even more chilling is the fact that content is often marketed towards children, featuring characters from cartoons, such as “Frozen”, “Scooby Doo” and “The Incredibles”, to try to draw young people on to those sites. Frankly, that is unforgivable; it is why we need robust age verification to protect our children from this content. It must apply to all content, regardless of where it is found; we know, for instance, that Twitter is often a gateway to pornographic sites for young people.

The noble Lord, Lord Bethell, referred to ensuring, beyond all reasonable doubt, that the user is over 18. I know that that is a very high standard—it is the criminal law level—but I believe it is what is needed. I am interested to hear what the Minister has to say about that, because, if we are to protect children and if we take on the role of the fireguard, which the right reverend Prelate referred to, we need to make sure that it is as strong as possible.

Also, this is not just about making sure that users are over 18; we need to make sure that adults, not children, are involved in the content. The noble Baroness, Lady Benjamin, talked about adults being made to look like children, but there is also the whole area of young people being trafficked and abused into pornography production; therefore, Amendment 184 on performer age checks is very important.

I finish by indicating my strong support for Amendment 185 in the name of the noble Baroness, Lady Benjamin. Some, if not most, content on mainstream pornography sites is degrading, extremely abusive and violent. Such content would be prohibited in the offline world and is illegal to own; this includes sexual violence such as strangulation, incest and the sexualisation of children. We know that this is happening online because, as we have heard, some of the most frequently searched terms on porn sites are “teens”, “schoolgirls” or “girls”, and the lack of regulation online has allowed content to become more and more extreme and abusive. That is why I support Amendment 185 in the name of the noble Baroness, Lady Benjamin, which seeks to bring parity between the online and offline regulation of pornographic content.

This Bill has been eagerly awaited. There is no doubt about that. It has been long in the gestation—some people would say too long. We have had much discussion in this Committee but let us get it right. I urge the Minister to take on board the many points made this afternoon. That fireguard needs not only to be put in place; it needs to be put in place so that it does not move, is not knocked aside and is at its most effective. I support the amendments.

My Lords, I also failed to stand up before the noble Lord, Lord Allan, did. I too am always slightly nervous to speak before or after him for fear of not having the detailed knowledge that he does. There have been so many powerful speeches in this group. I will try to speak swiftly.

My role in this amendment was predefined for me by the noble Baroness, Lady Kidron, as the midwife. I have spent many hours debating these amendments with my noble friend Lord Bethell, the noble Baroness, Lady Kidron, and with many noble Lords who have already spoken in this debate. I think it is very clear from the debate why it is so important to put a definition of age assurance and age verification on the face of the Bill. People feel so passionately about this subject. We are creating the digital legal scaffolding, so being really clear what we mean by the words matters. It really matters and we have seen it mattering even in the course of this debate.

My two friends—they are my friends—the noble Baroness, Lady Kidron, and my noble friend Lord Bethell both used the word “proportionate”, with one not wanting us to be proportionate and the other wanting us to be proportionate. Yet, both have their names to the same amendment. I thought it might be helpful to explain what I think they both mean—I am sure they will interrupt me if I get this wrong—and explain why the words of the amendment matter so much.

Age assurance should not be proportionate for pornography. It should be the highest possible bar. We should do everything in our power to stop children seeing it, whether it is on a specific porn site or on any other site. We do not want our children to see pornography; we are all agreed on that. There should not be anything proportionate about that. It should be the highest bar. Whether “beyond reasonable doubt” is the right wording or it should instead be “the highest possible bar practically achievable”, I do not know. I would be very keen to hear my noble friend the Minister’s thoughts on what the right wording is because, surely, we are all clear it should be disproportionate; it should absolutely be the hardest line we can take.

Equally, age assurance is not just about pornography, as the noble Lord, Lord Allan, has said. We need to have a proportionate approach. We need a ladder where age assurance for pornography sits at the top, and where we are making sure that nine year-olds cannot access social media sites if they are age-rated for 13. We all know that we can go into any primary school classroom in the land and find that the majority of nine year-olds are on social media. We do not have good age assurance further down.

As both the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, have said, we need age assurance to enable providers to adapt the experience to make it age-appropriate for children on services we want children to use. It needs to be both proportionate and disproportionate, and that needs to be defined on the face of the Bill. If we do not, I fear that we will fall into the trap that the noble Lord, Lord Allan, mentioned: the cookie trap. We will have very well-intentioned work that will not protect children and will go against the very thing that we are all looking for.

In my role as the pragmatic midwife, I implore my noble friend the Minister to hear what we are all saying and to help us between Committee and Report, so that we can come back together with a clear definition of age assurance and age verification on the face of the Bill that we can all support.

My Lords, about half an hour ago I decided I would not speak, but as we have now got to this point, I thought I might as well say what I was going to say after all. I reassure noble Lords that in Committee it is perfectly permissible to speak after the winder, so no one is breaking any procedural convention. That said, I will be very brief.

My first purpose in rising is to honour a commitment I made last week when I spoke against the violence against women and girls code. I said that I would none the less be more sympathetic to and supportive of stronger restrictions preventing child access to pornography, so I want to get my support on the record and honour that commitment in this context.

My noble friend Lady Harding spoke on the last group about bringing our previous experiences to bear when contributing to some of these issues. As I may have said in the context of other amendments earlier in Committee, as a former regulator, I know that one of the important guiding principles is to ensure that you regulate for a reason. It is very easy for regulators to have a set of rules. The noble Baroness, Lady Kidron, referred to rules of the road for the tech companies to follow. It is very easy for regulators to examine whether those rules are being followed and, having decided that they have, to say that they have discharged their responsibility. That is not good enough. There must be a result, an outcome from that. As the noble Lord, Lord Allan, emphasised, this must be about outcomes and intended benefits.

I support making it clear in the Bill that, as my noble friend Lady Harding said, we are trying, disproportionately, to prevent children accessing pornography. We will do all we can to ensure that it happens, and that should be because of the rules being in place. Ofcom should be clear on that. However, I also support a proportionate approach to age assurance in all other contexts, as has been described. Therefore, I support the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell, and the role my noble friend Lady Harding has played in arriving at a pragmatic solution.

My Lords, it is a privilege to be in your Lordships’ House, and on some occasions it all comes together and we experience a series of debates and discussions that we perhaps would never have otherwise reached, and at a level which I doubt could be echoed anywhere else in the world. This is one of those days. We take for granted that every now and again, we get one of these rapturous occasions when everything comes together, but we forget the cost of that. I pay tribute, as others have, to the noble Baroness, Lady Kidron. She has worked so hard on this issue and lots of other issues relating to this Bill and has exhausted herself more times than is right for someone of her still youthful age. I am very pleased that she is going off on holiday and will not be with us for a few days; I wish her well. I am joking slightly, but I mean it sincerely when I say that we have had a very high-quality debate. That it has gone on rather later than the Whips would have wanted is tough, because it has been great to hear and be part of. However, I will be brief.

It was such a good debate that I felt a tension, in that everybody wanted to get in and say what they wanted to say, to be sure they were on the record. That can sometimes be a disaster, because everyone repeats everything, but as the noble Baroness, Lady Harding, said, we know our roles, we know what to say and when to say it, and it has come together very nicely. Again, we should congratulate ourselves on that. However, we must be careful about something which we keep saying to each other but sometimes do not do. This is a Bill about systems, not content. The more that we get into the content issues, the more difficult it is to remember what the Bill can do and what the regulator will be able to do if we get the Bill to the right place. We must be sure about that.

I want to say just a few things about where we need to go with this. As most noble Lords have said, we need certainty: if we want to protect our children, we have to be able to identify them. We should not be in any doubt about that; there is no doubt that we must do it, whatever it takes. The noble Lord, Lord Allan, is right to say that we are in the midst of an emerging set of technologies, and there will be other things coming down the line. The Bill must remain open to that; it must not be technology-specific, but we must be certain of what this part is about, and it must drill down to that. I come back to the idea of proportionality: we want everybody who is 18 or under to be identifiable as such, and we want to be absolutely clear about that. I like the idea that this should be focused on the phones and other equipment we use; if we can get to that level, it will be a step forward, although I doubt whether we are there yet.

Secondly, we must not leave open the idea that there will somehow be different approaches to how we regulate what children see in this space. We are talking about pornography here. Whether it comes through a Part 3 or Part 5 service, or accidentally through a blog or some other piece of information, it has to be stopped. We do not want our children to receive it. That must be at the heart of what we are about, and not just something we think about as we go along. The Bill is complicated and difficult to read. It has to be because the services are different, but that should not be at the expense of the ability to say at the end: “We did it”.

Thirdly, I worry about some things that have crept into the debate on the proportionality issue. If “a small number” means that we will somehow let a few children see something, that will not be acceptable. Everybody has said this. Let us be clear about it: this is either 100% or it is not worth doing. If so, the question of whether we do it is not about finding the right form of words, such as “beyond reasonable doubt”; it is about certainty.

As the noble Baroness, Lady Kidron, set out at the beginning of this debate, the amendments in this group have involved extensive discussions among Members in both Houses of Parliament, who sit on all sides of both Houses. I am very grateful for the way noble Lords and Members in another place have done that. They have had those preliminary discussions so that our discussions in the debate today and in preparation for it could be focused and detailed. I pay particular tribute to the noble Baroness, Lady Kidron, and my noble friends Lord Bethell and Lady Harding, who have been involved in extensive discussions with others and then with us in government. These have been very helpful indeed; they continue, and I am happy to commit to their continuing.

Age-assurance technologies will play an important role in supporting the child safety duties in this Bill. This is why reference is made to them on the face of the Bill—to make it clear that the Government expect these measures to be used for complying with the duties to protect children from harmful content and activity online. Guidance under Clause 48 will already cover pornographic content. While this is not currently set out in the legislation, the Government intend, as noble Lords know, to designate pornographic content as a category of primary priority content which is harmful to children. As I set out to your Lordships’ House during our debate on harms to children, we will amend the Bill on Report to list the categories of primary and primary priority content on the face of the Bill.

I am very grateful to noble Lords for the engagement we have had on some of the points raised in Amendments 142 and 306 in recent weeks. As we have been saying in those discussions, the Government are confident that the Bill already largely achieves the outcomes sought here, either through existing provisions in it or through duties in other legislation, including data protection legislation, the Human Rights Act 1998 and the Equality Act 2010. That is why we think that re-stating duties on providers which are already set out in the Bill, or repeating duties set out in other legislation, risks causing uncertainty, and why we need to be careful about imposing specific timelines on Ofcom by which it must produce age-assurance guidance. It is essential that we protect Ofcom’s ability robustly to fulfil its consultation duties for the codes of practice. If Ofcom is given insufficient time to fulfil these duties, the risk of legal challenge being successful is increased.

I welcome Ofcom’s recent letter to your Lordships, outlining its implementation road map, which I hope provides some reassurance directly from the regulator on this point. Ofcom will prioritise protecting children from pornography and other harmful content. It intends to publish, this autumn, draft guidance for Part 5 pornography duties and draft codes of practice for Part 3 illegal content duties, including for child sexual exploitation and abuse content. Draft codes of practice for children’s safety duties will follow next summer. These elements of the regime are being prioritised ahead of others, such as the category 1 duties, to reflect the critical importance of protecting children.

Although we believe that the Bill already largely achieves the outcomes sought, we acknowledge the importance of ensuring that there are clear principles for Ofcom to apply when recommending or requiring the use of age-assurance technologies. I am happy to reassure noble Lords that the Government will continue to consider this further and are happy to continue our engagement on this issue, although any amendment must be made in a way that sits alongside existing legislation and within the framework of the Bill.

I turn to Amendments 161 and 183. First, I will take the opportunity to address some confusion about the requirements in Parts 3 and 5 of the Bill. The Bill ensures that companies must prevent children accessing online pornography, regardless of whether it is regulated in Part 3 or Part 5. The Government are absolutely clear on this point; anything less would be unacceptable. The most effective approach to achieving this is to focus on the outcome of preventing children accessing harmful content, which is what the Bill does. If providers do not prevent children accessing harmful content, Ofcom will be able to bring enforcement action against them.

I will address the point raised by my noble friend Lord Bethell about introducing a standard of “beyond reasonable doubt” for age verification for pornography. As my noble friend knows, we think this a legally unsuitable test which would require Ofcom to determine the state of mind of the provider, which would be extremely hard to prove and would therefore risk allowing providers to evade their duties. A clear, objective duty is the best way to ensure that Ofcom can enforce compliance effectively. The Bill sets clear outcomes which Ofcom will be able to take action on if these are not achieved by providers. A provider will be compliant only if it puts in place systems and processes which meet the objective requirements of the child safety duties.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child safety duties are tailored to the size and capacity of providers. Smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services and will need to make sure these systems and processes achieve the required outcomes of the child safety duties.

The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk of harm to children, such as online pornography. However, companies may use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

Age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For example, if a user-to-user service such as a social medium does not allow—

I am sorry to interrupt. The Minister said that he would bear in mind proportionality in relation to size and capacity. Is that not exactly the point that the noble Baroness, Lady Harding, was trying to make? In relation to children, why will that be proportionate? A single child being damaged in this way is too much.

The issue was in relation to a provider’s size and capacity; it is an issue of making sure it is effective and enforceable, and proportionate to the size of the service in question. Age verification may also not be the most effective approach for companies to follow to comply with their duties. If a user-to-user social media service says that it does not allow pornography under its terms of service, measures such as content moderation and user reporting might be more appropriate and effective for protecting children than age verification in those settings. That would allow content to be better detected and taken down, while—

I understand that, but it is an important point to try to get on the record. It is an outcome-based solution that we are looking for, is it not? We are looking for zero activity where risks to children are there. Clearly, if the risk assessment is that there is no risk that children can be on that site, age verification may not be required—I am extending it to make a point—but, if there is a risk, we need to know that the outcome of that process will be zero. That is my point, and I think we should reflect on that.

I am very happy to, and the noble Lord is right that we must be focused on the outcomes here. I am very sympathetic to the desire to make sure that providers are held to the highest standards, to keep children protected from harmful content online.

I know the Minister said that outcomes are detailed in the Bill already; I wonder whether he could just write to us and describe where in the Bill those outcomes are outlined.

I shall happily do that, and will happily continue discussions with my noble friend and others on this point and on the appropriate alternative to the language we have discussed.

On the matter of Ofcom independently auditing age-assurance technologies, which my noble friend also raised, the regulator already has the power to require a company to undertake and pay for a report from a skilled person about a regulated service. This will assist Ofcom in identifying and assessing non-compliance, and will develop its understanding of the risk of failure to comply. We believe that this is therefore already provided for.

I reassure noble Lords that the existing definition of pornographic content in the Bill already captures the same content that Amendment 183ZA, in the name of the noble Baroness, Lady Ritchie of Downpatrick, intends to capture. The definition in the Bill shares the key element of the approach Ofcom is taking for pornography on UK-established video-sharing platforms. This means that the industry will be familiar with this definition and that Ofcom will have experience in regulating content which meets it.

The definition is also aligned with that used in existing legislation. I take on board the point she made about her trawl of the statute book for it, but the definition is aligned elsewhere in statute, such as in the Coroners and Justice Act 2009. This means that, in interpreting the existing definition in the Bill, the courts may be able to draw on precedent from the criminal context, giving greater certainty about its meaning. The definition of pornography in Part 5 is also consistent with the British Board of Film Classification’s guidelines for the definition of sex works, which is

“works whose primary purpose is sexual arousal or stimulation”

and the BBFC’s definition of R18. We therefore think it is not necessary to refer to BBFC standards in this legislation. Including the definition in the Bill also retains Parliament’s control of the definition, and therefore also which content is subject to the duties in Part 5. That is why we believe that the definition as outlined in the Bill is more straightforward for both service providers and Ofcom to apply.

I turn to Amendments 184 and 185. The Government share the concerns raised in today’s debate about the wider regulation of online pornography. It is important to be clear that extreme pornography, so-called revenge pornography and child sexual exploitation and abuse are already illegal and are listed as priority offences in the Bill. This means that under the illegal content duties, Part 3 providers, which will include some of the most popular commercial pornography services, must take proactive, preventive measures to limit people’s exposure to this criminal content and behaviour.

As I have made clear in previous debates, providers will need to protect children from all forms of online pornography, including illegal pornography or content that the British Board of Film Classification refuses to classify. Providers in scope of Part 5 are publishers which directly control the material on their services, and which can already be held liable for existing extreme pornography and child sexual exploitation and abuse offences captured by the criminal law. The most appropriate mechanism for dealing with these services is, rather than a regulatory regime, the criminal law.

I can also reassure noble Lords that the Government’s new offences relating to sharing and sending intimate images without consent will apply to providers in scope of Part 5. They will be criminally liable for any non-consensual intimate images published on their service.

In relation to Amendment 184, as intimate image abuse will already be illegal in criminal law, it is unnecessary to include a specific duty for Part 5 providers to prohibit this content. Any publisher that shares such images on its site would risk breaking the law and could face a prison sentence. The Bill is also not the right mechanism to regulate content produced or published by the adult industry with regard to the consent of performers appearing in pornographic content. Copyright and contract law already gives performers based in the UK the right to authorise the making of a recording of their performance. Any works recorded and made available to the public without the performer’s consent would constitute an infringement of their rights. As a private right, it is for the performer to enforce this, not a broader regulatory regime.

Does my noble friend the Minister recognise that those laws have been in place for the 30 years of the internet but have not successfully been used to protect the rights of those who find their images wrongly used, particularly those children who have found their images wrongly used in pornographic sites? Does he have any reflections on how that performance could be improved?

I would want to take advice and see some statistics, but I am happy to do that and to respond to my noble friend’s point. I was about to say that my noble friend Lady Jenkin of Kennington asked a number of questions, but she is not here for me to answer them.

I turn to Amendment 232 tabled by the noble Lord, Lord Allan of Hallam. Because of the rapid development of age-assurance technologies, it is right that they should be carefully assessed to ensure that they are used effectively to achieve the outcomes required. I am therefore sympathetic to the spirit of his amendment, but must say that Ofcom will undertake ongoing research into the effectiveness of age-assurance technologies for its various codes and guidance, which will be published. Moreover, when preparing or updating the codes of practice, including those that refer to age-assurance technologies, Ofcom is required by the Bill to consult a broad range of people and organisations. Parliament will also have the opportunity to scrutinise the codes before they come into effect, including any recommendations regarding age assurance. We do not think, therefore, that a requirement for Ofcom to produce a separate report into age-assurance technologies is a necessary extra burden to impose on the regulator.

In relation to this and all the amendments in this group, as I say, I am happy to carry on the discussions that we have been having with a number of noble Lords, recognising that they speak for a large number of people in your Lordships’ House and beyond. I reiterate my thanks, and the Government’s thanks, to them for the way in which they have been going about that. With that, I encourage them not to press their amendments.

My Lords, I thank everyone for their contributions this evening. As the noble Lord, Lord Stevenson, said, it is very compelling when your Lordships’ House gets itself together on a particular subject and really agrees, so I thank noble Lords very much for that.

I am going to do two things. One is to pick up on a couple of questions and, as has been said by a number of noble Lords, concentrate on outcomes rather than contributions. On a couple of issues that came up, I feel that the principle of pornography being treated in the same way in Parts 3 and 5 is absolute. We believe we have done it. After Committee we will discuss that with noble Lords who feel that is not clear in the amendment, to make sure they are comfortable that it is so. I did not quite understand from the Minister’s reply that pornography was being treated in exactly the same way in Parts 3 and 5. When I say “exactly the same way”, like the noble Lord, Lord Allan, I mean not necessarily by the same technology but to the same level of outcome. That is one thing I want to emphasise because a number of noble Lords, including the noble Baroness, Lady Ritchie, the noble Lord, Lord Farmer, and others, are rightly concerned that we should have an outcome on pornography, not concentrate on how to get there.

The second thing I want to pick up very briefly, because it was received so warmly, is the question of devices and on-device age assurance. I believe that is one method, and I know that at least one manufacturer is thinking about it as we speak. However, it is an old battle in which companies that do not want to take responsibility for their services say that people over here should do something different. It is very important that devices, app stores or any of the supposed gatekeepers are not given an overly large responsibility. It is the responsibility of everyone to make sure that age assurance is adequate.

I hope that what the noble Baroness is alluding to is that we need to include gatekeepers, app stores, device level and sideloading in another part of the Bill.

But of course—would I dare otherwise? What I am saying is that these are not silver bullets and we must have a mixed economy, not only for what we know already but for what we do not know. We must have a mixed economy, and we must not make any one platform of age assurance overly powerful. That is incredibly important, so I wanted to pick up on that.

I also want to pick up on user behaviour and unintended consequences. I think there was a slight reference to an American law, which is called COPPA and is the reason that every website says 13. That is a very unhelpful entry point. It would be much better if children had an age-appropriate experience from five all the way to 18, rather than on and off at 13. I understand that issue, but that is why age assurance has to be more than one thing. It is not only a preventive thing but an enabling thing. I tried to make that very clear so I will not detain the Committee on that.

On the outcome, I say to the Minister, who has indeed given a great deal of time to this, that more time is needed because we want a bar of assurance. I speak not only for all noble Lords who have made clear their rightful anxiety about pornography but also on behalf of the bereaved parents and other noble Lords who raised issues about self-harming of different varieties. We must have a measurable bar for the things that the Bill says that children will not encounter—the primary priority harms. In the negotiation, that is non-negotiable.

On the time factor, I am sorry to say that we are all witness to what happened to Part 3. It was pushed and pushed for years, and then it did not happen—and then it was whipped out of the Bill last week. This is not acceptable. I am happy, as I believe other noble Lords are, to negotiate a suitable time that gives Ofcom comfort, but it must be possible, with this Bill, for a regulator to bring something in within a given period of time. I am afraid that history is our enemy on this one.

The third thing is that I accept the idea that there has to be more than principles, which is what I believe Ofcom will provide. But the principles have to be 360 degrees, and the questions that I raised about security, privacy and accessibility should be in the Bill so that Ofcom can go away and make some difficult judgments. That is its job; ours is to say what the principle is.

I will tell one last tiny story. About 10 years ago, I met in secret with one of the highest-ranking safety officers in one of the companies that we always talk about. They said to me, “We call it the ‘lost generation’. We know that regulation is coming, but we know that it is not soon enough for this generation”. On behalf of all noble Lords who spoke, I ask the Government to save the next generation. With that, I withdraw the amendment.

Amendment 123A withdrawn.

Clause 48 agreed.

House resumed. Committee to begin again not before 9.22 pm.