Digital Economy Bill (Sixth sitting)

Debated on Thursday 20 October 2016

The Committee consisted of the following Members:

Chairs: Mr Gary Streeter, † Graham Stringer

† Adams, Nigel (Selby and Ainsty) (Con)

† Brennan, Kevin (Cardiff West) (Lab)

† Davies, Mims (Eastleigh) (Con)

† Debbonaire, Thangam (Bristol West) (Lab)

† Foxcroft, Vicky (Lewisham, Deptford) (Lab)

† Haigh, Louise (Sheffield, Heeley) (Lab)

† Hancock, Matt (Minister for Digital and Culture)

† Hendry, Drew (Inverness, Nairn, Badenoch and Strathspey) (SNP)

† Huddleston, Nigel (Mid Worcestershire) (Con)

Jones, Graham (Hyndburn) (Lab)

† Kerr, Calum (Berwickshire, Roxburgh and Selkirk) (SNP)

† Mann, Scott (North Cornwall) (Con)

† Matheson, Christian (City of Chester) (Lab)

† Menzies, Mark (Fylde) (Con)

† Perry, Claire (Devizes) (Con)

† Skidmore, Chris (Parliamentary Secretary, Cabinet Office)

† Stuart, Graham (Beverley and Holderness) (Con)

† Sunak, Rishi (Richmond (Yorks)) (Con)

Marek Kubala, Committee Clerk

† attended the Committee

Public Bill Committee

Thursday 20 October 2016


[Graham Stringer in the Chair]

Digital Economy Bill

Clause 15

Internet pornography: requirement to prevent access by persons under the age of 18

Amendment proposed (this day): 85, in clause 15, page 18, line 20, leave out subsection (5)(a).—(Louise Haigh.)

Question again proposed, That the amendment be made.

I remind the Committee that with this we are discussing the following:

Amendment 87, in clause 15, page 18, line 25, leave out subsection (6).

New clause 7—On-demand programme services: requirement to prevent persons under the age of 18 accessing pornographic material with an 18 classification certificate—

“Section 368E of the Communications Act 2003 (harmful material) is amended as follows—

(a) in subsection (5)—

(i) after subsection (a) insert—

“(aa) a video work in respect of which the video works authority has issued an 18 classification certificate, and that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal,”;

(ii) after subsection (b) insert—

“(ba) material that was included in a video work to which paragraph (aa) applies, if it is reasonable to assume from the nature of the material—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the certificate was an 18 certificate,

“(bb) any other material if it is reasonable to assume from its nature—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that any classification certificate issued for a video work including it would be an 18 certificate.”

(b) in subsection (7) after “section” insert—

““18 certificate” means a classification certificate which—

(a) contains, pursuant to section 7(2)(b) of the Video Recordings Act 1984, a statement that the video work is suitable for viewing only by persons who have attained the age of 18 and that no video recording containing that work is to be supplied to any person who has not attained that age, and

(b) does not contain the statement mentioned in section 7(2)(c) of that Act that no video recording containing the video work is to be supplied other than in a licensed sex shop;””

This new clause requires the extension of measures for UK-based video on-demand programming to protect children from 18 material as well as R18 material.

First, I thank my hon. Friend the Member for Sheffield, Heeley for making such a clear and cogent argument for why the Bill needs further amendment. As I think she said—I am sure that she will correct me if I am wrong—we want to ensure that the Government stick to their manifesto commitment to protect children from all forms of online pornography. That will take consistency, and a degree of modesty about the limits of our varying knowledge of how the internet works.

The hon. Member for Devizes made a good speech, and I am grateful to her for making the argument about on-demand films, as my hon. Friend the Member for Sheffield, Heeley also did, but the hon. Lady said—please correct me if I am wrong—that there were not many providers of free online pornography. I must respectfully disagree. Given the existence of peer-to-peer sharing and other forms of availability—my hon. Friend mentioned Tumblr and other social media websites—I am afraid that it is incredibly easy, as my nephews and nieces have confirmed, sadly, for a young person to access free online pornographic content in ways that most of us here might not even understand.

I am happy to clarify. My focus was on the Government’s intention to capture free and commercial pornography. The hon. Lady is absolutely right that there is a plethora of free stuff out there, and she is right to focus on the harm that it causes.

I thank the hon. Lady for that clarification. I understand from an intervention made by my hon. Friend the Member for Cardiff West that the reason why we were not allowed to remove the words “on a commercial basis” was that they were deemed out of scope. As I understand it, the word “economy”, if we stick to the letter of it, includes transactions for which there is no financial payment. There are transactions involved, and the word “digital” is in the title of the Bill, so I think it unfortunate that the amendment was not agreed to. Taking out the words “on a commercial basis” would have done a great deal to make the restrictions that we are placing on commercial providers consistent across all platforms and all forms of pornographic content available online.

I support the amendments proposed by my hon. Friend to the wording of clause 15(5)(a) and (6), for reasons that have already been given, and I want to add to the arguments. Hon. Friends and Members may have read the evidence from Girlguiding. As a former Guide, I pay tribute to the movement for the excellent work that it has done. It has contributed a profound and well-evidenced understanding of what young women are saying about online pornography. I will pick out a couple of statistics, because they make arguments to which I will refer in interventions on later clauses. That will make my speeches less long.

In the 2016 girls’ attitudes survey, half of the girls said that sexism is worse online than offline. In the 2014 survey, 66%, or two thirds, of young women said that they often or sometimes see or experience sexism online. It is a place where young women routinely experience sexism, and part of that sexism is the ubiquity of pornography. In 2015, the survey found that 60% of girls aged 11 to 21 see boys their age—admittedly, some of those are over the age of 18, but they are still the girls’ peers—viewing pornography on mobile devices or tablets. In contrast, only 27% of girls say that they see girls their age viewing pornography. The majority of those young women say from their experience that children can access too much content online and that it should be for adults only. In the survey, we see a certain degree of concord among young women in the Girlguiding movement, Opposition Members and the Government manifesto, which pledged, as my hon. Friend said, to exclude children from all forms of online pornography.

The 2015 Girlguiding survey also found that those young women felt that pornography was encouraging sexist stereotyping and harmful views, and that the proliferation of pornography is having a negative effect on women in society more generally. Those young women are the next generation of adults.

I have worked with young men who have already abused their partners. In my former job working with domestic violence perpetrators, I worked with young men of all ages; for the men my age, their pornography had come from the top shelf of a newsagent, but the younger men knew about forms of pornography that those of us of a certain age had no understanding of whatsoever. They were using pornography in ways that directly contribute to the abuse of women and girls, including pornography that is filmed abuse. I shall come back to that point later, but we need to recognise that young men are getting their messages about what sex and intimacy are from online pornography. If we do not protect those under the age of 18 from online pornography, we are basically saying that there are no holds barred.

The hon. Member for Devizes and my hon. Friend the Member for Sheffield, Heeley mentioned loopholes. When we leave loopholes, it creates a colander or sieve for regulation. Yes, the internet is evolving and, yes, we in this Committee Room probably do not know every single way in which it already provides pornography, and certainly not how it will in future, but that is a good reason to provide a strong regulatory framework when we have the chance. We have that chance now, and we should take it. If it remains the case that removing the words “on a commercial basis” is deemed outside our scope, which I find very sad—I think it is a missed opportunity, and I hope the House can return to it at some point and regulate the free content—we must definitely ensure that we are putting everything else that we possibly can on a level playing field. That means that the regulation of video on demand has to be consistent and that we have to close any other loophole we can spot over the next few days.

I hope Opposition amendments will make the Government think about the manifesto commitment they rightly made—I am happy to put on the record that I support it—and take the opportunity to stick to it. Young women want that; young men need it, because my experience of working with young men who have abused their partners and ex-partners is that they felt that they were getting those messages from pornography; and we as a society cannot afford to ignore this problem any longer. We have a chance to do something about it, so let us take that opportunity.

I must have it clearly on the record that I supported that commitment only: not the whole Conservative manifesto, just the bit that says “We want to protect all children from all online pornography.”

I am sure our powers of persuasion will extend that support in the future. The outbreak of support for our manifesto is welcome; this is an incredibly important area, and I am proud to lead the Front-Bench effort to deal with underage people’s access to adult material by introducing age verification. I want to respond in detail to the points made, because it is important we get this right.

Before I come to the specific amendments, I will deal with commercial providers. The measures in the Bill will apply equally to all commercial providers, whether their material is paid for directly or appears on free sites that operate on a different business model. “Commercial” has quite a broad meaning, as my hon. Friend the Member for Devizes said. If a provider makes money from a site in any way, whether or not it makes a profit, it can be caught by the legislation. That is the right distinction, because it targets those who make money and are indifferent to the harm their activities may cause to children.

If the hon. Lady will hold on, I want to explain this in full, rather than in part, before I give way. The age verification regulator must publish guidance on the circumstances in which it will regard a site or app as commercial. It will be for the regulator to judge whether a site is commercial, and there is no definition that states which website platforms are covered. Crucially, the regulator will also be able to take a view if specific social media and other types of sites are ancillary service providers—a person who appears to be facilitating or enabling the making available of pornographic material by non-compliant persons. I think that the capturing of others as ancillary service providers is an important part of making sure that we fully deliver our manifesto commitment, as I believe this Bill does.

We are aware that “commercial” is not limited to sites that require payment; it includes online advertising and other business models, as the Minister has said. However, it is unclear how the regulator will be able to enforce these measures, given that the only enforcement mechanism available to it is notifying payment service providers and ancillary services.

No doubt we will come on to enforcement; a number of clauses and amendments deal with it. The point is that other social media sites can be classified by the regulator as ancillary service providers for facilitating or enabling the making available of pornographic material. Our view is that enforcement through disrupting business models is more powerful, because it undermines the business model of the provider. However, I do not want to get too distracted, in an out-of-order way, into enforcement, which is rightly dealt with in later clauses.

If the Bill is clearly designed to enable the regulator to focus on social media sites and other ancillary service providers, why was the term “on a commercial basis” included in these sections?

The principle is that there is a distinction between those who are making money and are indifferent to the potential harm, and those whose services merely facilitate the provision of porn to those who are under age. I think it is a reasonable distinction. We are trying to deal with the mass of the problem; by its nature, it is very difficult to get to 100%. I think that leaving the Bill in this way, with flexibility for the regulator to act, has a big advantage over being overly prescriptive in primary legislation and too specific about the way in which the regulator acts, not least because disrupting the business model is the goal of enforcement.

I support the Minister’s point about over-prescription, but perhaps he could help me by talking about a particular case. Let us take Tumblr hosting a stream of content that is rated 18. Who would the regulator target if it issued an enforcement notice? Would it be the content provider, or would it be the social media platform that is hosting that content?

In that case, the platform—I do not want to get into individual platforms, but I am happy to take my hon. Friend’s example—would likely be an ancillary service provider and therefore captured. This is a very important distinction. There is a difference between somebody who is actively putting up adult material and choosing not to have age verification, and a platform where others put up adult material, where it is not necessarily impossible but much harder to have control over the material. If we try to pretend that everybody putting material onto a platform such as the one that my hon. Friend mentions should be treated in the same way as a porn-providing website, we will be led into very dangerous territory, and that makes it harder rather than easier to police this. That is my argument.

On the specific amendments, I understand entirely where the argument on on-demand services is coming from. I want to give an assurance that I hope will mean that these amendments are not pushed to a vote. On-demand audio-visual media services under UK jurisdiction are excluded from part 3 of the Bill because they are already regulated by Ofcom under part 4A of the Communications Act 2003. As my hon. Friend the Member for Devizes said, other on-demand services that are not currently regulated in the UK will be caught by the Bill’s regime.

The amendments and new clause 7 would apply the Bill’s age verification requirements to on-demand audio-visual media services under UK jurisdiction, meaning that we would end up with a double regulation. They would also amend the existing age verification requirement that applies to providers of those services to cover material that the British Board of Film Classification would describe as “18 sex works”, as well as R18 and equivalent. I want to be crystal clear about the aim: it is to have complementary regimes as between on-demand material regulated by Ofcom and material to be regulated by the BBFC, so that although the regulator may be different, the result is the same.

Forgive me, but the Minister just gave a lot of information, and I want to clarify something. Whichever regulator is doing it, will the effect of the legislation, as he would like to see it, be to put R18 films and 18-rated films on on-demand services behind the same level of age verification? I am not clear on that point.

The aim is that even though the regulator may be different in those two cases, the result would be the same. I can give the hon. Lady that assurance. The Bill will do that without having double regulation. As we discussed earlier with regard to a different part of the Bill, having double regulation in the same area can lead to confusion and worse outcomes, rather than clarity and better outcomes.

A service that falls within part 4A of the Communications Act 2003—that is to say, one that is outwith the proposals—must not contain any specially restricted material, unless that material is made available in a manner that secures that persons under the age of 18 will not normally see or hear it. Specially restricted material includes R18 material and other material that might seriously impair the physical, mental or moral development of persons under the age of 18. Our intention is that such other material should include material that the BBFC would describe as 18 sex works. I think that answers precisely the point that the hon. Lady was making.

This is a genuine inquiry: did the Minister consider not having double regulation but awarding regulatory oversight of all this to a single regime, possibly the BBFC, thereby taking it away from Ofcom? If he considered that idea, why did he reject it?

Partly because the regulation of the areas currently covered by Ofcom is considered to be working well, I did not want to throw that regime up in the air; I wanted instead to make provisions additional to the existing regime.

The Minister’s response prompts the question: if that is the case, why did he not give the responsibility to Ofcom?

Because I think the BBFC is best at making the very nuanced distinctions between different types of material and their regulation that are required. The way it has landed, with the two regulators sitting side by side, but with the aim that the result of the regulation is the same, is the better way of doing it.

May I seek clarification from the Minister? Is there scope for a mechanism whereby the two regulatory authorities can pass items between each other if one is better suited to judge an item that has been referred to the other?

There is clarity in the Bill about what is under the jurisdiction of one regulator and what is under the jurisdiction of the other. I will, though, take that away and seek to give an assurance that the two regulators will work together to ensure that that boundary is dealt with adequately. There is flexibility in the Bill to ensure that that can happen. I cannot speak for Ofcom or the BBFC, but it would seem to me to be perfectly reasonable and obvious that the boundary has to work properly. I would not like to over-specify that in the Bill because of the nature of changes in technology. The distinction between broadcast and on-demand services is changing as technology develops, and it is better to leave it structured as it is. I am sure that both regulators will have heard the hon. Gentleman’s important point that the boundary between the two needs to be dealt with appropriately and that they need to talk to each other.

Is the Minister reassured, as I am, by the fact that in the evidence sessions there was enthusiastic support from the BBFC for embracing the role, as well as very clear guidance that it had the competence to do so? We have not necessarily heard that from anybody else. The support and enthusiasm for taking on that role are very telling.

My hon. Friend has just given the final paragraph of my speech. With those assurances and the broad support from the BBFC and its enthusiasm to tackle the need for age verification in that way, I hope that the hon. Member for Sheffield, Heeley will withdraw the amendment.

Quite a lot of clarification is needed, and I hope it will come during the Bill’s passage. I do not think that the distinction between Ofcom and the BBFC is clear in this part of the Bill or in later clauses on enforcement. However, given that it states elsewhere in the Bill that the proposal is subject to further parliamentary scrutiny, and as the BBFC has not yet officially been given the regulator role—as far as I am aware—I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

I beg to move amendment 66, in clause 15, page 18, line 24, at end insert

“or an internet service provider.”.

This amendment and amendment 67 ensure that the requirement to implement age verification does not fall on ISPs but on commercial sites or applications offering pornographic material; and defines internet service providers.

With this it will be convenient to discuss the following:

Amendment 90, in clause 22, page 23, line 29, leave out

“or ancillary service provider”

and insert

“, ancillary service provider, or internet service provider.”.

Amendment 77, in clause 22, page 24, line 23, at end insert “or

(c) an internet service provider.”.

This amendment and amendment 78 ensure that the definition of an ancillary service provider would include ISPs; and defines internet service providers.

Amendment 91, in clause 22, page 24, line 23, at end insert—

“(6A) In this section an “ancillary service provider” includes, but is not limited to, domain name registrars, social media platforms, internet service providers, and search engines.”.

Amendment 67, in clause 25, page 26, line 2, at end insert—

““internet service provider” has the same meaning as in section 124N of the Communications Act 2003 (interpretation);”.

See the explanatory statement for amendment 66.

New clause 8—Duty to provide a service that excludes adult-only content—

“(1) This section applies to internet service providers who supply an internet access service to subscribers.

(2) For the purposes of subsection (1), “subscribers” includes—

(a) domestic subscribers;

(b) schools; and

(c) organisations that allow a person to use an internet access service in a public place.

For the purposes of the conditions in subsections (3) and (4), if the subscriber is a school or organisation a responsible person within the school or organisation shall be regarded as the subscriber.

(3) A provider to whom subsection (1) applies must provide to subscribers an internet access service which excludes adult-only content unless all of the conditions listed in subsection (4) have been fulfilled.

(4) The conditions are—

(a) the subscriber “opts in” to subscribe to a service that includes online adult-only content;

(b) the subscriber is aged 18 or over; and

(c) the provider of the service has an age verification scheme which meets the standards set out by OFCOM in subsection (5) and which has been used to confirm that the subscriber is aged 18 or over before a user is able to access adult-only content.

(5) It shall be the duty of OFCOM, to set, and from time to time to review and revise, standards for the—

(a) filtering of adult content in line with the standards set out in Section 319 of the Communications Act 2003;

(b) age verification policies to be used under subsection (4) before a user is able to access adult content; and

(c) filtering of content by age or subject category by providers of internet access services.

(6) The standards set out by OFCOM under subsection (5) must be contained in one or more codes.

(7) Before setting standards under subsection (5), OFCOM must publish, in such a manner as they think fit, a draft of the proposed code containing those standards.

(8) After publishing the draft code and before setting the standards, OFCOM must consult relevant persons and organisations.

(9) It shall be the duty of OFCOM to establish procedures for the handling and resolution of complaints in a timely manner about the observance of standards set under subsection (5), including complaints about incorrect filtering of content.

(10) OFCOM may designate any body corporate to carry out its duties under this section in whole or in part.

(11) OFCOM may not designate a body under subsection (10) unless, as respects that designation, they are satisfied that the body—

(a) is a fit and proper body to be designated;

(b) has consented to being designated;

(c) has access to financial resources that are adequate to ensure the effective performance of its functions under this section; and

(d) is sufficiently independent of providers of internet access services.

(12) It shall be a defence to any claims, whether civil or criminal, for a provider to whom subsection (1) applies to prove that at the relevant time they were—

(a) following the standards and code set out in subsection (5); and

(b) acting in good faith.

(13) Nothing in this section prevents any providers to whom subsection (1) applies from providing additional levels of filtering of content.

(14) In this section—

“adult-only content” means material that contains offensive and harmful material from which persons under the age of 18 are protected;

“age verification scheme” is a scheme to establish the age of the subscriber;

“internet access service” and “internet service provider” have the same meaning as in section 124N of the Communications Act 2003 (interpretation);

“material from which persons under the age of 18 are protected” means material specified in the OFCOM standards under section 2;

“OFCOM” has the same meaning as in Part 1 of the Communications Act 2003;

“offensive and harmful material” has the same meaning as in section 3 of the Communications Act 2003 (general duties of OFCOM); and

“subscriber” means a person who receives the service under an agreement between the person and the provider of the service.”.

This new clause places a statutory requirement on internet service providers to limit access to adult content by persons under 18. It would give Ofcom a role in determining the age verification scheme and how material should be filtered. It would ensure that ISPs were able to continue providing family-friendly filtering once the net neutrality rules come into force in December 2016.

New clause 11—Power to make regulations about blocking injunctions preventing access to locations on the internet—

“(1) The Secretary of State may by regulations make provision about the granting by a court of a blocking injunction in respect of a location on the internet which the court is satisfied has been, is being or is likely to be used for or in connection with an activity that is contravening, or has contravened, section 15(1) of this Act.

(2) “Blocking injunction” means an injunction that requires an internet service provider to prevent its service being used to gain access to a location on the internet.

(3) Regulations introduced under subsection (1) above may, in particular—

(a) make provision about the type of locations against which a blocking injunction should be granted;

(b) make provision about the circumstances in which an application can be made for a blocking injunction;

(c) outline the type of circumstances in which the court will grant a blocking injunction;

(d) specify the type of evidence, and other factors, which the court must take into account in determining whether or not to grant a blocking injunction;

(e) make provision about the notice, and type of notice, including the form and means, by which a person must receive notice of an application for a blocking injunction made against them; and

(f) make provision about any other such matters as the Secretary of State considers are necessary in relation to the granting of a blocking injunction by the court.

(4) Regulations under this subsection must be made by statutory instrument.

(5) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before, and approved by a resolution of, each House of Parliament.

(6) In this Part—

“internet service provider” has the same meaning as in section 16 of the Digital Economy Act 2010.

In the application of this Part to Scotland, “injunction” means interdict.”.

This new clause empowers the Secretary of State to introduce regulations in relation to the granting of a backstop blocking injunction by a court. The injunction would require an internet service provider to prevent access to a site or sites which do not comply with the age-verification requirements. This would only be used where the other enforcement powers (principally fines) had not been effective in ensuring that sites put in place effective age-verification.

I welcome the Minister’s previous comments, which gave me some real assurances on the parity of content and regulator. I also reassure him of how popular he will be when the Bill finally passes—the Centre for Gender Equal Media said that, in its most recent survey, 86% of people support a legal requirement on companies to prevent children’s access to pornography. We are moving in the right direction.

Amendment 66 seeks to pick through slightly more carefully who is responsible for, and who is captured by, the Bill’s language. There are four internet service providers in the UK through which the majority of broadband internet traffic travels, and they have come a long way. Five years ago, they accepted none of our proposals, be it single-click protection for all devices in the home or the implementation of a filtering system that required a choice to be made—users could not avoid selecting whether or not the filters were on. They have gone from that to the position now whereby, in some cases, ISPs provide their services with the filters on by default—something that we were told was absolutely unimaginable. Under that regime, the level of complaints is very low and the level of satisfaction is very high.

Amendment 67 is consequential on amendment 66 and both seek to clarify the scope of who exactly would be covered under the wording of clause 15(1), which states:

“A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18.”

The Government have made it quite clear in the consultation, and the Minister clarified in his previous remarks, that the proposals apply to companies running websites aimed specifically at providing pornographic content for commercial gain, and that they want those who profit from such material being made available online to act in a legal, socially responsible way. It could be argued that ISPs both profit from the material being made available online and make pornographic material available online, even though they are not the original source of the material. We also heard from the Minister that he is minded to consider social media platforms in that same category. In my view, the regulator must also publish guidance under clause 15(3) about

“circumstances in which the regulator will treat an internet site or other means of accessing the internet as operated or provided on a commercial basis”.

It is my concern that that could also be read as applying to ISPs. The amendments are intended to clarify that. In fact, I can quote from an article from July, which said:

“Internet access providers are likely to feel left in an uncertain position at the moment as, while the Bill does not reference them in this context, the definition of ‘makes pornographic material available’ could be argued as incorporating companies which provide connectivity to servers used for the making available of pornographic material”,

and piping that material into the home.

Paragraph 22 of the explanatory notes makes reference to “commercial providers of pornography”, and that obviously appears to place the onus of this suite of measures firmly on the content providers, but an optimal approach would be to improve the drafting to make the legislative attempt clear. I know we will have further discussions about the role of ISPs, but ISPs have done what we have asked them to do in introducing family friendly filters.

I am trying to understand why the hon. Lady believes that ISPs should not have this responsibility.

Because various other aspects of the Bill capture ISPs. My concern is that the Bill should focus on the commercial content providers. The amendment is intended to probe the Government about how they are thinking about ISPs vis-à-vis commercial content providers in the drafting of the clause.

Our amendments are designed to enable the regulator to ask the internet service provider to block offending sites. This goes back to the point we made earlier on the differences between sites operated “on a commercial basis” and social media sites and ancillary sites. The proposals as they stand do not give the regulator sufficient powers to enforce the mechanisms proposed in the Bill.

Broadening the definition of “ancillary service provider” specifically to include internet service providers would require the regulator to notify them of non-compliant sites. That will put ISPs in the same bracket as payment service providers, which will be required to withdraw their services if other measures have been exhausted. In the case of ISPs, they would be required to block offending sites.

The amendments would create a simple backstop power where enforcement through the Government’s proposals had not achieved its intended objective and commercial providers had not withdrawn their services, either because the fine does not act as a deterrent or because, due to their international status, they do not need to comply. If pornography providers continued to provide content without age verification restrictions, the regulator would then have the power to require ISPs to block the content.

We believe that, without amendment, the proposals will not achieve the Bill’s aim, as there is no absolute assurance that non-compliant pornographers will have payment services withdrawn. First, the proposals do not send anywhere near a strong enough signal to the porn industry that the Government are serious about the proposals and their enforcement. Giving the regulator the power but not the stick suggests that we are not all that bothered about whether sites comply. Secondly, we can have no reassurance that sites will be shut down within any kind of timeframe if there is non-compliance. As drafted in the explanatory notes, “on an ongoing basis” could mean yearly, biannually or monthly, but it makes a mockery of the proposals if sites could be non-compliant for two years or more before payment services may or may not act. That does not provide much of an incentive to the industry to act.

Throughout the evidence sessions we heard that there are significant difficulties with the workability of this entire part of the Bill. For instance, many sites will hide their contact details, and a substantial number will simply not respond to financial penalties. Indeed, an ability already exists in law for ISPs to be compelled to block images that portray, for example, child sex abuse. There is also an ability to block in the case of copyright infringement. It therefore seems eminently reasonable that in the event of non-compliance, the regulator has a clear backstop power. We believe that even just legislating for such a power will help speed up enforcement. If providers know that they cannot simply circumvent the law by refusing to comply with notices, they will comply more efficiently. That will surely help the age verifier to pass the real-world test, which is integral to the Bill’s objectives.

Similarly, new clause 11 provides for an all-important speed of enforcement. As it currently stands, the Bill provides fairly feeble powers to an enforcer to give notice to a payment service or ancillary service provider that a site has contravened clause 15(1). Indeed, giving evidence to the Committee, David Austin of the BBFC said of his power to notify sites of their contravention of clause 15 that

“some will and some, probably, will not”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q91.]


He welcomed as a second backstop power the ability to notify the ancillary or payment service provider. If providers still fail to act after that second backstop power is invoked, the regulator’s final power is to issue a fine. That is clearly insufficient, and the process itself would take a great deal of time, during which children under 18 would still be able to access pornography, even though the age verification regulator was well aware that there was a breach of clause 15(1).

The amendment would provide the Secretary of State with the power, through regulations, to issue a blocking injunction preventing access to locations on the internet if a court is satisfied that they are being used to contravene clause 15. The Opposition are clear that the power would be necessary only when the other enforcement powers had proved ineffective. Indeed, in evidence the BBFC was clear that fines by themselves would not be enough. David Austin said:

“For UK-based websites and apps, that is fine, but it would be extremely challenging for any UK regulator to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator’s arsenal. We think that that would be effective.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q91.]

The Government’s own age verification regulator recommends that the amendments be made to the Bill. We very much hope that the Government will consider accepting them.

I am a little puzzled as to what the hon. Member for Devizes has against requiring ISPs to block porn sites. As my hon. Friend the Member for Sheffield, Heeley said, they are already required to block other sites. If we require ISPs to block sites that offend copyright laws, I really do not understand the problem with requiring them to block sites that provide pornography to children.

On a point of order, Mr Stringer. Perhaps this shows my ignorance of doing Committees from the Back Benches, but I intended to go on in my speech to discuss new clause 8, which I have tabled and which defines more clearly what I expect internet service providers to do. Would it be in order for me to deliver those remarks, or have I lost my opportunity?

Let me be clear: we are considering amendment 66 to clause 15, amendments 90, 77, 91 and 67, and new clauses 8 and 11. Members can speak more than once in Committee if they wish to. The hon. Lady has the right to discuss her new clause.

May I please rise again, then? Apologies to the Committee—[Interruption.] I am so sorry; the hon. Member for Bristol West was speaking.

I defer to the hon. Lady. She mentioned something she is going to say in due course; I look forward to hearing it. Nevertheless, I stand by my comments. We need to be clear about whether we are going to fail to require ISPs to do something that we already require them to do for copyright infringement and other forms of pornography involving children. I fail to see what the problem is. Having a blocking injunction available to the regulator would give them another tool to achieve the aim that we have all agreed we subscribe to, which is being able to block pornography from being seen by children and young people.

Mr Stringer, I assume that, like me, you sometimes have the feeling that you have sat down before you have finished what you are saying. I apologise to the Committee. I am rarely short of words, but in this case I was.

I want to respond to the point made by the hon. Member for Bristol West and clarify exactly what we have asked and should be asking internet service providers to do. In doing so, I shall refer to the new EU net neutrality regulations, which, despite the Brexit vote, are due to come into force in December. They cause many of us concerns about the regime that our British internet service providers have put in place, which I believe leads the world—or, at least, the democratic free world; other countries are more draconian—in helping families to make these choices. We do not want all that good work to be unravelled.

Our current regime falls foul of the regime that the European Union is promoting, and unless the Government make a decision or at least give us some indication relatively quickly that they will not listen to that, we may have an issue in that all the progress that we have made may be undone by December 2016. I would be grateful if the Minister told us what the Government are doing to get the new legislation on the statute book in line with the schedule set out by his colleague Baroness Shields last December.

We have an effective voluntary filtering arrangement. I believe—I think that this point is in the scope of ancillary service providers—that we intend to capture internet service providers as part of the general suite of those responsible for implementing over-18 verification, but I want the Government to make crystal clear that they are aware of the responsibilities of internet service providers and intend for the regulator to include them in the basket of those that they will investigate and regulate.

The big missing link in all this has been getting content providers that provide material deemed to be pornographic to do anything with that material. The difference is that content providers of, say, gambling sites have always been required to have age-verification machinery sitting on their sites.

The hon. Member for Bristol West is quite right that we want ISPs to be captured under this regulatory regime, but I am keen to hear from the Minister that all the work that we have done with ISPs that have voluntarily done the socially and morally responsible thing and brought forward family-friendly filters will not be undone by December 2016, when the EU net neutrality regulations are intended to come into place.

Quite a lot of points have been raised, and I seek to address them all. Clause 22 is an important provision containing the powers at the heart of the new regime to enable the age-verification regulator to notify payment service providers and ancillary service providers that a person using their services is providing pornographic material in contravention of clause 15 or making prohibited material available on the internet to persons in the UK.

Amendments 66, 67, 77, 78, 90 and 91 would provide that the requirement to implement age verification does not fall on ISPs and further clarify that ISPs are to be considered ancillary service providers. Amendment 91 would clarify that as well as ISPs, domain name registrars, social media platforms and search engines are all to be considered ancillary service providers for the purposes of clause 22, which makes provision for the meaning of “ancillary service provider”.

This is a fast-moving area, and the BBFC, in its role as regulator, will be able to publish guidelines for the circumstances in which it will treat services provided in the course of business as either enabling or facilitating, as we discussed earlier. Although it will be for the regulator to consider on a case-by-case basis who is an ancillary service provider, it would be surprising if ISPs were not designated as ancillary service providers.

New clause 8 would impose a duty on internet service providers to provide a service that excludes adult-only content unless certain conditions are met. As I understand it, that measure is intended to protect the position of parental filters under net neutrality. However, it is our clear position that parental filters, where they can be turned off by the end user—that is, where they are a matter of user choice—are allowed under the EU regulation. We believe that the current arrangements are working well. They are based on a self-regulatory partnership and they are allowed under the forthcoming EU open internet access regulations.

I think I understand the Minister to be saying that in cases where companies have introduced filters that are on by default, the fact that the users can choose to turn those filters off in the home means that they would not be captured by the net neutrality rules. Is that correct?

That is exactly what I am saying. On that basis, with the Government’s position having been put clearly on the record, I hope that my hon. Friend will not press new clause 8 to a vote.

New clause 11 would empower the Secretary of State to introduce regulations in relation to backstop blocking injunctions. We have looked carefully at the option of blocking by ISPs and have talked to a lot of stakeholders about it. We take the problem seriously, and we think our measures will make a real difference. We are yet to be persuaded that blocking infringing sites would be proportionate, because it would not be consistent with how other harmful or illegal content is dealt with. There is also a question of practicality: porn companies would be able to circumvent blocking relatively quickly by changing URLs, and there is an additional risk that a significant number of sites that contain legal content would be blocked. We would need to be convinced that the benefits of ISP blocking would not be outweighed by the risks.

I am a little confused about how the Minister envisages the provisions being enforced against the free sites we discussed in the previous group of amendments without that additional power, which indeed has been requested by the regulator that the Government have designated.

As the regulator said, the proposals here mark a huge step forward in tackling the problem. We have to make a balanced judgment: there is a balance to be struck between the extra powers to block and the need to ensure that they are proportionate. The powers are not a silver bullet; sites that were actively trying to avoid the Bill’s other enforcement measures would also be able to actively avoid these measures. It is questionable how much additional enforcement power they would bring, given those downsides.

I must press the Minister to consider that children’s charities have told us that this is one of the most important amendments to the Bill. The Minister says that porn sites could simply move their URLs, but that is not a reason not to take a stand by giving the regulator the power that it has asked for and that children’s charities have particularly asked for.

Children’s charities and the regulator have asked for action to solve the problem of needing age verification. That is what the Bill delivers. The question of how to enforce that is incredibly important; there are different considerations to be made, and I think the Bill has ended up with the correct balance.

The BBFC witness explicitly said last week that

“we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator’s arsenal.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q91.]

The BBFC says that notification of payment providers or ancillary services providers and fines may not be sufficient. I appreciate that porn sites might well use different URLs to evade it, but why has the Minister explicitly removed ISP blocking as a further backstop power? We are not talking about blocking too many sites; we have been very clear that it is intended as a backstop power when other measures fail.

David Austin of the BBFC said:

“We see this Bill as a significant step forward in terms of child protection.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 42, Q94.]

We think, on balance, that the regulator will have enough powers—for example, through the provisions on ancillary service providers—to take effective action against non-compliant sites. For that reason, I think this is the appropriate balance and I ask my hon. Friend the Member for Devizes to withdraw her amendment.

I think that we are running through two definitions of ISPs: one relating to ancillary service providers and the other to enforcement and blocking. If we include ISPs in the definition of ancillary service providers, we want to make sure that they are captured, either explicitly or as a service provider. Is the Minister saying that he is comfortable with the enforcement regime without blocking? Would it require further legislation for blocking to be carried out if the regulator felt it was an appropriate measure? Are we ruling that out in this legislation?

Order. The hon. Lady is making a speech. If the Minister wants to intervene, he may.

I thank my hon. Friend for giving way. I would like to provide a point of clarity on the speech she has made. Treatment as an ancillary service provider will not lead to blocking. I think that is the answer to her question.

I thank the Minister for that intervention. We will return to this subject in a series of amendments around clause 20. I want to thank the Minister for clarifying some of the murkiness around definitions in the Bill. I want to ask him and his team, though, to consider what his colleague said, which goes back to the net neutrality point.

I accept what the Minister says about the spirit being absolutely clear, that our current filtering regime will not be captured, but Baroness Shields did say that we needed to legislate to make our filters regime legal. I did not hear from the Minister that that legislation is something that the Department is preparing or planning to introduce.

We very much share the hon. Lady’s concerns that the legislation has explicitly excluded the ability of internet service providers to block. We simply cannot understand why the Government have ruled out that final backstop power. We appreciate that it is not perfect, but it would give the regulator that final power. We will return to new clause 11 at the end of the Bill and push it to a vote when we come to it.

I thank the hon. Lady for making her intentions clear. I am prepared to withdraw or not push my new clause to a vote on the basis of what the Minister said, but I would love to get his assurances—perhaps he will write to me—to be crystal clear on the fact that he believes the Government do not have to legislate in order to push back on the net neutrality regime.

Before the hon. Lady sits down, she did mention the view of Baroness Shields that there should be new legislation. Notwithstanding our remarks about the number of Government amendments, does the hon. Lady believe this Bill could be a useful vehicle to achieve that?

Given the Brexit vote, I would be inclined to accept a letter from the Minister suggesting that we will absolutely resist any attempt to make EU net neutrality apply to what is a very fine, though not perfect, voluntary regime. On that basis, I accept the Minister’s assurances that that is what he intends to do. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 15 ordered to stand part of the Bill.

Clause 16 ordered to stand part of the Bill.

Clause 17

The age-verification regulator: designation and funding

Question proposed, That the clause stand part of the Bill.

In this and related clauses, we seek to strengthen the proposals that the Government have put forward. We have said that the regulation needs to be beefed up to require internet service providers to be notified about non-compliance. We would like to see an injunction power to take down any content which a court is satisfied is in breach of the age-verification legislation, as soon as possible, at the start of the four-tier regulation process the Government have identified in their amendments and letters published to the Committee last week.

That would require a regulator with sufficient enforcement expertise and the ability to apply that injunction and push enforcement at an early stage. As we are aware, however, the BBFC heads of agreement with the Government do not cover enforcement. Indeed, the BBFC made it perfectly clear that it would not be prepared to enforce the legislation in clauses 20 and 21 as they stand, which form stage 4 of that enforcement process, giving the power to issue fines. The BBFC is going to conduct stages 1, 2 and 3 of the notification requirements, presumably before handing over to a regulator with sufficient enforcement expertise, but that has not been made clear so far.

While we welcome the role of the BBFC and the expertise it clearly brings on classification, we question whether it is unnecessarily convoluted to require a separate regulator to take any enforcement action, which will effectively have been begun by the BBFC and which so far has not been mentioned in the legislation. This goes back to the point my hon. Friend the Member for Cardiff West made earlier about the two separate regimes for on-demand programme services.

As I understand it, although it is not clear, the BBFC will be taking on stage 3 of the regulation, meaning it will be involved in the first stage of enforcement—in notification. That is fine, but it will then have to hand over the second stage of enforcement to another regulator—presumably Ofcom. The enforcement process is already incredibly weak, and this two-tiered approach involving two separate regulators risks further delays in enforcement against non-compliant providers who fail to protect children or to take down material that is in breach of the law. In evidence to the Committee, the BBFC said:

“Our role is focused much more on notification. We think we can use the notification process and get some quite significant results.”—[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q83.]

We do not doubt it, but confusion will arise when the BBFC identifies a clearly non-compliant site that is brazenly flouting the law, and it does not have the power to enforce quickly but will have to hand it over.

We would also like to hear when the Government are planning to announce the regulator for the second stage and how they intend to work with the BBFC. As far as I can see, this will require further amendments to the Bill. If it is Ofcom, it would have been helpful to have heard its views on what further enforcement powers it would like to see in the Bill, rather than being asked to fill in after the Bill has passed through Parliament. There is a clear danger that the enforcement regulator could be asked to take over enforcement of age verification, which it thinks requires more teeth to be effective.

We therefore have very serious concerns about the process by which clause 17 will have effect. Although we will not vote against the clause, we want to make it very clear that we would have preferred to see an official announcement about who will carry out the enforcement provisions in the Bill before being asked to vote on it.

The debate on clause stand part is about the set-up of the regulatory structure and making sure that we get designation and funding right. It is our intention that the new regulatory powers and the new regulator or co-regulators will deliver on this. As the hon. Lady says, the BBFC has signed up to be designated as the age verification regulator responsible for identifying and notifying. This will enable the payment providers and other ancillary services to start to withdraw services from sites that do not comply as soon as possible.

In what kind of timeframe does the Minister envisage the payment service providers acting on notification from the BBFC?

We intend formally to designate the BBFC as regulator in autumn 2017 and expect to be in a position to commence the provisions requiring age verification within 12 months of Royal Assent.

That was not quite my question. How long does the Minister anticipate that ancillary service providers or payment service providers will take to act on receiving notification from the BBFC that a site is non-compliant?

I would expect that to happen immediately. The question of the designation of the backstop enforcement regulator does not stop or preclude the BBFC from getting going on this. As we have heard, it is already working to put in place its own internal systems. As I have just said to the Committee, we have a new commitment that we expect to commence the provisions in terms of getting the system up and running within 12 months of Royal Assent; after that, if the BBFC has identified that there is a problem, I would expect action to be immediate, because I expect the BBFC to ensure through good relations that systems are in place.

I see enforcement very much as a back-up to good behaviour. As we have seen with the taking down of child pornography and material related to terrorism, many providers and platforms respond rapidly when such material is identified. It will be far better if the system works without having to resort to enforcement. We will set out in due course who is best placed to be the regulator for enforcement, but the system is new, and the approach provides the level of flexibility that we need to get it right. I have every confidence in the BBFC’s ability and enthusiasm to deliver on these aims, so I commend the clause to the Committee.

Question put and agreed to.

Clause 17 accordingly ordered to stand part of the Bill.

Clauses 18 and 19 ordered to stand part of the Bill.

Clause 20

Enforcement of sections 15 and 19

I beg to move amendment 68, in clause 20, page 21, line 5, at beginning insert

“If the person in contravention of section 15(1) is resident in the United Kingdom,”.

This amendment and amendments 69, 70, 71, 72, 73 and 74 place a requirement on the age-verification regulator to impose fines where a UK person has contravened clause 15(1) unless the contravention has ceased; or to issue an enforcement notice to a person outside the UK who has contravened clause 15(1).

With this it will be convenient to discuss the following:

Amendment 69, in clause 20, page 21, line 5, leave out “may” and insert “must”.

See the explanatory statement for amendment 68.

Amendment 70, in clause 20, page 21, line 7, after “15(1)”, insert “, unless subsection (5) applies”.

See the explanatory statement for amendment 68.

Amendment 71, in clause 20, page 21, line 10, at beginning insert

“If the person in contravention of section 15(1) is not resident in the United Kingdom,”.

See the explanatory statement for amendment 68.

Amendment 72, in clause 20, page 21, line 10, leave out “may” and insert “must”.

See the explanatory statement for amendment 68.

Amendment 73, in clause 20, page 21, line 16, leave out subsection (4).

See the explanatory statement for amendment 68.

Amendment 74, in clause 20, page 21, line 42, leave out “may” and insert “must”.

See the explanatory statement for amendment 68.

This is a series of consequential and probing amendments intended to explore the Minister’s thinking about what the regulator can actually do. At the moment, enforcement operates through a series of financial penalties, which we can discuss further when we debate clause 21, or of enforcement notices. We heard clearly last week from David Austin that the challenge is that almost none of the content-producing sites that we are discussing are based in the UK; in fact, I think he said that all the top 50 sites that the regulator will rightly target are based overseas.

The challenge is how the Government intend to carry out enforcement. I know that the BBFC’s current enforcement role is not carried out through its own designated powers; it is carried out through various other agencies, and the Bill makes further provision for financial penalties. I tabled the amendments to press the Minister on the point that it would be clearer to specify that where a site, or the company that owns a site, is based in the UK, a financial penalty can and will be applied.

For overseas sites, enforcing a financial penalty, if one can even get to grips with what the financial accounts look like, may be difficult, hence the enforcement notice and then a series of other potential backstop actions; I know that the Minister is aware that I do not feel that we have exhausted the debate on blocking. I am trying to probe the Government on whether there is a way to use the Bill to reflect the reality that content providers are unlikely to be based primarily in the UK, and that perhaps a different approach is needed for those based offshore.

We completely support the hon. Lady’s amendments, which propose a sensible toughening up of the requirements of the age verification regulator. We particularly welcome the measures to require the regulator to issue enforcement notices to people outside the UK if they do not comply. That is an attempt to close a large hole in the current proposals. How will the BBFC tackle providers outside the UK?

At the evidence session last week, David Austin said that

“you are quite right that there will still be gaps in the regime, I imagine, after we have been through the notification process, no matter how much we can achieve that way, so the power to fine is essentially the only real power the regulator will have, whoever the regulator is for stage 4”;

we are not yet certain.

He continued:

“For UK-based websites and apps, that is fine, but it would be extremely challenging for”

the BBFC, Ofcom or whoever the regulator is for stage 4

“to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator’s arsenal.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q91.]

That is precisely why we will return to the amendment on ISP blocking, because if we are to pursue foreign-based providers, the ability to block will be integral to that strategy.

I want to state on the record again that we are disappointed that there is no indication in part 4 about the identity of the regulator. The legislation refers to a regulator as though there will be one across all stages of the notification and enforcement process; it has come as quite a surprise to learn that there will be two regulators and that the Government cannot offer the Committee any indication about who they will be.

My hon. Friend is making a series of excellent points which I hope the Minister can answer. We keep discovering that there are gaps, inconsistencies and potential confusion in the Bill. She has referred to the witnesses who gave evidence last week. Does she agree that it is really important that we focus carefully on the gaps that children’s charities such as the NSPCC have identified?

Obviously, I completely agree with my hon. Friend. We appreciate that the Government have consulted extensively with partners and representatives of all the relevant stakeholders, but it is not clear to us why they have not allowed ISPs that ultimate backstop power to block. For that reason, and to meet the objective of tackling providers outside the UK, we support the amendments tabled by the hon. Member for Devizes.

I rise to support the amendments. It will not surprise the Committee to learn that I seek clarity about the impact on Scots law. It comes back to the same point: a lot of the issues that are being wrestled with in this place apply in a different legal jurisdiction. Perhaps the Minister could address that.

I should like to add to the comments made by hon. Friends. My concern is that if there are too many gaps and loopholes in the legislation, that may, perversely, put greater pressures on the enforcement authorities, because they will have to seek out so many different mouse-holes down which some of the content providers may run and disappear. I am slightly concerned and ask the Minister to consider the danger of an unintended consequence, because if it is not possible to stamp out content immediately, vital resources and focus will be diverted.

Does my hon. Friend also agree that with too many loopholes in the legislation, the more responsible providers of content will include age verification measures, but users who want to avoid those tools will be pushed on to more extreme or violent pornography, and perhaps even into the deep web?

Yes. I raised this with the gentleman from the British Board of Film Classification, I believe, and I questioned his assertion about the top 50 websites. He said that the process would not stop there but proceed to the next 50, but if those 50 content providers are constantly moving all over the place, it will be rather like a game of whack-a-mole. Unless we have a sufficiently large mallet to give the mole a whack early on—[Interruption.] This is a serious business, and if I am sounding a bit jocular, that is not meant to take away from the serious issue. If we do not have the tools to address those who are deliberately not complying, and those who do not wish to comply with the regulations that we are putting in place to protect our children, I fear that we will be chasing after them too much.

My hon. Friend the Member for Sheffield, Heeley is right that there will also be the danger that investigative authorities use too many of their resources to go after this, when there are other things they need to go after as well. We need to put the tools at the disposal of the investigative and enforcement authorities, to give them the opportunity to make as clean an attack as possible on the providers that are not complying with the desire of this House.

I will return to the evidence on this point to make clear why I support what the hon. Member for Devizes is trying to do. In his evidence last week, the NSPCC’s Alan Wardle—I think I have got that right—said quite clearly:

“I think that is why the enforcement part is so important…so that people know that if they do not put these mechanisms in place there will be fines and enforcement notices, the flow of money will be stopped and, crucially, there is that backstop power to block if they do not operate as we think they should in this country. The enforcement mechanisms are really important to ensure that the BBFC can do their job properly and people are not just slipping from one place to the next.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 47, Q108.]

So what my hon. Friend the Member for Sheffield, Heeley has just said is summed up very well by the NSPCC in its official evidence, and I hope that the Minister will have an answer for the NSPCC as well as for this Committee.

I am thankful for the opportunity to respond. I will actually respond to the points made about these amendments, which were tabled by my hon. Friend the Member for Devizes, rather than the reiteration of the blocking debate, which we have had and will no doubt have again on further clauses.

First, clause 17 clearly makes provision for the Secretary of State to designate more than one person as a regulator. Secondly—a crucial point—the complexity in regulation is deciding who is satisfying the rules and who is not, and that is for the BBFC to determine, whereas issuing fines is essentially a matter of execution and could be fulfilled by a variety of bodies. We will come forward with more detail on that in due course.

I think the whack-a-mole analogy inadvertently made the point, which is that when we are trying to deal with a problem on the internet, where people can move about, we can deal with the mainstream of the problem, which comes from reliable providers of adult material, who are already engaged and want to ensure they comply with the law. In future, once this measure becomes law, refusing to put age verification on adult material will be illegal, so we will be dealing with illegal activity. That will mean that the vast majority of people will comply with the law, and we heard that very clearly in the evidence session. The question then is how to deal with non-compliance and on the internet we know that that is very difficult. The proposals are to deal with non-compliance by disrupting business models and by imposing financial penalties.

I understand what my hon. Friend is trying to do. She is trying to strengthen the imposition of financial controls. Inadvertently, however, her amendments would reduce the regulator’s discretion by obliging it to apply sanctions when they are available, and they would remove the power to apply financial penalties to non-UK residents.

We want to be able to fine non-UK residents—difficult as that is—and there are international mechanisms for doing so. They do not necessarily reach every country in the world, but they reach a large number of countries. For instance, Visa and other payment providers are already engaged in making sure that we will be able to follow this illegal activity across borders.

Therefore, while I entirely understand where my hon. Friend is coming from, the amendments would inadvertently have the effect of removing the ability to apply an enforcement notice to a UK resident, although I am certain that that is not what she intended. So I resist the amendment but I give her the commitment that we have drafted the clause in such a way as to make it as easy as possible for the enforcement regulator to be able to take the financial route to enforcement.

On the point made by the hon. Member for Berwickshire, Roxburgh and Selkirk, the provisions do extend to Scotland, with necessary modifications to Scottish law. I am sure that he, like me, will have seen clause 17(5) and clause 20(11)(b), which refer to modifications needed to be consistent with Scottish law. On the basis of that information, I hope that my hon. Friend will withdraw the amendment.

I thank the Minister for that clarification and for the mention of support. The intention was to help to provide a practical solution rather than to cut across his aims. He has persuaded me that I do not need to press the amendment to a vote. Although I take the point about shared regulation, I would ask him to consider, in setting up the BBFC as the primary regulator, that while it works reasonably well in the video-on-demand world, this may have it straying into a new sphere of expertise in finding, identifying and sending out enforcement notices or penalties, particularly to foreign-based companies. I think the whack-a-mole analogy is entirely apt—they will shut their doors and reopen in another jurisdiction almost overnight. Given the anonymity principles, it is sometimes almost impossible to know where they actually are. If the Minister is assuring us that everyone is aware of the problem, that he believes the powers allow the regulator to be flexible, and that it is something his Department will consider, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

I beg to move amendment 86, in clause 20, page 21, line 40, leave out paragraph (b) and insert—

“(b) “during the initial determination period fix the date for ending the contravention of section 15(1) as the initial enforcement date.”.

With this it will be convenient to discuss the following:

Amendment 88, in clause 20, page 21, line 40, at end insert—

“(c) after the initial determination period fix a period of one week for ending the contravention of section 15(1)”.

Amendment 89, in clause 20, page 22, line 13, at end insert—

‘(14) In this section, “initial determination period” means a period of 12 months from the date of the passing of this Act to the initial enforcement date.”.

This group of amendments goes even further—they have the straightforward intention of continuing the process of strengthening the powers and, crucially, of speeding up the enforcement period, to help the Government achieve their manifesto commitment. The Bill would give the regulator the power to set a lengthy, if not indefinite, period for ending the contravention of section 15. The amendment would speed up enforcement by requiring the regulator to set an enforcement period of one week. Given that the BBFC is not expected to be the official regulator or to have these powers until 12 months after Royal Assent, we do not anticipate that a one-week enforcement period would be too onerous on content providers.

The group should be seen in tandem with our other amendments providing a backstop power requiring ISPs to block a site, and would send a clear message to content providers that the Government would treat any contravention of section 15 with the utmost seriousness and that continuing to provide content without age verification for a prolonged period of time would not be tolerated. We believe that, if the enforcement powers under clauses 20 and 21 are toughened up, the message will spread throughout the industry, making it clear that age verification is not an optional extra but a central requirement in the effort to tackle what under-18s can see.

I am sympathetic to the purpose of this group of amendments. We think that decisions on when and how to enforce should be left to the regulator, but I see the point of trying to put a week into the Bill. However, it is overly prescriptive to do so in primary legislation. Our aim is for a proportionate regime, where the regulator can prioritise and deal with problems in a way that is aligned with its goals of protection, rather than having to fulfil legal requirements that might lead to unintended consequences.

No, but I cannot—and she cannot—foresee all the circumstances that the regulator will have to deal with. It is far better to have a regulator with flexibility to respond and clear aims and intentions, rather than it having to fulfil an arbitrary timescale because that is in primary legislation.

Can the Minister confirm whether the legislation enables the regulator to set a time limit for enforcement?

Yes, it will allow the regulator that flexibility. I would rather have that flexibility at the level of the regulator than in primary legislation. I think that is a reasonable approach. The regulator will then be able to act in the way that it is clear from this debate is intended. I hope that on that basis, the amendment may be withdrawn.

It is useful to have on the record the Minister’s agreement that one week is a suitable enforcement period. On that basis, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

I beg to move amendment 62, in clause 20, page 22, line 13, at end insert—

“(14) Within 12 months of this Act coming into force, the Secretary of State shall commission a review of the effectiveness of the enforcement of sections 15 and 19 and shall lay the report of the review before each House of Parliament.”

With this it will be convenient to discuss amendment 81, in clause 82, page 80, line 18, at end insert—

“(4A) Part 3 will come into force at the end of the period of one year beginning on the day on which the Act is passed.”

This amendment ensures that Part 3 will be implemented by ensuring the Part comes into effect a year to the day after the Act is passed, rather than on a day the Secretary of State determines through regulations.

It took me a while to get out of my seat: I was astonished that we actually got some agreement there. Perhaps we have a new spirit of progress as we near the end of the day.

I doubt it too, but never mind. It is better to be an optimist, especially on the Opposition Benches.

No comment. Had we made more progress, amendment 62 might not have been necessary, but as I feared, we have not. I am confident that we all agree on the merit of the intent of this part of the Bill. We all want to protect young children from accessing inappropriate pornographic material. I do not want any of my children doing so, and I know how much they use electronic devices. My youngest, Robert, is only seven, and he is phenomenally tech savvy. It would not be that difficult in this world to stray, even with some of the blocking systems that are in place.

A lot of the problems that we have here are to do with international sites. I am dismayed at the Government’s unwillingness to move, or even to listen to Opposition Members, the regulator or charities, who all insist that ISP blocking is the kind of extra measure that we should put in place. Given that broader context and the Minister’s conviction, which I believe is sincere, that he has a package of measures that will work, in light of our concerns and those of many others, a review should be put in place. I know that in the past the answer to anything involving a review has been, “That’s what the Select Committee process is for; they will have a review,” but we should not leave something as important as protecting young children to a Select Committee. The Government should take responsibility rather than abdicate it to a Select Committee. The Government should put ISP blocking in the Bill, show that they treat the issue seriously and have a review to ensure that we get the outcome that we all want: a safer environment for our children on the internet.

Given that the Government have been so intransigent on the sensible suggestions for how their proposals could be strengthened, certainly on the issue of internet service provider blocking, I completely agree with the hon. Gentleman. The Minister keeps saying that he does not want to be too prescriptive, but we argue that the phrase “on a commercial basis” is too prescriptive and limits the powers of the age-verification regulator. Given the broad support for additional powers, we want the age-verification regulator and any other regulator involved in enforcement to come back to the House and tell us what additional powers they need to make this work. There are significant loopholes in the Bill and it could have serious unintended consequences for our young people. We completely support the SNP amendment.

I entirely understand the enthusiasm for commencement, and I have given the commitment that we would expect it within 12 months of Royal Assent. I hope that that deals with the demand for a timing of commencement to be put on the face of the Bill. Unfortunately, that renders the SNP amendment slightly impractical, because it would require a review within 12 months of Royal Assent, but if the Act commences only 12 months after Royal Assent, a review at that point might not show as much progress as we would hope.

I like the way the Minister is engaging. Is he telling me that he likes the idea, but it is just that we have worded it slightly wrongly? If that is the case, I would happily move the review 12 months on, if that is what he is suggesting.

Unfortunately, the hon. Gentleman has lost his opportunity for that because the deadline for tabling amendments has passed. We should have an enduring assessment of the effectiveness of the Bill and an ongoing review of how effective the policy is. Select Committees have an important role to play in doing that. I resist the amendment on the grounds that it is impractical, because of the timings I have discussed, and because it is far better that such matters are reviewed constantly, rather than just on a one-off.

In my experience, ongoing reviews tend to mean never. If you do not have a deadline or target, that gives you the scope just to say, “We are doing it and will carry on doing it for some time,” without there ever being a point at which you say, “Here’s a review.” An annual review is such an easy thing to which to commit; why not do it?

Order. I remind the hon. Lady that I am not going to do anything with regard to the Bill. She should return to using normal parliamentary speech.

We thought you might be the regulator for part 4, Mr Stringer.

I suppose this is the difference between the two sides of the House: for the Opposition, an ongoing review means never; for the Government, an ongoing review means always.

My background is in telecoms, latterly as a global consultant coaching front-line leaders. People always said to me, “Oh yeah, we always have reviews,” but unless there is a cadence on it and it is put down in black and white, it is not done properly. They would not do it in the business world, and Opposition Members would not do it; perhaps Government Members are a bit more blasé than we are.

That tells us all we need to know about consultants. There we are. I commit that we will keep the effectiveness of the legislation under review. I know that that will happen anyway because I know that my hon. Friend the Member for Devizes is not going to let this one go.

We will have a continuous review of the ongoing review. With that, I urge the hon. Gentleman to withdraw the amendment.

Question put, That the amendment be made.

Question proposed, That the clause stand part of the Bill.

I will not test the Committee’s patience further by going over arguments that we have already had, but there is one further area of clause 20 that we wish to touch on—the lack of an appeals process in the legislation. The Minister may expect the regulator to build that appeals process in: it would be helpful to have some clarity from him on that.

As I understand it, the BBFC will use analytics to identify sites that should have age verification. Analytics are not foolproof, so obviously an appeals mechanism will be needed for websites incorrectly prevented from operating. Previous such systems have wrongly filtered out websites such as breast cancer charities or forums for gay and transgender people. That is incredibly important: let us put ourselves in the shoes of a young gay man or woman, growing up in a religious household perhaps, who does not know where to turn to ask the questions that would plague any teenager coming to terms with their sexuality and who seeks refuge and solace in internet forums with other people going through the same issues. As risky as the internet can be, it can also be an incredibly empowering, transformative space that can literally save lives in such situations. Such lifelines must absolutely not be filtered out by ISPs or made subject to age verification; the Bill should include a mechanism that allows for correction when they have been mistakenly identified.

We also need clarification on who will develop the analytics, what data they will be based on, and whether that will be done in consultation with the tech industry. We can only assume that this is an oversight that will be corrected when working out how the regulator is to proceed.

The hon. Lady raises an important point about access to information about sex education, sexuality, abortion and all sorts of things that are incredibly valuable. She is right to draw attention to safe forums. I reassure her that many of the same issues came up with respect to the question of voluntary filtering and, despite what some of those giving evidence said, the incidence of false blocking of such valuable sites is incredibly low. The BBFC as regulator is really good: it is not in the business of defining based on imagery, and it has fairly detailed algorithms. I share her concern, but I want to offer some comfort.

I am grateful. I heard the BBFC or the Open Rights Group say that the incidence was very low, but it would do no harm to build an appeals process into the legislation to ensure that where sites that should not be blocked or require age verification have fallen through the cracks, that can be resolved at the behest of the regulator.

The hon. Lady is absolutely correct that there needs to be an appeals process. That process is provided for in clause 17(4):

“The Secretary of State must not make a designation under this section unless satisfied that arrangements will be maintained by the age-verification regulator for appeals”.

I agree with everything else she said. It is worth remarking on the recent announcement that gay and bisexual men will now be pardoned over abolished sexual offences—that is not in the Bill, so that remark was completely out of order, but I still think it was worth making. Appeals are important; I hope she is satisfied that they are provided for.

Question put and agreed to.

Clause 20 accordingly ordered to stand part of the Bill.

Clause 21 ordered to stand part of the Bill.

Clause 22

Age-verification regulator’s power to give notice of contravention to payment service providers and ancillary service providers

I beg to move amendment 75, in clause 22, page 23, line 28, at end insert “; and

(c) the person has been the subject of an enforcement notice under section 20(2) and the contravention has not ceased.”

With this it will be convenient to discuss the following:

Amendment 76, in clause 22, page 23, line 29, leave out “may” and insert “must”

This amendment places a requirement on the age-verification regulator to give notice to payment or ancillary service providers that a person has contravened clause 15(1) or is making prohibited material available on the internet to persons in the United Kingdom.

Amendment 79, in clause 22, page 24, line 24, leave out “may” and insert “must”

This amendment places a requirement on the age-verification regulator to issue guidance about the services that it determines are enabling or facilitating the making available of pornographic or prohibited content.

New clause 6—

“Requirement to cease services to non-complying persons

‘(1) Where the age-verification regulator has given notice to a payment-services provider or ancillary service provider under section 22(1), the payment-services provider or ancillary service provider must cease the service provided to the person making pornographic material available in the United Kingdom.

(2) A payment-services provider or ancillary service provider who fails to comply with a requirement imposed by subsection (1) commits an offence, subject to subsection (3).

(3) No offence is committed under subsection (2) if the payment-services provider or ancillary service provider took all reasonable steps and exercised all due diligence to ensure that the requirement would be complied with.

(4) A payment-services provider or ancillary service provider guilty of an offence under subsection (2) is liable, on summary conviction, to a fine.

(5) In this section “payment-services provider” and “ancillary service provider” have the same meaning as in section 22.”

This new clause requires payment and ancillary services to block payments or cease services made to pornography websites that do not offer age-verification if they have received a notice of non-compliance under section 22(1). This provision would only apply to websites outside of the UK. This would enhance the enforcement mechanisms that are available under the Bill.

New clause 18—Approval of Age-verification providers

‘(1) Age-verification providers must be approved by the age-verification regulator.

(2) In this section an “age-verification provider” means a person who appears to the age-verification regulator to provide, in the course of a business, a service used by a person to ensure that pornographic material is not normally accessible by persons under the age of 18.

(3) The age-verification regulator must publish a code of practice to be approved by the Secretary of State and laid before Parliament.

(4) The code will include provisions to ensure that age-verification providers—

(a) perform a Data Protection Impact Assessment and make this publicly available,

(b) take full and appropriate measures to ensure the accuracy, security and confidentiality of the data of their users,

(c) minimise the processing of personal information to that which is necessary for the purposes of age-verification,

(d) do not disclose the identity of individuals verifying their age to persons making pornography available on the internet,

(e) take full and appropriate measures to ensure that their services do not enable persons making pornography available on the internet to identify users of their sites or services across differing sites or services,

(f) do not create security risks for third parties or adversely impact security systems or cyber security,

(g) comply with a set standard of accuracy in verifying the age of users.

(5) Age-verification providers must comply with the code of practice.

(6) To the extent that a term of a contract purports to prevent or restrict the doing of any act required to comply with the Code, that term is unenforceable.”

We promised to return to the topic of enforcement and blocking, and we have reached it today. That is very good; it suggests that our progress on the Bill is excellent.

The purpose of these amendments and new clause 6 is to clarify and strengthen the enforcement process. We have already discussed fruitfully how clause 20 will be used, particularly for sites based overseas, and I was reassured by what the Minister said, but I want to turn to the “what ifs”. What happens if the regulator acts, has clarity about whether it is imposing a fine or an enforcement notice, and nothing actually happens—none of the sanctions in the current regime leads to a website imposing age verification? I welcome what the Bill says about involving a direct relationship between not just the regulator and the platform or the website, but the payment providers. As the Minister said, cutting off the business model—the cash flow—is a very effective way of making enforcement happen.

I have a series of questions relating to the process. First, it is not clear when the regulator will inform providers that such a contravention is happening. Some questions were asked about how long it will be and what the time period will be, but when does the regulator actually issue a notice? Amendment 75 states that the regulator has a power to issue a notice under clause 22 when an enforcement notice has been issued and the contravention has not yet ceased. I think websites ought to be given the opportunity to respond to the regulator’s intervention before the payment providers and ancillary services are involved. That process should be very clear. It is the same if we have an issue with service provision at home: we know what our rights are, what period of time we have to complain and what happens when that period expires.

Secondly, as I read the Bill—I am in no way setting myself up as somebody who understands every aspect of the legal jargon—there appears to be no requirement for the regulator to inform the payment providers and ancillary services of a contravention. It may just be implicit, but amendment 76 would make it mandatory for the regulator to inform the payment providers and ancillary services if there were a contravention. I would be interested to hear the Minister’s views on that.

I am pleased that we have returned to enforcement and compliance, and I hope we are going to spend more time on blocking. The hon. Lady’s amendment uses the term “ancillary service provider”, to which she referred earlier. I would be very grateful if she spent some time spelling out in a bit more detail what an ancillary service provider is. Does it include ISPs? I think she alluded to that earlier, but I am not sure. Can she help clear up the confusion with some detail, please?

I apologise if I have caused any confusion. I will let the Minister specify exactly what he thinks. In tabling these amendments, I wanted to ensure that as wide a group of people and companies as possible is involved in doing something we all think is very valuable—implementing these age verification mechanisms. As I read the Bill as drafted, it does not contain a clear distinction between ISPs and ancillary service providers; they are included in the same bucket. I want to clarify that I think that both ISPs and ancillary service providers—in my mind, ancillary service providers are the platforms that we discussed by name earlier—have a duty and a legal responsibility to ensure that the age-verification mechanisms are in place.

The hon. Lady will have to forgive me. We are going to hear from the Minister shortly, but I would like to know if, in her amendment, ancillary service providers definitely include internet service providers. I know it is a difference of just one word, but I would be grateful for her clarification.

I share some of the hon. Lady’s uncertainty—I was going to say confusion, but it is not—about the terminology. Would the definition include, for example, telecoms providers over whose networks the services are provided?

I am perhaps going to let the Minister spell that out exactly. The hon. Gentleman raises a very important point: we all know now that access to internet services is often done entirely over a mobile network. I can again give some comfort on this issue. The BBFC, which is an excellent choice, has worked for many years with the mobile service providers—a witness gave evidence to this effect—so they already offer a blocking service based on the BBFC’s definition of 18-plus and 18-minus material. It is essentially an opt-in service: someone has to say that they are over 18, and checks are carried out. The providers already offer the service, and it seems to work reasonably effectively.

I apologise for inadvertently misleading the Committee—perhaps it reflects some of the confusion in the wording—and I want to be very clear about who we are trying to capture with the amendments. We would all support the idea of spreading the net as widely as possible in ensuring the right behaviour, but it is important to make clear that ISPs are expected, and legally mandated, to carry out the same checks.

Another point I wanted to make with amendment 79 was to ask the regulator to issue guidance on the sort of businesses that will be considered to be ancillary services. The reason for putting that in the Bill is that, as we debated extensively in earlier sittings, the world changes. We had very good debates about why 10 megabits per second might not be appropriate in a couple of years’ time and why the USO as originally construed was laughably small. We all try to do the right thing, but of course the world changes. The reference by the hon. Member for City of Chester to whack-a-mole was interesting. What will the consequences be of implementing the Bill? The UK is a very substantial revenue stream for many websites, and new service models might arise. Someone might be scrutinising the letter of the law and thinking, “We are not captured by this, so we are not captured by these regulations.” Asking the regulator to issue guidance on the types of businesses that will be considered to be ancillary services could future-proof some of the Bill.

I am grateful for the hon. Lady again allowing me to intervene. I apologise for interrupting her sentence; that was not my intention. I am pleased to see her amendments. This discussion is helping me and perhaps all of us to come to some form of understanding. I have a little metaphor in mind. If a cinema was allowing children to see pornography, we would hold the ticket seller responsible, as well as the organisation running the cinema, but not the bus driver who drove the bus the child took to get to the cinema. Does that metaphor help?

It depends whether the bus driver was paid for by the cinema. That is the point. Businesses pop up. There might be a bespoke Odeon cinema. My point is that we need to ensure that the regulator has as much flexibility as possible to respond to changing definitions. The current definition of an ancillary service provider is quite clear, although I would like the Minister to clarify it, but my amendment would try to future-proof the definition.

In raising the issue of whether the bus driver was paid for by the cinema, the hon. Lady has helped me to hit on something else. Are we not considering the role of search engines in this matter and whether they are driving things or complicit? I do not know the answer to that question. She has raised a helpful analogy in response to my analogy.

How long has the Committee got to hear about search engines? The hon. Lady raises a fascinating point. It was through a very strong cross-party effort and with the leadership of the former Prime Minister that we got the search engines to do some compelling things. Let me give her an example. It was clear that search engines in Europe were happy to allow terms to be typed in that could only lead to sexual images of child abuse being returned. I had the important but unenviable job, as the Prime Minister’s special adviser on the issue, of sitting down with the parents of April Jones, the little girl murdered in Wales, and trying to explain to them why, when their daughter’s killer typed in “naked little girls in glasses”, they received an image. It took many levels of conversation, including a personal conversation between me and the head of Google Europe, saying, “How do you as a parent feel about this? I don’t care about you saying ‘We serve up everything at all times’; I don’t care that the search terms themselves are not illegal. What I care about is your duty. You have a duty to do no evil, and in my view, you are breaching that.”

This is why I am so proud of what the Government have done. With all that effort and by recruiting Baroness Shields, who has been a worthy addition, we got the internet service providers not only to not return illegal imagery but, with the help of experts, not to return anything at all to a whole series of search terms that were found to be used by paedophiles in particular. I am sure that the hon. Lady will have seen that the Government then went further. It all comes down to what is legal. Your porn is my Saturday night viewing. [Laughter.] Theoretically.

I may have come up with a Daily Mirror headline. My point is that the whole debate about pornographic material has always ended in the cul-de-sac of freedom of speech. That is why we worked with internet service providers, saying, “Let parents choose. Let’s use the BBFC guidelines. They have years of experience defining this stuff based on algorithms.” It is not for the hon. Lady or me to decide what people should not be viewing; we quite properly have an independent agency that says, “This is appropriate; this is not.”

However, the hon. Lady has eloquently raised the point that for too long, we have treated the internet as a separate form of media. We accept in cinemas, whether or not the bus driver is working for them, that if a film is R18, we are pretty negligent if we take our kids to see it, but we are helped to see that. We do not let our kids wander into the cinema and watch the R18 stuff with nobody stopping them along the way, but for too long, that has been the situation with the internet. The hon. Lady has raised a good point about search engines. I can assure her that the world has changed significantly, certainly in the UK, although other jurisdictions may not have been so influenced.

I should probably declare that prior to becoming an MP, I worked at Google. Does my hon. Friend agree that this is where it becomes complex? A search engine, to use another analogy, is a bit like a library. The books are still on the shelves, but the search engine is like the library index: it can be removed and changed, but the content is still there. That is why we need to do much more than just removing things from the search engine: the content is still there, and people can find alternative ways to get to it. We must do much more.

I defer to my hon. Friend’s knowledge. Of course we all agree that certain instances of countries taking things down are utterly abhorrent; I am thinking of information about human rights in China, or about female driving movements in Saudi Arabia. We do not want to be in the business of over-specifying what search engines can deliver. We have not even touched on Tor, the dark web or the US State Department-sponsored attempts to circumvent the public internet and set up some rather difficult places to access, which have increasingly been used for trafficking illegal material.

We need to keep hold of the search engine issue for a moment, because search engines are part of the process. To restate the bus driver analogy, a search engine is also like a sign saying to adults, and children, “You can go here to see pornography”.

I think we will let the Minister talk about that. Again, think about the practical series of keystrokes. Let us take gambling for a moment. It is quite a good analogy, because we mandated in the Gambling Act 2005 that there should be age verification. The search engine host provides access to a site, and users must go through an age verification mechanism. Age verification is incumbent on the site, and the service provider is legally responsible. I shall let the Minister discuss search engines in his speech.

Finally, from my reading of the Bill, there does not appear to be a power to require the providers or services to take any action. The Government said that because the law is clear about non-compliance,

“we do not think it would be appropriate or necessary to place a specific legal requirement on these payments companies to remove services.”

That is, payment providers are part of the solution but they are not legally mandated to stop payments. I suppose the Government are relying on companies acting on the fact that their terms and conditions require merchants to be operating legally in the country, so if they breach the legislation they are in breach of the laws in the country. Nevertheless, it would be helpful to hear some assurance. Perhaps it is based on responses to the consultation saying that the payment service providers stand by, ready and willing to stop the financial flows, which will be very important in disrupting this business model.

New clause 7 would require payment service providers to act and remove their services from contravening websites, and suggests that if they fail to act they will be committing an offence. With regard to new clause 7, the first line of defence is financial transaction blocking and mandatory blocking—

I rise to speak to new clause 18, which stands in my name and that of my hon. Friend the Member for Cardiff West. I also support the amendments tabled by the hon. Member for Devizes. The Government’s proposals really do rely on an awful lot of good will among all the stakeholders involved in the legislation. It makes sense to create a backstop power for the regulator to require payment services to act should they not do so in the first instance.

New clause 18 comes from a slightly different perspective. It would oblige the age-verification regulator to ensure that all age verification providers—the companies that put the tools on websites to ensure compliance—are approved by the regulator; to perform a data protection impact assessment that they make publicly available; and to perform an array of other duties as well.

The new clause is designed to address some of the concerns about the practicality of age-verification checks, ensuring that only minimal data are required, and kept secure; that individuals’ privacy and liberties are protected; and that there is absolutely no possibility of data being commercialised by pornographers. We raise the latter as a potential risk because the proposals were drafted with the input of the pornography industry. That is understandable, but the industry would have a significant amount to gain from obtaining personal data from customers that might not currently be collected.

As we said earlier, we have full confidence in the BBFC as regulator, but, as with the proposals in part 5 of the Bill, it is vital that some basic principles—although certainly not the minutiae—are put on the face of the Bill. We are certainly not asking anything that is unreasonable of the regulator or the age-verification providers. The principles of privacy, anonymity and proportionality should all underpin the age-verification tool, but as far as I am aware they have not featured in any draft guidance, codes of practice, or documents accompanying the Bill.

The Information Commissioner agrees. The Information Commissioner’s Office’s response to the Department for Culture, Media and Sport’s consultation on age verification for pornography raised the concern

“that any solution implemented must be compliant with the requirements of the DPA and PECR”—

the Data Protection Act 1998, and the Privacy and Electronic Communications (EC Directive) Regulations 2003 that sit alongside it. It continues:

“The concept of ‘privacy by design’ would seem particularly relevant in the context of age verification—that is, designing a system that appropriately respects individuals’ privacy whilst achieving the stated aim… In practical terms, this would mean only collecting and recording the minimum data required in the circumstances, having assessed what that minimum was. It would also mean ensuring that the purposes for which any data is used are carefully and restrictively defined, and that any activities keep to those restricted purposes…In the context of preventing children from accessing online commercial pornography, there is a clear attribute which needs to be proven in each case—that is, whether an individual’s age is above the required threshold. Any solution considered needs to be focussed on proving the existence or absence of that attribute, to the exclusion of other more detailed information (such as actual date of birth).”

The Commissioner made it clear that she would have

“significant concerns about any method of age verification that requires the collection and retention of documents such as a copy of passports, driving licences or other documents (of those above the age threshold) which are vulnerable to misuse and/or attractive to disreputable third parties. The collection and retention of such information multiplies the information risk for those individuals, whether the data is stored in one central database or in a number of smaller databases operated by different organisations in the sector.”

I understand that the Adult Provider Network exhibited some of the potential tools that could be used to fulfil that requirement. From the summary I read of that event, none of them seem particularly satisfactory. My favourite was put forward by a provider called Yoti, and the summary I read describes the process for using it as follows:

“install the Yoti App…use the app to take a selfie to determine that you are a human being…use the app to take a picture of Government ID documents”—

passport or driving licence, I imagine—

“the app sends both documents to Yoti…Yoti (the third party) now send both pictures to a fourth party; it was unclear whether personal data (e.g. passport details) is stripped before sending to the fourth party…Fourth party tells Yoti if the images (selfie, govt ID) match…Yoti caches various personal data about user”

to confirm that they are over 18. The user can then visit the porn site—whatever porn site they would like to visit at that time—and then the

“porn site posts a QR-like code on screen…user loads Yoti app…user has to take selfie (again) to prove that it is (still) them…not a kid using the phone…user scans the on-screen QR-code, is told: ‘this site wants to know if you are >18yo, do you approve?’…User accepts…Yoti app backchannel informs porn site…that user >18yo”

and then the user can see the pornography.

I do not know whether any Committee members watch online pornography; I gather that the figure is more than 50% of the general population, and I am not convinced that hon. Members are more abstinent than that. I ask Members to consider whether they would like to go through a process as absurd as the one suggested.

The hon. Lady has got ahead of the potential Daily Mail headline when the freedom of information request comes in for her Google search history.

I am not convinced that anybody would want to go through a process such as the one I have just described, or even one significantly less convoluted. I suggest that instead they would seek entertainment on a site that did not impose such hurdles. The BBFC in its evidence made the telling point that the majority of the viewing population get their content from the top 50 sites, so it is very easy to target those—we see that entrenched in clause 23. The problem with that, as my hon. Friend the Member for City of Chester pointed out, is that targeting those sites may push viewers to the next 50 sites, and so on. We therefore need to ensure that the process is as straightforward and as minimal as possible.

My concern about users being pushed to the next 50 sites is that those sites are much less regulated, and I hazard a guess that they are much more likely to be at the extreme end of the spectrum.

That is exactly my concern. I imagine that the top 50 providers are not as hardcore, are less extreme and may not include such violent images; as we move on to the next 50, and the 50 after that, there is a danger of images becoming more extreme.

The solution must not result in the wholesale tracking or monitoring of individuals’ lawful online activities or the collection of data with a view to unlawful profiling of individuals. I am not convinced that the BBFC is properly resourced to undertake the significant additional workload, nor am I convinced that the practicalities of the software that have so far been exhibited, or their implications, have been properly worked out.

My hon. Friend is generous in giving way. She is absolutely right about resourcing. I am no technical expert, but does she agree that such a database may be a prime target for hackers unless it is properly resourced and defended?

That is absolutely right, and I will come to that point. We heard evidence from the BBFC that it intended potentially to use age-verified mobile telephony to ensure that sites are properly age verified, but I am afraid that that approach is also flawed. First, there is the obvious issue that there is nothing to stop an underage child using the information attached to that phone—be it the phone number or the owner’s name—to log on and falsely verify. Equally, there are enormous privacy issues with the use of mobile-verified software to log on.

The BBFC said clearly that it was interested not in identity but merely in the age of the individual attempting to access online pornography, but as we all know, our smartphones contain a wealth of information that can essentially be used to create a virtual clone. They are loaded with our internet conversations, financial data, health records, and in many cases the location of our children. There is a record of calls made and received, text messages, photos, contact lists, calendar entries and internet browsing history—the hon. Member for Devizes may want to take note of that—and they allow access to email accounts, banking institutions and websites such as Amazon, Facebook, Twitter and Netflix. Many people instruct their phones to remember passwords for those apps so they can quickly be opened, which means that they are available to anyone who gets into the phone.

All that information is incredibly valuable—it has been said that data are the new oil—and I imagine that most people would not want it to be obtained, stored, sold or commercialised by online pornography sites. The risks of creating databases that potentially contain people’s names, locations, credit card details—you name it—alongside their pornographic preferences should be quite clear to anyone in the room and at the forefront of people’s minds given the recent Ashley Madison hack. I am not condoning anyone using that website to look for extramarital affairs, nor am I privileging the preferences or privacy of people who wish to view online pornography over the clearly vastly more important issue of child protection. However, one consequence of that hack was the suicide of at least three individuals, and we should proceed with extreme caution before creating any process that would result in the storing of data that could be leaked, hacked or commercialised and would otherwise be completely private and legitimate.

That is the reasoning behind our reasonable and straightforward amendment, which would place a series of duties on the age-verification regulator to ensure that adequate privacy safeguards were provided, that any data obtained or stored were not for commercial use, and that security was given due consideration. The unintended consequences of the Government’s proposals will not end merely at the preference, privacy or security issues, but will include pushing users on to illegal or at the very least non-compliant sites. We are walking a tightrope between making age verification so light-touch as to be too easily bypassed by increasingly tech-savvy under-18s and making it far too complicated and intrusive, thereby pushing viewers on to either sites that do not use age verification but still offer legitimate content or completely illegal sites that stray into much more damaging realms. These provisions clearly require a lot more consultation with the industry, and I am confident that the BBFC will do just that, but the Opposition would feel a lot more confident and assured if the regulator were required to adhere to these basic principles, which we should all hold dear: privacy, proportionality and safety.

The hon. Lady rightly gets to the great concern that somehow, in doing something good, an awful lot of concern can be created, and I am sympathetic to her points. I remind her that it is not as if these sites do not know who is visiting them anyway. One of the great conundrums on the internet is that every single keystroke we take is tracked and registered. Indeed, that is why shopping follows us around the internet after we have clicked on a particular site. Unless people are very clever with their private browsing history, the same is the case for commercial providers.

Although the hon. Lady is right to be concerned about the conflation of identity and data, there is absolutely no sense that this information is not already out there. It could be used for malicious purposes, should somebody so intend. I remind her that 86% of the public think that putting in place age verification measures is a good thing. I have always wanted to unleash this country’s technological brilliance in coming up with a system. When we were looking at how to ensure filters are correctly turned off and on by adults, because kids are often more tech-savvy than their parents—we heard about the tech-savvy seven-year-old of the hon. Member for Berwickshire, Roxburgh and Selkirk—and to ensure filter management is done by an adult, we came up with a neat solution. A person has to be over 18 to enter into a contract to have the internet service; therefore, ensuring that emails are sent to the account holder is a way of restoring that loop. Of course, passwords can be shared among families, but really good attempts were made to try to work out who is over 18 in the household.

I am sure the hon. Lady agrees that we do not want the perfect to be the enemy of the good. These are all very important points to make. The BBFC is very experienced, and it ought to be able to design an age verification system that meets her concerns.

I absolutely support the Government’s intention here. We just want to ensure it is done in the right way and balances both sides of the argument. I think it is absolutely right that internet service providers are offering this filter, but does the hon. Lady share my concern that very few families take it up and very many families turn it off?

There are Ofcom data. One of the requirements we asked for was for Ofcom to monitor. Take-up improved, and, as I said, some internet service providers now have an automatic “on” system, whereby a person has to intervene to take the filters off. I am told that only about 30% of families choose to do so. Here is the savvy thing: we all know that people live in households with multiple ages and multiple requirements on the internet, so many ISPs now offer a service that enables people to disable the filters for a period and automatically reinstate them the following day. They do not have to do anything if they want the filters to be in place, but they might want to access over-18 content as an adult.

I want to discuss some of the other issues that have come up in this conversation, in the process of finally speaking about these amendments. Is it in order to do so, Mr Stringer?

It is if it is covered by the amendments and new clauses 6 and 18, but I cannot tell until you start speaking.

Then I will carry on, because it definitely is. I think I misspoke at the beginning when I talked about new clause 7. I was actually referring to new clause 6; it was just my note-taking.

I was trying to ensure that we put in place a series of protections, including enforcement notices that are acted upon, financial penalties that make a difference and the ability to stop income streams moving from the payment providers to the various content providers. I want to press the Minister on the question of blocking, because it comes back to the issue of why anyone would care. If somebody does not respond to an enforcement notice—if, for example, the fine is not sufficient to make them stop—how can it be that we are not considering blocking? Of course, we do that for other sites. I know it is not applicable to every form of illegal content, but I am very struck by copyright infringement, which generates take-down notices very swiftly, and upon which internet service providers and ancillary services act. I would be really interested to hear from the Minister why blocking has been rejected so far. Could it be put in place as a backstop power? I worry that, without it, all of this amazing progress will not have teeth.

It is sometimes said that Parliament skates over matters and does not get under the skin of things, but in the discussion we have just had Committee members displayed a great deal of analysis, experience and wisdom, and our debate on the Bill has been enriched by it. I am very grateful to hon. Members on both sides of the Committee who made very good contributions to help us get this right.

Exactly as the hon. Member for Sheffield, Heeley said, getting this right involves walking a tightrope between making sure that there is adequate enforcement and appropriate access for those for whom it is legally perfectly reasonable to access adult content. We must get that balance right. With that in mind, we have drafted the clauses, particularly clause 22, to allow the regulator to operate with some freedom, because we also need to make sure that, over time, this remains a good system and is not overly prescriptive. It was ironic that in a speech about privacy, the hon. Lady started to speculate about which MPs enjoyed watching porn. I am definitely not going to do that.

The truth is that age verification technology is developing all the time. Online personal identity techniques are developing all the time, and indeed the British Government are one of the leading lights in developing identity-verification software that minimises the data needed for that verification and does not rely on especially large state databases, and therefore does it in a relatively libertarian way, if I can put it that way. Providing for verification of identity, or of age (age without named identity is what is really being sought here, although it is difficult to achieve), is an incredibly important issue. A huge amount of resource is going into getting that right globally, and it ties closely to cyber security and to the data protection requirements placed on any data.

The UK Data Protection Act has a broad consensus behind it and follows the simple principle that within an institution data can be shared, but data must not be shared between institutions. The institution that holds the data is responsible for their safekeeping and significant fines may be imposed for their inadvertent loss. The forthcoming General Data Protection Regulation increases those fines. Rather than reinventing data protection law for the purposes of age verification in this one case, it is better to rest on the long-established case law of data protection on which the Information Commissioner is the lead.

We had a very informed debate on the role of search engines. The regulator will be able to consider whether a search engine is an ancillary service provider. Although we do not specify it, I would expect ISPs to be regarded as ancillary service providers, but that will be for the regulator.

On the names of the payment providers who are already engaged, rather than enforced engagement, we already have engagement from Visa, MasterCard, the UK Cards Association and the Electronic Money Association, and clearly there are a lot more organisations that can and should be engaged.

It is interesting that the Minister feels able to say that he would expect ISPs to be regarded as ancillary service providers, but he did not use the same terminology when talking about search engines. To press him on that, would he expect search engines in some cases, or maybe in all cases, to be considered as ancillary service providers?

I do not draw any distinction between the two, but the decision is for the regulator. The legislation provides that they could be, and it depends on the circumstances whether they would be. Of course, obviously, they play different roles.

Just to clarify, I think the right hon. Gentleman is saying that in making no distinction, he would be able to apply the word “expect” to search engines as well as to ISPs. That is what I was probing him to find out.

I am choosing not to use that word because I want to leave it to the regulator, rather than leaving an implication that it should move one way or the other. The regulator should define what is an ASP according to the legislation.

The Minister is therefore making a distinction between the two. In one case he clearly has an expectation that it will happen, and in the other case he does not. The Committee will be interested to know why he is making that distinction, which he denies he is making, because it is important to our understanding the reluctance in the Bill to involve search engines in some of these regulations.

They should be treated the same in that the same provisions in the Bill should be applied to each, but each performs a different role and ISPs are inevitably more closely connected to the provision of content because the content goes through an ISP, whereas a search engine may or may not be the route through which content is found. For implementation, it is clear that that is for the regulator to decide within the provisions set out in the Bill.

I refer the Minister to the point made by the hon. Member for Devizes, who mentioned the murder of April Jones and the fact that her killer was able to type certain words into a search engine that I cannot bear to repeat. Search engines have the power to change their algorithm—we know they do.

The point that my hon. Friend the Member for Devizes was making is that, owing to her work, the search engines made precisely those sorts of changes on a voluntary basis. At the request of the Government and others, they now undertake millions of changes to their algorithms and millions of take-downs for both child porn and terrorist-related purposes. That system is working well, and it does not need to be underpinned by regulation.

There is then a wider question. I am straying to the limits of order to discuss this, but my hon. Friend very effectively argued the principle that the internet should provide the freedom that it provides within the framework of a regulated structure. We agree with that, and we are providing for some of that regulated structure in this Bill. There is a First Amendment-type argument, if we are thinking about it in an American way, that the internet is free and laissez-faire and that we should not regulate it. There are people who say, for instance, that we should not recreate national jurisdictional boundaries on the internet and that we should not regulate it, that it should be completely free. We reject that argument, which is why we are prepared to introduce legal requirements on age verification for the provision of information over the internet in the UK jurisdiction. We reject the argument because, at a principled level, the freedoms that we enjoy are freedoms that do not harm others, which applies offline just as much as it applies online. Because the internet is relatively new, we are still in the early days of applying such a principle to the internet. That is a much bigger debate than clause 22, and therefore I should not go into further detail.

I believe that the Minister has just answered the question of my hon. Friend the Member for Cardiff West on whether a search engine is an ancillary service provider. The Minister acknowledges that search engines, as well as ISPs, should be considered as such.

All I did was set out the principles behind the Government’s response to the amendments to clause 22. The Committee must know those principles in order to understand the direction that we are taking on regulation.

I will move on to some of the other points that were made. I will respond to new clause 18 and amendment 79.

New clause 18 calls for an age verification regulator to approve age verification providers, and would require the regulator to publish a code of practice. Amendment 79 would require the regulator to publish guidance under clause 22(6), rather than having discretion to publish it. I do not think these measures are necessary, not least because the regulator has the power to publish guidance about the circumstances in which it will treat services as enabling or facilitating, and going further is unnecessary given the BBFC’s commitment to creating a proportionate and robust regulatory regime.

Also, decisions on age verification methods and tools, which have been an important part of this debate, are a very significant part of what we are putting forward. The regulator is required under clause 15 to publish guidance setting out the types of arrangements that it will treat as compliance. Therefore, I do not think that it is necessary to insert such arrangements into clause 22 as well.

Having given that response to the points that were made, I hope that these amendments will be withdrawn, but I thank the members of the Committee for the contributions that they have made in our consideration of these matters.

I thank the Minister for that response. I would have liked to hear him say a little bit more about how the payment service providers are involved in the game and whether we are relying on them to do the right thing because they are large corporate companies, or whether, as new clause 6 proposed, there was an opportunity to strengthen the wording of the Bill.

I apologise; there were so many interesting points made that I did not get to that one.

The provision of pornography without an age verification in the UK will become illegal under this Bill. There is a vast panoply of financial regulation requiring that financial organisations do not engage with organisations that commit illegal activities, and it is through that well-embedded, international set of regulations that we intend to ensure that payment service providers do not engage with those who do not follow what is set out in the Bill. Rather than inventing a whole new system, we are essentially piggybacking on a very well-established financial control system.

That is a very reassuring reply and I thank the Minister for it. We have had a very good debate. I know that his officials will be listening and thinking hard about what has been said, and I do not think it would serve the Committee any purpose to press my amendments or my new clause to a vote.

I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

It was interesting to hear the Minister refer to financial regulations. I was not present on Second Reading because I was not then in the position that I occupy now, but having read that debate I do not believe that there was any such reference. So we would like some clarity on who will be the regulator of the payment service providers and what work has already been done with the Financial Conduct Authority—I assume it will be with the FCA in this circumstance—to ensure that it will be regulating those providers, to make sure that they act with speed and due diligence on receiving notification from the age verification regulator under clause 15.

It is disappointing that the Government do not consider new clause 18 necessary to amend the Bill. I appreciate that the BBFC has been given powers to establish a code of practice, but given the very serious consequences that could result from that not being done correctly, some basic principles need to be embedded into the process, based on the issues that I raised earlier in our discussion.

I will just add that we will return to this issue on Report.

We have been engaging directly with payment service providers, and we will no doubt engage with the financial authorities as and when necessary. Payment service providers can already withdraw services from illegal activity under their existing terms and conditions, so the provision is there for the measures to take effect.

Question put and agreed to.

Clause 22 accordingly ordered to stand part of the Bill.

Clause 23

Exercise of functions by the age-verification regulator

I beg to move amendment 80, in clause 23, page 25, line 1, at end insert—

‘(3) The age-verification regulator must consult with any persons it considers appropriate, about the option to restrict the use of its powers to large pornography websites only.’

This amendment requires the age-verification regulator to consult on whether, in the exercising of its function, it should restrict its powers to large pornography websites only.

With this it will be convenient to discuss new clause 12—Code of practice by age verification regulator

‘(1) The age verification regulator must issue a code of practice giving practical guidance as to the requirements of any provision under this Part of the Act.

(2) The following persons must, in exercising their functions under this Part and in the design and delivery of their products and services, adhere to the code of practice, and ensure that the safety and wellbeing of children is paramount—

(a) relevant persons;

(b) internet service providers;

(c) ancillary service providers;

(d) payment-service providers; and

(e) any such other persons to whom the code of practice applies.

(3) Any code of practice issued by the age verification regulator under subsection (1) above must include standards in relation to the following—

(a) how content is managed on a service, including the control of access to online content that is inappropriate for children, and the support provided by the service for child safety protection tools and solutions;

(b) the assistance available for parents to limit their child’s exposure to potentially inappropriate content and contact;

(c) how the persons specified in subsection (2) above shall deal with abuse and misuse, including the provision of clear and simple processes for the reporting and moderation of content or conduct which may be illegal, harmful, offensive or inappropriate, and for the review of such reports;

(d) the action which must be taken in response to child sexual abuse content or illegal contact, including but not limited to, the co-operation with the appropriate law enforcement authorities;

(e) the action to be taken by the persons specified in subsection (2) above to comply with existing data protection and advertising rules and privacy rights that address the specific needs and requirements of children; and

(f) the provision of appropriate information, and the undertaking of relevant activities, to raise awareness of the safer use of connected devices and online services in order to safeguard children, and to promote their health and wellbeing.

(4) The age verification regulator may from time to time revise and re-issue the code of practice.

(5) Before issuing or reissuing the code of practice the age verification regulator must consult—

(a) the Relevant Minister;

(b) the Information Commissioner;

(c) the Scottish Ministers;

(d) the Welsh Ministers;

(e) the Northern Ireland Executive Committee;

(f) the persons specified in subsection (2) above;

(g) children;

(h) organisations and agencies working for and on behalf of children; and

(i) such other persons as the age verification regulator considers appropriate.

(6) As soon as is reasonably practicable after issuing or reissuing the code of practice the age verification regulator must lay a copy of it before—

(a) Parliament,

(b) the Scottish Parliament,

(c) the National Assembly for Wales, and

(d) the Northern Ireland Assembly.

(7) The age verification regulator must—

(a) publish any code of practice issued under subsection (1) above; and

(b) when it revises such a code, publish—

(i) a notice to that effect, and

(ii) a copy of the revised code; and

(c) when it withdraws such a code, publish a notice to that effect.

(8) The Secretary of State may by regulations make consequential provision in connection with the effective enforcement of the minimum standards in subsection (3).

(9) Regulations under subsection (8)—

(a) must be made by statutory instrument;

(b) may amend, repeal, revoke or otherwise modify the application of this Act;

(c) may make different provision for different purposes;

(d) may include incidental, supplementary, consequential, transitional, transitory or saving provision.

(10) A statutory instrument containing regulations under subsection (8) (whether alone or with other provisions) which amend, repeal or modify the application of primary legislation may not be made unless a draft of the instrument has been laid before and approved by a resolution of each House of Parliament.

(11) In this Part—

“ancillary service provider” has the meaning given by section 22(6);

“child” means an individual who is less than 18 years old;

“Information Commissioner” has the meaning given by section 18 of the Freedom of Information Act 2000;

“internet service provider” has the same meaning as in section 16 of the Digital Economy Act 2010;

“Northern Ireland Executive Committee” has the meaning given by section 20 of the Northern Ireland Act 1998;

“payment-service providers” has the meaning given by section 22(5);

“relevant Minister” has the meaning given by section 47(1);

“relevant persons” has the meaning given by section 19(3);

“Scottish Ministers” has the meaning given by section 44(2) of the Scotland Act 1998;

“Welsh Ministers” has the meaning given by section 45 of the Government of Wales Act 2006.’

This new Clause gives the power to the age verification regulator to introduce a code of practice for internet content providers. The code of practice would be based on existing industry and regulatory minimum standards (such as the BBFC classification system) and require providers to ensure that the safety and wellbeing of children is paramount in the design and delivery of their products and services.

I promise this will be the last time I speak today. I am afraid I have had a slight change of heart. I tabled this amendment in response to the many points raised today about the difficulty of focusing the BBFC’s efforts, given that much of this traffic is not simply going to the larger websites. As we have heard, many other free sites are providing such content. However, on re-reading my amendment, I have decided that it amounts almost to a vote of no confidence in the BBFC’s ability to be flexible, and I would therefore like to withdraw it.

New clause 12 would give the power to the age verification regulator to introduce another code of practice—the Opposition are very fond of them—for internet content providers. [Interruption.] And reviews, we are very fond of reviews.

We have made it clear throughout that we want enforcement to be as tough as possible and for all loopholes to be closed, but we also want to ensure that children are as safe in the online world as they are offline. There absolutely needs to be that parity of protection. That is one reason why we are disappointed, as I mentioned, that these measures came forward in a Digital Economy Bill, where it was incredibly difficult to look at the issues of child protection online in a thoroughly comprehensive way.

The new clause proposes that the regulator should work with industry to create a statutory code of practice, based on BBFC guidelines for rating films and the principles of the ICT Coalition for Children Online. The code would establish a set of minimum standards that would apply consistently to social networks, internet service providers, mobile telecommunication companies and other communication providers that provide the space and content where children interact online.

This is not intended to be an aggressive, regulatory process. We envisage that it will be the beginning of a much broader debate and conversation between regulators and content providers about just how we keep our children safe on the web. This debate will encompass not only ideas such as panic buttons, but education about the online world, which must run in parallel for any process to be effective.

A statutory code would work with providers to lay out how content is managed on a service and ensure that clear and transparent processes are in place to make it easy both for children and parents to report problematic content. It would also set out what providers should do to develop effective safeguarding policies—a process that the National Society for the Prevention of Cruelty to Children has supported.

As I said, this will clearly be a staged process. We envisage that in order to be effective, the development of a code of practice must involve industry, child protection organisations such as the NSPCC and, crucially, the children and families who use online services. But this code of practice would be based on existing industry and regulatory minimum standards and would require providers to ensure that the safety and wellbeing of children is paramount in the design and delivery of their products and services. The new clause would also empower the Secretary of State to make regulations to ensure effective enforcement of the minimum standards in the code of practice.

The online world can be an enormously positive force for good for our children and young people. It makes available a scale of information unimaginable before the internet existed and there is compelling evidence that that constant processing of information will lead to the most informed generation of children the world has known, but it needs to be made safe to realise that potential. The new clause would give assurance to Opposition Members that we will enable that to happen.

I am grateful to my hon. Friend the Member for Devizes for saying that she will not press her amendment and for what she said about the BBFC. Anybody reading the transcript of this debate will see the universal support for the BBFC and its work.

On the point about statutory guidance, through the UK Council for Child Internet Safety we have made guidance available to providers of social media and interactive services to encourage businesses to think about safety by design and help make platforms safer for children and young people under the age of 18. The new clause would make something similar into statutory guidance. I see where the hon. Lady is coming from, but the scale and scope of the internet makes this an unprecedented challenge. Some of the biggest sites have over 2 billion visits per year, and UK audiences make up a very large proportion of those. It would be very difficult to have statutory guidance that would be policeable in any complete way. Rather than statutory guidance that could not be enforced properly, it is better to have non-statutory guidance that we encourage people to follow.

On that point, does the Minister share my concern about the level of discontent among children who try to report content online through social media? Some 26% received absolutely no response at all, and of those who did receive a response, only 16% were satisfied. What more can we do to strengthen that?

I do recognise that. My point is that making non-statutory guidance statutory will not help in that space, but there is clearly much more to do. I hope that, with that assurance, my hon. Friend the Member for Devizes will withdraw the amendment.

I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

This is a very curious clause, which renders much of the well-informed—as the Minister said—and useful discussion that we have had today about enforcement, targeting smaller providers and restricting access across the web completely and utterly redundant. If the clause as I read it goes forward unamended, it will give the regulator the ability to target only the largest providers of online pornography, and perhaps even limit it to targeting them alone.

As we have discussed at length, this is an incredibly difficult area to police, which I appreciate. It is obviously going to be far easier to tackle the 50 largest providers, not least because I assume many of them are already providing some level of age verification and are probably more at the responsible end of online pornography content providers. I would remind the Committee of the Conservative party’s manifesto, which said:

“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material”.

That does not make any reference to commercial providers or whether the provider has a large or small turnover, is on WordPress, Tumblr, Twitter, Facebook or Snapchat. Today’s debate has very much suggested that the role of the regulator will be to focus on those sites that are operated on a commercial basis. Given the Minister’s reluctance to implement internet service provider blocking, I do not believe that the manifesto commitment will be achieved.

My hon. Friend is making a very interesting point. The clause refers to

“a large number of persons”

and

“a large amount of turnover”.

“A large number of persons” might be 1,000; it might be 1 million. Has there been any indication from the Government of what they mean by that?

As far as I am aware, we have had no indication from the Government at all. It would be very interesting to hear the Minister’s comments on that and on why the clause exists at all.

The Minister has been saying at length that he does not want to be too prescriptive to the regulator, but he is putting into primary legislation that the BBFC will be able to target, first and foremost, the larger providers and those that are more easy to target. I would imagine that a regulator in any regulatory system would go after the bigger and less problematic providers before those that are more difficult to tackle—no reasonable person would expect anything different. I find this confusing: why should the provision be in primary legislation, given the Minister’s overtures about not being too prescriptive and giving sufficient flexibility?

The operative word in that manifesto commitment last year is “all”: children will be protected from all harmful sexualised content. I and Members on the Opposition Benches—I can see them shaking their heads—simply do not understand how the clause fulfils that commitment. That is quite apart from understanding what exactly constitutes

“a large number of persons”

among the millions of users, as my hon. Friend the Member for Cardiff West asked. Given that 37% of all net traffic is online pornography of some description, we would be very keen to hear how that number translates into

“a large number of persons”.

Also, what constitutes

“a large amount of turnover”

among the many millions of pornography sites available on the internet is anyone’s guess.

We are very concerned by the intent behind the clause. Was it inserted as a semi-admission by the Government that they will simply be unable to enforce clause 15 on “all” sites, as their manifesto promised, giving them an excuse to wriggle out of their commitment?

I hope I can provide some assurances to the perfectly reasonable questions from the hon. Lady. The clause is not an attempt to wriggle out of our manifesto commitment. We will deliver our manifesto commitment in full, and the Bill does that.

The clause provides discretion for the regulator to exercise its functions in a targeted way. It is needed so that the regulator does not breach its statutory duties if it goes after the big providers first. As it set out in evidence, the regulator wants to go for the big providers first and then move on to the smaller ones. I want to allow that to happen, so we need a clause such as this.

If I am not mistaken, the Minister just said “in a targeted way”. I fail to understand how phrases such as “a large number” or “a large amount” are in any way targeted.

The clause gives discretion to the regulator. Without it, the regulator would not have the vires to distinguish between sites and go after those who do the most harm earliest. It is important that it has the ability to make the legislation work in practice.

That sounds pretty thin. It is almost like saying that the police would be acting in an ultra vires manner if they did not go after murderers ahead of shoplifters in terms of devoting their resources to their duties. Is that really the reason why this provision is in the Bill? If it is, it is a novel innovation by the Minister that is not often seen in legislation setting up a service.

As I have just mentioned in the discussion on the previous clause, some of the biggest sites on the internet have more than 2 billion visits a year. As the hon. Member for Sheffield, Heeley said, many sites are involved. Allowing discretion for a targeted approach is important. The clause also allows the regulator to

“carry out, commission or support…research…for the purposes of exercising, or considering whether to exercise”

the powers. That is important, too, because we want the regulator to have the power to conduct research to inform its views. Both those things are important parts of the execution of age verification.

The Minister said just now that the clause will stop the BBFC—we are to assume that it will become the age verification regulator—from being in breach of its statutory duties if it goes after the largest pornography providers first. Putting aside the analogy that my hon. Friend the Member for Cardiff West made, which was absolutely right, is it not the case that the age verification regulator does not have many statutory duties? That was the whole purpose behind the amendments of the hon. Member for Devizes. The regulator is required only to—well, it is not required to; it may—give notice to any payment services or ancillary service provider. I fail to see how targeting any content provider first, last or in any other way would put the regulator in breach of any requirement under the Bill.

I want to make it clear that it can target in order to work as effectively and as soon as it can. I am slightly surprised to find Opposition Members against that principle.

Part of my reason for withdrawing my amendment was that I was encouraged by the word “principally” on line 35 of this page. It is not a restriction; the regulator certainly has the power under the clause to go after any site. My issue is that there is a worry, although not with this regulator, that success will be defined by the number of websites or the number of enforcement notices issued. It is not about the number of websites; it is about the number of eyeballs going to them, so it is absolutely right that the regulator focuses on larger sites first.

On the basis that I agree with that explanation also, I commend the clause to the Committee.

Question put and agreed to.

Clause 23 accordingly ordered to stand part of the Bill.

Clause 24

Requirements for notices given by regulator under this Part

Question proposed, That the clause stand part of the Bill.

I will speak to the clause, just in case we have an unexpected hiccup. Clause 24 sets out the requirements that apply where the regulator wishes to seek information from, or send a notice of infringement to, an infringing website, payment services provider or ancillary service provider. The requirement is to do so by post or email. We will work with the BBFC in its new role to ensure that the system is effective. Due to the nature of the sector, there will of course be times when notices are not seen or are purposefully ignored. In the case of unco-operative, non-compliant sites, the clause will allow us to disrupt their business regardless, through the withdrawal of supporting services by payment and ancillary providers. I commend the clause to the Committee.

Question put and agreed to.

Clause 24 accordingly ordered to stand part of the Bill.

Clause 25 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Graham Stuart.)

Adjourned till Tuesday 25 October at twenty-five past Nine o’clock.

Written evidence reported to the House

DEB 51 BT Group

DEB 52 British Property Federation

DEB 53 techUK

DEB 54 Virgin Media

DEB 55 DCMS (further amendments)

DEB 56 Adult Providers Network

DEB 57 Administrative Data Research Centre

DEB 58 Information Commissioner (follow-up)

DEB 59 Economic and Social Research Council with input from the Medical Research Council