Motion to Approve
That the draft Regulations laid before the House on 10 October be approved.
Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 38th Report, 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).
My Lords, the Digital Economy Act 2017 introduced a requirement for commercial providers of online pornography to have robust age-verification controls in place to prevent children and young people under 18 accessing pornographic material. Section 14(2) of the Act states:
“The Secretary of State may make regulations specifying … circumstances in which material is or is not to be regarded as made available on a commercial basis”.
In a sense, this is a small part of the legislative jigsaw needed to implement age verification: indeed, it is the last piece. I therefore beg to move that the draft regulations and the guidance published by the British Board of Film Classification, which is the designated regulator in respect of these measures, on age-verification arrangements and ancillary service providers be approved.
I bring to the attention of the House the concerns of the Joint Committee on Statutory Instruments and the Secondary Legislation Scrutiny Committee and thank them for their work. I will address their concerns in a moment and the Motion to Regret later, but before considering the specific points related to this debate, I want to remind the House of why the Government introduced this requirement.
In the offline world, there are strict rules to prevent children accessing adult content. This is not true of the online world. A large amount of pornography is available on the internet in the UK, often for free, with little or no protection to ensure that those accessing it are old enough to do so. This is changing the way that young people understand healthy relationships, sex and consent. A 2016 report commissioned by the Children’s Commissioner and the NSPCC makes that clear. More than half of the children sampled had been exposed to online pornography by the age of 15 and nearly half of boys thought pornography was “realistic”. Just under half wished to emulate what they had seen. The introduction of a requirement for age-verification controls is a necessary step to tackle these issues and contributes towards our commitment to making the UK the safest place in the world to be online. I urge noble Lords, in the ensuing debate, to bear this primary objective in mind and help us ensure the commencement of age verification as soon as possible.
The draft Online Pornography (Commercial Basis) Regulations set out the basis on which pornographic material is to be regarded as made available on a commercial basis. The regulations cover material on websites and applications that charge for access and they also cover circumstances where a person makes pornographic material available on the internet for free but that person receives other payment or reward in connection with doing so, for example through advertising revenue. It was clear from debates during the passage of the Digital Economy Act that it was not Parliament’s intention that social media sites on which pornography is only part of the overall content should be required to have age verification. That is reflected in the draft regulations we are debating today. We have set a threshold to ensure proportionality where material is made available free of charge. Thus there is an exemption for people making pornographic material available where it is less than one-third of the content on the website or application on which it is made available. This will ensure that websites that do not derive a significant proportion of their overall commercial benefit from pornography are not regarded as commercial pornographic websites. However, should such a website or app be marketed as making pornographic material available, a person making pornographic material available on that website or app will be considered to be making it available on a commercial basis, even if it constitutes less than one-third of the total.
This is a proportionate way of introducing a new policy. I am confident that these measures represent the most effective way of commencing this important new policy, but my department will, of course, keep it under review. Indeed, the Secretary of State must report on the regulatory framework within 12 to 18 months of commencement. In addition, the upcoming online harms White Paper will give us an opportunity to review the wider context of this policy.
We have also laid two pieces of BBFC guidance: the Guidance on Age-verification Arrangements and the Guidance on Ancillary Service Providers. The guidance on AV arrangements sets out the criteria by which the BBFC will assess that a person has met the requirements of Section 14 of the Digital Economy Act to ensure that pornographic material is not normally accessible by those under 18. The criteria mandate: an effective control mechanism at the point of access to verify that a user is aged 18 or over; strict requirements on age-verification data; a requirement to ensure that “revisits” do not allow automatic re-entry; and prevention of non-human operators exercising the age-verification regime. The BBFC also provided examples of non-compliant features to help interested companies. The latter guidance provided a non-exhaustive list of ancillary service providers that the BBFC will consider. This list is not exhaustive, to ensure that this policy remains flexible to future developments. The BBFC published draft versions of both pieces of guidance and ran a public consultation for four weeks on the content. The draft guidance laid before this House takes account of comments received from affected companies and others.
I turn to the views of the JCSI, to which I referred earlier. We have been clear that although it will be a major step forward, age verification is not a complete answer to preventing children viewing online pornography, and we know that we are doing something difficult. Indeed, we are the first country anywhere in the world to introduce such a measure. We have considered the JCSI concerns carefully. We do not believe that the variation in the language of the legislation, between “met” and “applied”, will be difficult for a court to interpret. As for the committee’s concerns about the content threshold, the committee anticipates difficulty with the application and interpretation of the regulation. As I have already said, the regulation will not apply in a case where it is reasonable for the age-verification regulator to assume—those words are important—that pornographic material makes up less than one-third of the content. As is stated in the BBFC guidance, the BBFC will seek to engage and work with a person who may be in contravention of the requirement before commencing enforcement action.
I am aware that the committee has also drawn the special attention of both Houses to these two draft pieces of guidance, because in its view they fail to contain the guidance required by Section 25(1) of the 2017 Act and contain material that should not have been included. Section 3, paragraph 5 of the Guidance on Age-verification Arrangements sets out the criteria on age-verification arrangements which the regulator will treat as complying with age verification. The guidance then goes on, in paragraph 6, to give examples of features which, in isolation, do not comply with the age-verification requirement. This approach ensures fairness. It takes a product-neutral approach and, rather than recommending a particular solution, sets out principles to encourage innovation. The ancillary services providers’ guidance provides a non-exhaustive list of classes of providers which the age-verification regulator may consider as within scope in Section 3, paragraph 3. However, in order to ensure that this policy remains flexible for future developments, it is necessary that this is a non-exhaustive list. Where new classes of ancillary services appear in the future, the BBFC’s guidance explains the process by which these services will be informed.
The guidance includes additional material, as this is a new policy, and the regulator considered that it was important for stakeholders that its guidance set out the wider context in which the age-verification regulator will carry out regulation. This includes valuable guidance on matters such as the BBFC’s approach and powers, and material on data protection. We find it somewhat perverse that the regulator should be prevented from including helpful guidance simply because it was not specifically mentioned in the Act.
We are also aware of the Secondary Legislation Scrutiny Committee’s special interest report. That committee raised some similar concerns to the JCSI; for example, on the content threshold and the requirements in the BBFC’s guidance. The responses to the concerns of the SLSC on these points are the same as the responses we have just given to the JCSI reports.
However, the SLSC also suggested that the House might want to ask what action the Government would take to tackle pornographic material available that does not fall within the criteria set out in the regulations. I appreciate that some pornography is available by means not covered by our regulations. This was the subject of extensive discussion during the passage of the Act. In particular, concern has been expressed about social media platforms. We expect those platforms to enforce their own terms and conditions and to protect children from harmful content. Indeed, the Government have been clear that online platforms must do more to protect users from such harmful content. We will set out our plans for new legislation to ensure that companies make their platforms safer, in the forthcoming online harms White Paper.
I recognise that age verification is not a complete answer but I am proud that this Government are leading the way internationally in our actions to protect children online. I beg to move.
My Lords, I am pleased to speak in general support of the regulations and guidance. They relate to matters which I and others raised during the passage of the Digital Economy Bill in 2017 and, more broadly, to issues debated by the House a couple of years ago in a balloted debate that I introduced. The subject of that debate was the impact of pornography on our society. While there was some disagreement over the impact of pornography on adults, there was virtual unanimity that children needed to be protected from pornography—as far as this could reasonably be achieved. I seem somehow, by default, to have become the episcopal expert on pornography. I am trying to live that down. It is just the way it has fallen—although I often find myself talking from these Benches about things I have not had much experience of.
The regulations deal with protecting children through the introduction of robust age-verification procedures for accessing at least some pornographic sites. I welcome them but I note that there remains good evidence for believing that adult access to pornography is also often harmful. The recent report on sexual harassment by the Women and Equalities Select Committee in the other place made this point in a new context, particularly in relation to violent pornography. My welcome of the regulations and guidance is also tempered by some questions which they pose, and which I would like to put to the Minister.
My main concern relates to access to pornography on websites that do not charge for access. Provided their pornographic content is limited to one-third of their total content, they are exempted from the regulations. They may not charge but they may make money from advertising and other sources. What is the rationale for choosing one-third and not, say, 10%? Parents really do not want their children to stumble across online pornography and arguably children are more likely to do that if it is a website that does not charge in the first place. Why is it one-third? I realise that enforcement against every site would be a challenge, but surely the obligation to use access by age verification should be on all sites which promote pornography. What we need is a culture change in relation to child protection and not a partial, piecemeal and limited approach, which I fear these regulations, in some respects, provide.
In this spirit, I also point out that, as the Minister said, social media platforms such as Twitter are not within the scope of the regulations. I am not a tweeting Bishop. It is not something that I engage with much so I speak in some ignorance of just how Twitter works on the grand scale, but I note that just a month ago, a Member of your Lordships’ House, the noble Baroness, Lady Kidron, stated that 500,000 pornographic images were posted daily—yes, daily—on Twitter; that was on 12 November at col. 1766. Now, 500,000 is a round figure. I do not know what the figure is but clearly if it is anything like that, there is a whole world there of the promotion of pornography which these regulations do not catch. I believe we will need to return to the role played by social media platforms in conveying pornography. They may be difficult to regulate. It is a difficult and complex area, as the Minister said, but that is not an excuse for not trying our best. I believe there is more that we will need to do.
I welcome the regulations and the guidance, as far as they go, but I am sure there are related issues to which we will need to return in due course.
My Lords, I know that the Minister has carefully considered the definition of “commercial pornography”, and I am grateful that he has engaged with my comments on previous drafts of the regulations and that we have met in person to discuss these. Further to those conversations, I am happy to say that I support the regulations and the guidance, and certainly encourage other noble Lords to do the same, although I have a number of concerns I would like to highlight.
First, I note that it has taken more than 18 months since Third Reading to get to the point where this House and the other place are considering the regulations to determine what is deemed commercial pornography and the regulator’s guidance on age verification. I hope the Minister can assure us that the full implementation of age verification for pornographic websites is now very close. Indeed, it would be even better if he could tell the House when he expects it to be operational.
Secondly, I note that in its report on the Bill, Sub-Committee B of the Secondary Legislation Scrutiny Committee said that the measures available to the BBFC, as the age-verification regulator, should be applied “fairly and transparently”. I certainly hope that they will be. To this end, I ask the Minister to place a letter in the Library nine months after age verification goes live, with an update on the number of websites with AV in place and how many enforcement actions have taken place. I hope that that will be possible.
Thirdly, I cannot address the regulations and guidance that will help give effect to Part 3 of the Digital Economy Act without reflecting on the fact that, thanks to amendments introduced by your Lordships’ House, Part 3 will no longer address some very serious problems as effectively as it would have done. When Part 3, as amended, is implemented, there will be nothing in it to prevent non-photographic and animated child sex abuse images, which are illegal to possess under Section 62 of the Coroners and Justice Act 2009, being accessed behind age verification. This is a serious problem. In 2017, 3,471 reports of alleged non-photographic images of child sexual abuse were made to the Internet Watch Foundation, but since none of these images was hosted in the UK, it was unable to act.
Of course I appreciate that technically the amendments to the Digital Economy Bill, which removed from the regulator the power to take action against such material when it is behind age verification, did not have the effect of legalising possession of this material. The 2009 Act remains in place. However, as virtually all this material is beamed into the UK from other jurisdictions, the arrival of the Digital Economy Bill in your Lordships’ House meant that for the first time we had a credible means of enforcing that law online. There is no need for a regulator to be in the same jurisdiction as a website that it determines to block.
As I said at the time, greeting the first really credible means of enforcing that law online by removing the relevant enforcement mechanism from key parts of the Bill inevitably called into question our commitment to the law. I appreciate that there is arguably a mechanism for trying to enforce the law: the National Crime Agency can work with overseas agencies if websites with this material are identified. However, the mechanism is slow and expensive, and it remains unclear how it can have any effect if the domestic laws of the countries in question permit non-photographic child sex abuse images. To this extent, it was no surprise to me that in response to a Written Parliamentary Question in September 2018, the Government were unable to say whether the NCA had taken action against any websites, or whether any sites had been removed by overseas jurisdictions. ComRes polling published in the summer shows that 71% of MPs think that the regulator should be empowered to block these sites. Only 5% disagree.
The other loophole, of course, relates to all but the most extreme forms of violent pornography. Given that under the Video Recordings Act 1984 it is not legal to supply this material, it was entirely proper that the Digital Economy Bill, as it entered your Lordships’ House, did not accommodate such material. However, amendments were introduced in this House to allow it behind age verification. As I observed at the time, this sent out the message loud and clear that violence against women—unless it is “grotesque”, to quote the Minister on Report, at col. 1093—is, in some senses, acceptable.
My concerns about the impact of such material remain and have been mirrored by those of the Women and Equalities Select Committee in its report, which I referred to earlier. Of great importance, it states:
“There is significant research suggesting that there is a relationship between the consumption of pornography and sexist attitudes and sexually aggressive behaviour, including violence. The Government’s approach to pornography is not consistent: it restricts adults’ access to offline pornography to licensed premises and is introducing age-verification of commercial pornography online to prevent children’s exposure to it, but it has no plans to address adult men’s use of mainstream online pornography”.
I appreciate that we cannot deal with these problems today. The Government must, however, urgently prioritise how to address them. They could deal with the matter very quickly if they were to make time for my very short two-clause Digital Economy Act amendment Bill, which addresses the matter in full. With these caveats, I warmly welcome the regulations and the guidance.
My Lords, I welcome the Government’s decision finally to lay this guidance and the regulations for the House’s approval. It has not come a moment too soon. As the Minister knows, I have been concerned for some time that we should progress implementation of Part 3 of the Digital Economy Act and stop dragging our feet while harm is being done to our children. Almost every week, I hear of cases of children as young as four experiencing the traumatic horror of accidentally discovering pornographic material online. This can be devastating for young minds, causing them anxiety and depression.
This is ground-breaking child protection legislation and we should be proud, because it will be the first of its kind in the world. The UK is leading the way in online safety and setting an example for other countries that are looking to introduce similar controls. We can demonstrate that it is possible to regulate the internet to ensure that children can be protected from online pornographic material that we would never let them near in the offline world.
There is an abundance of evidence showing how harmful this material can be and, significantly, that children often do not seek it out but stumble across it. Research by the NSPCC found that children are as likely to stumble across pornography by accident as to search for it deliberately. Also significantly, the NSPCC reports that children themselves support age-verification. Eighty per cent of young people felt that age-verification is needed for sites that contain adult content.
The age-verification regulator, the British Board of Film Classification, has been working on implementing the legislation for a number of months and has kept me briefed on its progress. I am confident that it will successfully deliver age-verification in the UK to prevent children stumbling across and accessing pornography. Its guidance sets out principle-based standards which will encourage even more innovation and allow for new means of age-verifying consumers in the future. This is important because if this regime is to work, age-verification needs to be robust and privacy must be protected.
My concern, as always, is with child protection, but I recognise the need to ensure that this regime is seamless enough to prevent commercial incentives to avoid compliance. For this reason, I am pleased that the BBFC has said in the annex to the guidance that it intends to introduce a voluntary scheme to bring in a higher privacy standard than the GDPR—which is already of a high standard.
I would like the Minister to reassure us that this scheme will be in place shortly and that the Government will fully support it. It is most important that, as the age-verification regulator, the BBFC will have a range of enforcement powers, including requesting ancillary service providers and payment service providers to withdraw their services to non-compliant websites, and instructing internet service providers to block them. These powers should be highly effective in achieving the legislation’s objectives and should be used as swiftly as possible to encourage compliance. I ask the Minister: how will the Government encourage ancillary service providers, who can only be “requested” to take action, to co-operate fully with the BBFC? I have been told by the BBFC that PayPal, Visa and MasterCard have already indicated that they will withdraw services where there is non-compliance. I also welcome the support that I understand will be given by the ISPs and mobile network operators. Their role will be crucial.
While I welcome the regulations and guidance before us today, I want to put on record my great sadness that when Part 3 is implemented under their direction, this will not block non-photographic child sex abuse images, which it is illegal to possess. The vast majority of non-photographic child sex abuse images accessed within the UK, which can be computer generated and incredibly lifelike, are beamed into Britain from websites in other jurisdictions beyond the easy reach of UK law, or indeed the Internet Watch Foundation. Instead of seizing what was an effective enforcement mechanism for our legislation on non-photographic child sex abuse images, sadly, amendments were put forward to prevent the regulator using the one enforcement tool which could work in addressing this problem: IP blocking. Rather than waiting for the review of the terms of Part 3 in a year or 18 months’ time, I urge the Government to take action now by giving time to support the very short Digital Economy Act amendment Bill of the noble Baroness, Lady Howe. It has only one or two clauses—we might discount the clause dealing with extent—and could address this issue.
I am also disappointed that the regulations before us do not cover all online pornography, for example on social media and through search engines. There is plenty to be seen out there. I understand that the BBFC is set to report back to Government 12 months from these regulations entering into force, and annually thereafter. I believe it may recommend alternative or additional means of achieving the legislation’s child protection objectives. I look forward to seeing what recommendations it makes to protect children further online and to help make the UK the safest place for children to be online. I can assure your Lordships that I will be monitoring its progress closely.
I would welcome an assurance from the Minister that the Government will take a flexible and proactive approach if changes need to be made to reflect changes in technology or behaviour. This is such an important child protection measure that we must all give it the support it deserves. Children should not be able to access pornographic material. While this law might not stop all access, it will act as a speed bump to save a child’s mind from being blighted for life. I often visit Rye Hill prison near Rugby, which houses only sex offenders—over 680 of them. Many of them say to me that they wish they had not been exposed to pornography at an early age, and would do anything to prevent children today being able to do the same and thereby becoming addicted to porn.
This has been like running a marathon but we are nearing the finish line and taking huge strides in the right direction, as this sets a fundamentally important precedent in online regulation. As I always say, childhood lasts a lifetime, so let us give children happy memories as they go forward.
My Lords, I first apologise to the House that I missed the first two minutes of the Minister’s contribution. I would like to make some comments. Some have already been made by other noble Lords, and I hope that repeating them will not lessen their impact. I am pleased to join other noble Lords in supporting the Government on bringing these regulations and the guidance before the House to implement age verification. However, I have some questions about implementation.
First, I note that the regulations apply to all pornographic websites that charge a fee or, if access is free, where there is benefit in some other way from pornographic content, perhaps through advertising. In cases of the latter, at least one-third of the site’s content must be pornographic for it to be required to provide AV, unless the website specifically markets itself as providing pornographic content, in which case AV requirements apply regardless of how much pornographic material is made available. This arrangement has caused Sub-Committee B of the House of Lords Secondary Legislation Scrutiny Committee to ask two key questions, which I put to the Minister today. First, how will the BBFC measure pornographic content on a free website so that it can come to a determination that one-third of the content is pornographic? Secondly, how will we protect children from pornography on free websites where less than a third of the content on the site is pornographic?
It seems to me that, in introducing this legislation, the Government have very properly recognised that it is not appropriate for children to stumble upon online pornography; they should be protected from this material through age verification. Having conceded this point, however, what justification can there be not to protect children from accidentally stumbling across pornography on a free site where 30% of the content is pornographic? Is there not a sense in which children are more likely to stumble accidentally on pornographic content located on websites with other content than on a site that is completely focused on providing pornography?
Turning to the guidance document, I note that page 9 suggests that age verification may not need to be conducted every time someone visits a website. Does this mean that if a child uses a computer that has previously been age verified by a parent, they will automatically be able to access adult sites without any further checks to establish that the computer is being used by an adult? What protections will be applied to prevent this happening? Moving on to page 7 of the guidance, I note that a website found to be in breach of the age-verification requirements will be given a “prompt timeframe for compliance”. However, what does “prompt” mean in practice? Will a website be required to rectify the deficiency within a day, a week or a month, or maybe longer? I hope the Minister will be able to make that clearer.
One of the enforcement mechanisms that has caused some questions is the ability to issue fines. At Second Reading of the Digital Economy Bill, almost two years ago to the day, I raised some practical questions about how fines would work in practice, as many of the sites are based overseas. I remind your Lordships that, when that Bill was in Committee in the other place, the Government said that it was possible in some circumstances to fine sites in other jurisdictions. They said:
“We want to be able to fine non-UK residents—difficult as that is—and there are international mechanisms for doing so. They do not necessarily reach every country in the world, but they reach a large number of countries. For instance, Visa and other payment providers are already engaged in making sure that we will be able to follow this illegal activity across borders”.—[Official Report, Commons, Digital Economy Bill Committee, 20/10/16; col. 217.]
I tabled some probing amendments in Committee here on 2 February 2017 about the use of fines, which I was then concerned would have “limited utility”. I can conclude only that the Government have come to the same view, as they are not proposing to bring those parts of the Act into effect. However, given that at the time of the Bill the Government were adamant that the ability to fine was needed, I hope there will be an analysis of the effectiveness of the other enforcement mechanisms going forward and that, if there is a gap that could be met by fines, the Government will bring the fining provisions into effect and designate a regulator to collect the fines—something the BBFC is not designated to do.
Finally, I echo what other noble Lords have said about non-photographic child sex abuse images, which it is illegal to possess under the Coroners and Justice Act 2009, and all but the most violent pornography, which it is illegal to supply under the Video Recordings Act. As I said at the time, I believe that a terrible mistake was made in moving amendments to prevent the regulator blocking this illegal content. I am pleased that Section 29 of the Act requires a review of the Part 3 definitions 12 to 18 months after the implementation of that part, which could make good this shortfall. This delay, however, is too long. I call on the Government to address this shortfall in the new year by making time for the Digital Economy Act amendment Bill proposed by the noble Baroness, Lady Howe. It is a very short Bill, the substance of which is all in a single clause. Crucially, however, that clause addresses all the presenting issues. I very much hope that the Government will seize this opportunity.
My Lords, I have some concerns about the regulations before us, although I have to say that the Government are trying to do something that is very difficult, if not impossible, which is to regulate the internet. I am perhaps not as enthusiastic as my noble friend Lady Benjamin on this.
These regulations do not adequately protect children from pornography or, to use the Minister’s words in his introduction, they are not the complete answer. No system of age verification can do that because, as other noble Lords have said, pornography is available on sites other than commercial pornography sites and the potential controlling measures could not in reality be used for those other sites. Asking UK internet service providers to block Tumblr and Twitter is not a runner in the real world, and these are free-to-use services, so asking financial institutions not to take payments for non-compliant sites would not work either.
In fact, using a virtual private network to appear to be in a country that does not have age verification is a free and easy way to get around any age-verification process. As the noble Lord and my noble friend said, we are the first country to try this and therefore there are plenty of countries that one can pretend to be in in order to get round the system.
Age verification without statutory guidelines to protect the privacy of adults seeking to access legal pornographic material on the internet is a significant threat to people’s privacy. Having said that, the arrangements that the British Board of Film Classification has made around a voluntary code, under which certification is given to companies providing age verification to give people some confidence that their privacy will be protected, are a second-best but welcome measure.
These measures, in addition to creating risks to privacy and failing to thwart curious and determined young people, are of use in preventing children accidentally stumbling across pornography, as my noble friend Lady Benjamin said, but only on commercial porn websites. Does the Minister feel that this could as easily be achieved by making it mandatory for websites where pornography makes up one-third or more of the content to display “Adult only” warnings before the internet browser can access pornographic images, as opposed to an age-verification system?
The other use of these regulations is an attempt to restrict access to extreme pornography, which is a bit like Brexit—it is not enough for some and goes too far for others, as the noble Baroness, Lady Howe of Idlicote, indicated. I am not sure that, as some have suggested, age verification limits access to educational LGBT+ resources; I am not sure that commercial porn sites contain such beneficial information. As I have said before and will say again, what is really needed is compulsory, age-appropriate, inclusive sex and relationship education for all children, including telling even very young children what they should do if they encounter online pornography—that is, to turn off the computer immediately and inform a parent or guardian—as that is unfortunately something that will inevitably happen despite these regulations.
My overall message is that we should not delude ourselves that these measures are going to be wholly effective in preventing children viewing online pornography, or that they will adequately protect the privacy of adults seeking to access legal material on commercial porn websites. As a result, we should be careful that we do not lull ourselves into a false sense of security simply by passing these measures.
My Lords, I want to say a few words before the summing up. We need to remind ourselves that the purpose of these regulations is to protect children, including those coming up to adulthood. We are trying to prevent them thinking that some fairly unsavoury habits that are not medically good for them are normal. That is the challenge. These websites have teaser adverts to try to get people drawn into pornography sites to buy harder-core or more detailed pornography. We are not trying to do anything about people who are willing to enter into a payment arrangement with the site but to make sure that children are stopped at the front end and are prevented from seeing the stuff that will give them the wrong impression about how you chat to a girl or a girl chats to a boy and how you behave with members of the same sex or the opposite sex in a sexual relationship. We need to be quite quick on this sort of stuff because if we are going to try to stop this being widespread we need to block it.
There is an awful lot of guff in this. It has taken a long time for these regulations to get here—we really expected them about a year ago. I do not know what DCMS has been doing during this time. I know it had some draft guidelines a long time ago, but perhaps they were so young that they were uneducated too and tried to learn about these things—I do not know.
The point about the adverts is they sit there in front. We are probably going to have buttons on the front of the website stating that people have to verify their age. That will take people off, probably to third-party sites which know them and anonymously verify that they are over 18 and that is when they can get into the website. However, the website is going to want to put something up for that first encounter. I wonder whether this is not an opportunity to think positively and perhaps put up something about understanding the beginning of a relationship and how you can get excited and go forward without going to the harder aspects which involve penetrative sex et cetera. There may be an opportunity there. That is a bit of a red herring because we are talking about the regulations, but it may be a positive thought for the future.
The thing that worries me particularly is paragraph 2.5 of the BBFC guidance, which refers to sites that are,
“most frequently visited, particularly by children”,
or,
“most likely to be sought out by children”.
Social media may not be marketed as carrying or giving access to pornography, but it does so on a huge scale. This one-third rule is very odd because it is easily abused. There are about 39 million UK users of Facebook, so do we say that if 12 million are putting up pornography that is okay because it is under the one-third threshold? Earnings would be very hard to measure, given Facebook’s turnover, so how are we going to do the one-third? It is very odd. The purpose of this is to protect children, so I do not think we should be having very high thresholds to let people get away with it.
There are two things that really worry me. Paragraphs 2.6, 2.7 and 2.8 of the guidance are on enforcement. It is going to be very slow. By the time the BBFC has sent out a warning and it is received, given another notification, published this, waited for the website to write back, et cetera, how long will it take? Websites that want to get round it will game the system. If they start doing that, the big websites—believe it or not, they are on side with this and want to help, because they have teenage children of their own and are not paedophiles but are trying to sell adult pornography to adults—will lose too much business; they will have to go with the flow and play the same game, in which case the whole thing will get wrecked.
If the Internet Watch Foundation, without a true legal basis, can get sites blocked immediately, why can we not do the same with proper law behind us? Everyone has had warning of it. The whole of the industry around the world has apparently been talking about it for the past year. The BBFC has spoken at such events. Everyone knows, so I cannot understand why we cannot act more quickly and go live from day one. If anyone does not comply, that is bad luck. We could set up some pre-notification stating: “If you do not comply by tomorrow, you have had it”.
The other matter is the certification scheme, which is voluntary. A big hole is that because this is under a DCMS Bill, it could not touch privacy and data security. That is an ICO responsibility. The security of people’s data is regulated elsewhere, and the ICO has only recently started to show an interest in this, because it is overloaded with other things. There is now a memorandum of understanding between the BBFC and the ICO, which is very good. They could be brought together in a certification scheme. The BBFC cannot enforce data security and privacy, because that is an ICO responsibility, but a certification scheme could state that a site cannot be certified unless it complies with all the legal standards—both the Data Protection Act 2018, which the ICO is looking at, and the BBFC rules on age verification for websites and providers. That could be good.
If your Lordships want to know how to do it, I fear I shall give a plug for the British standard for which I chaired the steering group, BS 1296; it includes a whole section on how to do the GDPR stuff, as it was then called. We could not mandate it in the British standard because other standards mandate it, but that tells you how to do it.
The certification needs to be clear, otherwise there will be a whole lot of wishy-washy stuff. I am not sure that a voluntary scheme is a good idea, because the BBFC will have a lot of hard work trying to check sites that decide not to comply, so it will have to certify them by another method. That will be difficult.
However, at the end of the day, there is a lot of willingness between all the parties to try to get this to work. The world is watching us—quite a few other countries are waiting to see whether this will work here. That will help enormously. We should try to get a lot of cross-stakeholder information and co-operation, a round table of all interested parties from child protection all the way through to those running the adult sites. Perhaps some good could come out of that. Certainly, everyone wants to help the BBFC and DCMS, the parent body. Everyone wants to help the ICO. We would like to get this to work: there is a lot of good will out there if only we could get moving to make it work properly.
My Lords, we on these Benches want the regulations and draft guidance to come into effect. The child protection provisions are a significant element of the Digital Economy Act which, although not entirely in line with what we argued for during its passage, we supported in principle at the time and still do, while realising, as my noble friend Lord Paddick said, that they are not the conclusive answer to children’s access to pornography. As he also said, a number of areas need to be addressed in the course of today’s debate.
For a start, as several noble Lords said, it seems extraordinary that we are discussing these sets of guidance nearly two years after the Digital Economy Act was passed and nearly a year after the Government published their guidance to the regulator, the BBFC. What was the reason for the delay?
Next, there is the question of material that falls within the definition of being provided on a commercial basis under the Online Pornography (Commercial Basis) Regulations, the subject of today’s debate. Several noble Lords mentioned this. As drafted, they do not currently include social media or search engines and on these Benches, we regret that the Government have decided to carve out social media from the definition. This is a potentially significant loophole in the regime. It is important that it is monitored and addressed if it damages effectiveness. It is in particular a major concern that social media and search engines do not have any measures in place to ensure that children are protected from seeing pornographic images.
The Secretary of State’s guidance to the AV regulator asks the BBFC to report 12 to 18 months after the entry into force of the legislation, including commenting on the impact and effectiveness of the current framework and changes in technology which may require alternative or additional means of achieving the objectives of the legislation. In addition, under Section 29 of the Digital Economy Act, 12 to 18 months after the entry into force of the scheme, the Secretary of State must produce a report on the impact and effectiveness of the regulatory framework.
This is therefore a clear opportunity to look again at social media. The Government have made some reference to legislating on social media, but it is not clear whether they intend to re-examine whether the definition of commercial pornography needs to be broadened. Can the Minister assure the House that this will be dealt with in the internet safety White Paper, that the Secretary of State’s report will cover the level of co-operation by services such as social media and search engines, which are not obliged to take enforcement action on notification, and that, in doing so, it will firmly tackle the question of access by children to pornography via social media?
Next is the question of resources for the age-verification regulator. This is a completely new regime, and with fast-changing technology, it is vital that the BBFC, as the AV regulator, has the necessary financial resources and stable grant funding to meet the important child protection goals. Can the Minister assure us that the Government will keep resources available to the BBFC in its AV regulator role under review and undertake explicitly in the Secretary of State’s annual report to deal with the question of resources enabling the BBFC to carry out its work?
Next is the question of the BBFC having chosen to adopt a voluntary scheme. On these Benches, we welcome the voluntary scheme for age-verification providers referenced in annexe 5 to the draft Guidance on Age-verification Arrangements. In fact, it bears a striking resemblance to the scheme that we proposed when the Act was passing through Parliament, which would have engaged third-party companies providing identity services to protect individual privacy and data security. As I recall, the noble Earl, Lord Erroll, helped greatly in convening providers of digital identity schemes to show what was possible. I think he is still ahead of us today.
Our key objections were that what was originally proposed did not sufficiently protect personal privacy. The BBFC is to be congratulated on establishing the certification scheme. As I understand it, it already expects all the major providers to undertake the certification process. Furthermore, because the scheme is voluntary, these assessments will be for foreign-based as well as UK providers, which is a major achievement and could not be accomplished with a UK statutory scheme.
The key to the success of the voluntary scheme, however, is public awareness. I hope that the Minister can tell us what the DCMS is doing to support the promotion of the BBFC’s kitemark in the three months before the scheme comes into effect.
Next, there are the JCSI criticisms set out in its report on 28 November. This House rightly always takes the criticisms of the JCSI seriously, and the Minister set out a careful response to them. I do not always pray a government memorandum in aid, but the BBFC was following the Secretary of State’s guidance to the AV regulator. Under the terms of Section 27 of the Digital Economy Act, as a result of amendments in the Lords during its passage, the BBFC was charged with having regard to the Secretary of State’s guidance. The JCSI suggests that the BBFC could have chosen to ignore “incorrect” Secretary of State guidance, but that would have put it in an impossible position.
I shall not adumbrate all the different areas, but the inclusion of what was necessary in compliance with Section 27, the advice on best practice, the annexe setting out the voluntary scheme and the role of the ICO all seem to be helpful as part of the guidance and proportionate in terms of what the AV regulator prioritises.
There are a number of other aspects of these sets of guidance worthy of mention too. As we have heard, this age-verification framework is the first of its kind in the world, and there is international interest in it. Are the Government discussing with the BBFC what lessons there are in terms of encouraging robust AV for younger age groups and for other types of potentially harmful content? Will the Government use the expertise developed by the BBFC as the age-verification regulator in the internet safety White Paper?
Finally, although we have not had the benefit of the arguments of the noble Lord, Lord Stevenson, because of the way in which procedures today have operated, I come to the criticisms made by the noble Lord on the failure to bring Section 20 into effect. I agree in principle that it would be desirable to have the full set of powers envisaged by the Digital Economy Act, but it seems that the BBFC is very confident of its current enforcement powers, particularly in relation to ensuring that payment service providers cut off services under Section 21 for those who are non-compliant, as my noble friend Lady Benjamin mentioned, and of course, the last resort—the powers requiring ISPs to block access to material under Section 23 of the Digital Economy Act. Either of these will have major consequences for non-compliant providers of pornographic material.
I very much look forward to hearing the Minister’s response. Of course, this guidance and these regulations are not the be-all and end-all and not the total solution, but I very much hope that they will form part of the solution.
My Lords, this has been a very good debate, and I thank the Minister for his introduction, which allows us to range quite widely over the issues in play. I would observe—and I would not have it any other way—that over the last couple of years, the noble Lord, Lord Ashton, and I, and, indeed, one or two noble Lords who have spoken today, have spent a great deal of time together discussing and debating legislation and regulations which might apply to all pornography, and specifically in relation to protecting children. Some people bond over a coffee, football, the arts or shared hobbies; we do it with porn. In that sense I am with the right reverend Prelate who felt that he had to live it down in some way. I share his pain.
We have covered a lot of ground in this area and, although on the surface it is quite a narrow issue, getting the balance right between personal liberty and necessary regulation is never easy, and it is particularly hard to do given the technological changes that we are witnessing—in particular, the way in which information is now flowing through the internet.
I have been reading back through some of the debates we had on the Digital Economy Act, as have others, and through some of the original regulations that we have already looked at, which appointed the BBFC as the AV regulator. I want to make it clear that we do not want to hold up these statutory instruments—as noble Lords have already mentioned, they are already quite delayed. I have come to a provisional conclusion that what we have before us will not achieve what the Government intend, and may actually have unintended consequences and run the risk of stalling other, better alternatives, which I think we may have to consider in due course. Others have said this before, but it is worth repeating: these regulations are not future-proof; they are not comprehensive; they do not catch social media; they do not deal with overseas providers; they will not deal with non-photographic images and other more elaborate ways in which pornography is now being purveyed; and they do not bind together the companies involved to try to find a solution.
I will go through the regulations and make comments which are very similar to those that are already there and I will speak a bit to my own regret Motion. I will come back at the end of my remarks to where I think we need to go if we are going to take this issue further.
The general point on which I wish to start, before going on to the points raised by the scrutiny committees, is the argument I made before that a lot of the difficulty we have today with these regulations stems from the fact that we are trying to give statutory powers to a body that is essentially a private company. This is compounded—this comes up in the committee reports—by the fact that Parliament is not used to seeing regulations over which it has no direct authority, because they will be implemented through an arrangement between the department and a private body: the BBFC. In a sense, we are reading largely independent guidelines, fulfilling a mandate agreed within legislation but not subject to the specific scrutiny of this House, or indeed of the other place.
The BBFC is not a statutory body. It has no royal charter, so it cannot be assumed that it will act in the public good. It has a reasonable record, and it has statutory responsibility for videos and DVDs—but its work, for example in classifying films shown in the cinema, is done without any statutory authority. Will this issue be picked up in either the White Paper or the review which the Minister mentioned in his introduction?
My second point relates to the first in the sense that we have still not bottomed out the question of appeals that might arise as a result of the decisions being taken by the BBFC. We tried in the Digital Economy Bill to exert considerable pressure on the Government to get a separate regulator appointed as an appeals body. Indeed, we suggested that Ofcom would have been appropriate. Now we have a situation where the BBFC makes the preliminary determinations and is the body of first instance, but it is also the body for appeals. In principle, I do not think it is right that any body, statutory or otherwise, should be both judge and jury in its own cases. I look forward to hearing the Minister’s response. Can this be reviewed as part of the process?
Thirdly, we are skating round the question of what exactly is obscene material. Why do we have two existing definitions—one that is repeated in full in the documents before us but also one that derives from the definition of extreme pornography which is in another Bill? We had a good discussion about this during the DEA. The noble Baroness, Lady Howe, mentioned some of the ideas that were considered and turned down at that time, but it was also raised in the Data Protection Bill—so it will not go away. I think that in the review that is coming, it is really important that we nail what exactly we are trying to say. Either it has to be done in terms of perception or in terms of physical activities. I do not think that it can be both.
Turning to the instruments themselves, on the electronic communications one, which was referenced by Sub-Committee B of the Secondary Legislation Scrutiny Committee and the Joint Committee, the issue seemed to be, as has already been said, the rather odd definition of a “commercial basis”. We are looking for assurances from the Minister in relation to how that will apply, particularly in relation to children who come across internet sources which do not fall within the criteria specified. The second point, which has also been picked up, is the question of one-third of the overall content, which is a very odd way of measuring what I think is a sensible idea—that there should be some de minimis limit on what is considered a commercial provider of pornography. Even with the comments made by the department to the committee, the Government have not taken that trick. I look forward to the Minister’s comments in the hope that he will deal with some of the examples given by the Joint Committee, which seem to raise issues.
On the AV guidance contained within the statutory instrument on that matter, again there are suggestions from both committees. The first point is the rather nuanced one made by the Secondary Legislation Scrutiny Committee that, as the BBFC has not provided an exhaustive list of approved age-verification solutions, the Minister himself should explain more fully the types of arrangement which were deemed adequate. He may find that that is better done by correspondence.
The question raised by several speakers of why the Government have not brought forward the power under Section 19 to impose financial penalties is the focus of my regret Motion, and I shall deal with that now. Both Sub-Committee B and the Joint Committee found this a very strange decision, and others have mentioned it as well. I hope that the Minister will be able to respond in full. The argument is very straightforward. Since we have doubts about the whole process and the concerns that exist are about the lack of effective solutions to protect children, one would have thought that the only way in which we can make progress on this is to ensure that the regulator has the effective firepower to get compliance if required to do so. It is interesting that in the documentation, and in the other regulation before us, search engines are fingered. Providers of IT services and providers of advertising can be hit. It is clear from the parallel situation in the gambling world that the support of the payment providers has been absolutely crucial in stamping out illegal practices there. Why have the Government not taken these powers?
On the same issue, but approaching it from the other end, I had problems with the guidance about a non-compulsory, additional, voluntary, non-statutory assessment and certification of age-verification solutions package, which is shown in annexe 5 of the documents before us. I gather that it will be an external agency, probably one of the large auditing firms. I found this very difficult to understand, and would be grateful if the Minister could explain what exactly is going on here. How is it that the ICO, an independent statutory body, is down as having developed this solution in consultation with the BBFC? If that is the case, it seems that its independence has been compromised and I do not see how that can work. In any event, adding another non-mandatory voluntary system seems to be just another way of complicating an already difficult area, as well as raising considerable issues of privacy along the lines raised by the noble Lord, Lord Paddick. Is this a wise step to take at the very start of a new venture? The whole question in relation to making a success of this seems to be in doubt. Will the Minister comment?
Finally, during the debate we held on the first order in this clutch of statutory instruments, which confirmed the BBFC as the age-verification regulator, the Minister confirmed that it was not the BBFC’s job to determine whether what is being offered on its sites to adult users is lawful. Can the Minister confirm that, despite the slightly ambiguous wording in some places in the draft guidance, the role of the BBFC is, as stated in the regulations, limited to assessing that a person offering such services,
“has met with requirements of section 14(1) of the Act, to secure that pornographic material is not normally accessible by children and young people under 18”?
In conclusion, I ruminated earlier about whether this was the right approach, given the need to get a proper grip of the situation. Let us put in context the fact that, through the Data Protection Act, we have set up and now brought to fruition a data ethics and innovation commission, which will deal with issues of personal data, privacy and the way in which they interrelate. We have begun to see the new, age-appropriate design approach to the way in which internet service providers have to look after the rights of children who get on to their sites. We have discussed the precautionary principle in relation to internet services more generally.
Finally, I will pose a question to the Minister. We have in front of us top-down, traditional approaches to regulation: setting limits, engaging in the possibility of serious action if the limits are breached, and making sure that—as far as possible—we are able to contain a situation that we think is now unacceptable. However, the only way to make this work is if the companies themselves are involved, so a duty of care approach might be much more fruitful as a way forward. I would be grateful for the Minister’s comments on that.
My Lords, I thank noble Lords for their contributions and for the myriad questions which I will try to answer, in a slightly random order. It is important that we take a bit of time to discuss these; as many noble Lords have said, this is the start of something quite complicated. As I said at the beginning, we ought to bear in mind that we are trying to protect children. In the debates during the passage of the Digital Economy Bill, the Government always acknowledged that they would not have a complete solution, as many noble Lords said and as I mentioned during my opening remarks. We will take on board noble Lords’ comments. Indeed, we have shown—this is a partial answer to the question of why it has taken so long—that we have consulted quite widely; we have discussed the wording of the regulations themselves and the guidelines; and the Secretary of State’s guidelines to the BBFC, which the noble Lord, Lord Clement-Jones, mentioned, were available during the passage of the Digital Economy Bill.
We have tried to involve people, which is right given that we are at the beginning of something unique in the world. When we come to talk—I put a certain amount of emphasis on this—about social media and some of the areas that we do not cover in these regulations, we will look at those either in the review to come within 12 to 18 months or in the online harms White Paper. We are still discussing that White Paper and are still open to ideas about what it should include. I am pleased to say that the Secretary of State will make a meeting available to all Peers to discuss what they think should be in the White Paper. We will do that as soon as we can; I will let Peers know about it in due course.
I turn to the regret Motion tabled by the noble Lord, Lord Stevenson. The concern pertains to the fact that the regulations and BBFC guidance do not bring into force the provisions of the Digital Economy Act 2017, which would have given the regulator powers to impose a financial penalty on persons who have not complied. The noble Lord, Lord Stevenson, is not the only noble Lord to have mentioned that.
The regulator will have powers to issue enforcement notices and enforce these through civil proceedings such as proceedings for an injunction, and to give notice to payment service providers, ancillary service providers or direct internet service providers to block access to non-compliant material. It will have the flexibility to exercise these powers on a case-by-case basis, depending on what it thinks will be most effective. I say to the noble Earl, Lord Erroll, that, while there is no plan to block sites on day one, because a proportionate approach that gets people on side without needing to do so is preferable, the regulator will have the power to do that if it wants to. The Government and the BBFC believe that these powers will provide a sufficiently strong incentive to comply with the age-verification requirement. As we have said, there is a mandatory requirement for a review within 12 to 18 months. The Secretary of State already has the power, in the Act, to extend the powers of the BBFC if necessary.
I turn to some of the specific questions asked by noble Lords—I apologise for the slightly random order. The noble Lord, Lord Clement-Jones, asked whether the Government are discussing with the BBFC other possible forms of verification for younger groups. We will work with the industry to ensure its terms and conditions are upheld. We will also work with the tech sector to identify new approaches. The joint DCMS/Home Office White Paper will be published this winter and will set out a range of measures which could include that; however, as I said, this has not yet been fixed. We welcome noble Lords’ input.
The noble Lord, Lord Stevenson, has always had concerns about the BBFC and he mentioned those not only during the passage of the Act but when we designated the BBFC earlier this year. On appeals, it has considerable experience of administering an independent appeals procedure for the classification of film. We published the BBFC’s proposed appeals arrangements when the designation proposal was laid. It is important to note that the independent appeals panel will not include the regulator, the Government or affected industries. I believe there has not been a successful appeal from the film side for nearly 10 years, so we are content with the way things stand.
We are relying on the fact that the BBFC is a respected organisation with expertise in classifying content; it has done so for cinema releases since 1912 and for video content since 1984. It has a trusted reputation and is good at making difficult editorial judgments and giving consumers, particularly parents and children, clear information about age-appropriate content.
We feel that the existing financial enforcement powers will be sufficient, but of course we will be able to look at that in the review. The noble Earl, Lord Erroll, asked whether enforcement will take too long. A wide range of regulatory sanctions is available and there has already been engagement with ISPs. Based on that engagement, we are confident that the sanction will be effective. The wording of the Secretary of State’s guidance to the regulator stresses the need to take a proportionate approach, and that is what is intended.
Several noble Lords mentioned the timeline and asked why it has taken so long to put the regime in place. The number of questions raised in this debate and the potential critiques of where we could have improved the overall regime show why it has taken so long. We have tried to consult and to get as much consensus as possible. We have always said that this is not a perfect solution, but age verification will make a substantial difference and prevent many children accidentally stumbling across pornography. To that extent, we think that it is a good thing.
There is no legal deadline for bringing the requirements into force but we are now in the final stages of the process. If your Lordships agree to these age-verification arrangements, we will have reached the end. Following parliamentary approval, we will ensure that there is a sufficient period for the public and industry to prepare for age verification. I think we have said that there will be a minimum of three months, so we anticipate that enforcement will begin around Easter 2019, give or take some weeks.
The noble Lord, Lord Clement-Jones, mentioned resources, and we will obviously keep an eye on that. We understand that if a regulator is asked to do something but has inadequate resources, that is suboptimal. I thank the noble Lord for his remarks on the JCSI. We will, as he suggested, also take into account the experience that has been developed by the BBFC and will have regard to that for our White Paper if we think it appropriate.
The noble Lord, Lord Paddick, talked about privacy. It is of course crucial that users are able to verify their age in a way that protects their privacy. I do not know whether there was some misunderstanding on the part of the noble Earl, Lord Erroll, but the age-verification procedures have to meet the requirements of the Data Protection Act and the GDPR. That is a given, and the ICO will make sure that that is the case. We wanted an even better standard of protection. We will effectively have a gold standard to build trust, but every age-verification site will have to obey the GDPR, so in any event a strong privacy and personal data protection standard will be in force. The age-verification solutions that offer the most robust data protection, as set out in the gold standard, will be certified following an independent assessment and will be published on the BBFC website for everyone to see.
Many noble Lords talked about the definition of extreme pornographic material. This was debated extensively—I will not forget it in a hurry—during the passage of the Bill. It is not within the scope of this debate, which focuses entirely on the definition of commercial availability. However, because the primary legislation requires the Secretary of State to consult on the definitions before publishing a report on the impact and effectiveness of the regulatory framework, I think that is where we can continue that discussion. I assure noble Lords that we will revisit this issue. I suspect that I do not need to give that assurance and that it will be brought up anyway, but I assure the noble Baroness, Lady Benjamin, that we will be flexible and proactive.
I am sure that the noble Baroness, Lady Howe, was about to leap to her feet but, to save her doing so, I mention to the Minister that he did not answer the question which she posed, and which was picked up by the noble Baroness, Lady Benjamin, about whether he would find time for the excellent two-paragraph Bill which she has in process and which would solve many of these problems.
I had not forgotten that. It would obviously be difficult for me to commit to finding the necessary time but I will take that back to the department. I am not sure that it is currently within the plans of the Chief Whip to bring forward that legislation but I will ask. I understand the point that is being made but, as I said, the issue may well be covered within the review. I am afraid I cannot go any further than that tonight.
As for ancillary service providers, the BBFC and the DCMS have been engaging with several companies. They have already agreed to act, as doing so is in line with their current terms of service. Therefore, we are optimistic that the voluntary approach will work, and of course that will be reviewed.
The right reverend Prelate, the noble Earl, Lord Erroll, and others talked about the rationale for choosing one-third of content as the appropriate threshold. During the passage of the Bill, it was established that the focus should be on commercial pornography sites and not on social media. There were good reasons for that but I do not want to revisit them—that is what was decided. The one-third threshold was regarded as proportionate in introducing this new policy where sites make pornography available free of charge. However, websites that market themselves as pornographic will also be required to have age verification, even if less than a third of the content is pornographic.
A third is an arbitrary amount. It was discussed and consulted on, and we think that it is a good place to start on a proportionate basis. We will keep this matter under review and, as I said, it will be one of the obvious things to be taken into account during the 12 to 18-month review. The noble Lord, Lord Morrow, asked how it will be measured. It will be measured by assessing the number of pieces of content rather than the length of individual videos. It will include all pornographic images, videos and individual bits of content, but the point to remember is that the threshold is there so that a decision can be made on whether it is reasonable for the regulator to assume that pornographic content makes up more than one-third of the entire content. This will be done by sampling the various sites.
The noble Earl, Lord Erroll, asked about ISP blocking and suggested that everyone would try to game the system to get out of meeting the requirements. That is not what we believe. The BBFC has already engaged with ISPs and we are confident that this will be an effective sanction. The wording in the guidance indicates that the regulator should take a “proportionate approach”. However, we are grateful for the noble Earl’s help. I am sure that he will also help during the review and later in the process when it comes to online harms. I see that he wants to help now.
It is not the ISPs that I am worried about; it is the websites that will game the system on notification, appeals and so on. That is the bit that will take a long time.
We are confident it will work, but we will have to see when it comes to the review. It is an arbitrary figure that we came to by consensus. I will leave it at that.
The noble Lord, Lord Paddick, talked about education for children about sex and relationships. We are extending that by making relationships education compulsory in all primary schools. Relationships and sex education is compulsory in all secondary schools and health education compulsory in primary and secondary schools. We understand that it is important. Together with the protection of children we are introducing today, we will have to keep an eye on it. I notice that the DCMS committee in the other place is launching an inquiry into, among other things, the effects of social media on people’s attitudes, including those of children. In a sense, we are all learning as we go, because the technology is developing. It is something we are aware of and keeping an eye on, and we take the point.
As for the big issue of the evening, and why social media sites are not in the scope, that was a decision taken after a debate during the passage of the Digital Economy Bill. We did not want to prevent the benefits of social media sites. But I confirm to the noble Lord, Lord Stevenson, that we will consider that in the online harms White Paper. Noble Lords will be welcome to add their thoughts on that very soon—either just before or after Christmas.
As noble Lords have mentioned, there is a memorandum of understanding that clarifies the role of the ICO and what powers it will have instead of the BBFC. The BBFC will administer the voluntary certification scheme that will hold AV services to the highest standards of privacy protection and cybersecurity. We expect the vast majority of AV services to seek accreditation. Furthermore, the BBFC will inform the ICO of any non-certified age-verification solutions it finds, and the ICO will be able to take a look at them. Even if they do not want to apply for voluntary certification, the ICO will make sure they are subject to the full rigours of the GDPR.
I have covered most of the main points; I will look at Hansard and write to noble Lords if I have not covered any. I think it is evident from all the contributions from across the House that this is a complex and novel policy that requires sensitive handling. Having listened to all contributions and heard limited support for the regulations as they stand, albeit with some suggestions for improvement, I remain of the view that these regulations set out clearly what will fall within their scope. I think the guidance from the BBFC sets out clearly how it will assess the requirements of Section 14 and clarifies the BBFC’s approach to payment and ancillary service providers.
We are on the verge of doing something important that has the potential to make a real difference to the experience children have online and to make the internet a safer place for them, so I finish where I began. We are here to protect children, and for that reason I ask the noble Lord, Lord Stevenson, to withdraw his Motion, or indeed not to move it, and respectfully ask the House to approve the two guidances and the statutory instrument.