With the leave of the House, we will debate motions 1, 2 and 3 together. I say that on the assumption that there is no objection. There appears to be no objection, so we will have a single debate for up to 90 minutes. I shall call the Minister to move motion 1 and to speak to all three instruments. Motions 2 and 3 will be moved formally at the end of the debate.
I beg to move,
That the draft Online Pornography (Commercial Basis) Regulations 2018, which were laid before this House on 10 October, be approved.
With this we shall consider the following motions:
That the draft British Board of Film Classification Guidance on Ancillary Service Providers 2018, which was laid before this House on 25 October, be approved.
That the draft British Board of Film Classification Guidance on Age-verification Arrangements 2018, which was laid before this House on 25 October, be approved.
The Digital Economy Act 2017 introduced the requirement for commercial providers of online pornography to have robust age-verification controls in place to prevent children and young people under 18 from accessing pornographic material that is made available on a commercial basis.
Section 14(2) of the Act states:
“The Secretary of State may make regulations specifying…circumstances in which material is or is not to be regarded as made available on a commercial basis.”
So, in a sense, this is a small part of the legislative jigsaw needed to implement age verification. It is the last piece. I therefore ask that the draft Online Pornography (Commercial Basis) Regulations 2018, and the two pieces of guidance published by the British Board of Film Classification on age-verification arrangements and on ancillary service providers, be approved.
I should bring to the attention of the House the concerns of the Joint Committee on Statutory Instruments and the Secondary Legislation Scrutiny Committee, which I thank for their work. I will address their concerns in a moment but, before coming to the specific points related to this debate, I remind the House why we are introducing this requirement.
In the offline world, there are strict rules to deter children from accessing adult content. A large amount of pornography is available on the internet in the UK, often for free, with little or no protections to ensure that those accessing the content are old enough to do so.
This legislation is long overdue and I really welcome it. I ask the House to indulge me for a second. A parent came to my surgery soon after I was elected to discuss how their child had suffered a sexual assault in a school. During the conversation, the mother mentioned to me, at some length, the ready availability of online pornography at school, how this was a motivating factor—or at least a contributory factor—to the assault and the fact that it just damages our relationships with one another, particularly in young minds.
I thank my hon. Friend for that heartfelt intervention. I am very sorry indeed to hear about that case. There is great validity to the concerns the mother expressed to him.
In the Women and Equalities Committee, as a response to the awfulness of the #MeToo campaign, we took evidence that showed that the consumption of pornography is associated with higher levels of violence, including rape and sexual harassment. I therefore thank my hon. Friend for introducing age verification and for making sure that it works.
Public places are not age specific, and the Committee also suggested that viewing online pornography in public places, such as on buses and trains, should be restricted. Do the Government intend to go further by introducing a restriction on viewing online pornography in public places?
I must congratulate the Women and Equalities Committee on its extremely valuable work in this area. It exposed some very concerning issues and backed up its recommendations with evidence. Although the regulations do not touch on the viewing of pornography in public places, we have heard the recommendation of the Committee and what my hon. Friend has just said about that problem. That might be an indirect way of making such material accessible to the very children and young people whom the regulations are designed to protect.
I understand that, in 2015, Ofcom said that that was the pivotal time when we switched from people viewing the internet on desktop computers to more people viewing it on handheld devices such as smartphones. The potential for people to view such things inadvertently in public, which has been identified by my hon. Friend the Member for Chelmsford (Vicky Ford), has therefore increased dramatically, as has the potential for children to be exposed to it.
I follow my hon. Friend’s logic. That was the conclusion that Ofcom reached. It is definitely worth considering the recommendation that he and my hon. Friend the Member for Chelmsford (Vicky Ford) have made on reviewing the law on viewing pornography in a public place.
I am sure that there is consensus across the House about protecting young people under the age of 18 from illegal or inappropriate material. What steps are being taken to ensure that, in any of the regulations or any of the wider efforts the Government are taking, we do not accidentally prevent young people from accessing age-appropriate material about sex and relationships education? I am aware of material for young lesbian, gay, bisexual and transgender people on YouTube and other platforms that has been erroneously caught up in age filters and other restrictions. That prevents young people from finding out in a healthy and age-appropriate way about their sexuality and the key things they need to understand as they are growing up.
The hon. Gentleman makes some very good points. I am aware of some of the cases to which he refers. When I explain the detail of the regulations, it should reassure him that we are seeking to catch the commercial provision of pornography on sites where more than one third of the content is of an adult nature. I think that should allay his concerns. However, we should keep the issue he raises closely under review.
I very much welcome today’s debate and the Government’s proposals, but parents who are listening to this debate may go away with the impression that everything on the internet will be subject to an age barrier. Will the Minister be clearer, for the benefit of parents who are listening, that the regulations will not include social media? What is she doing to ensure that social media platforms do not inadvertently become the way that young people under the age of 18 access pornography in the future?
I welcome the intervention from the Chair of the Women and Equalities Committee. Let me clarify here and now that the regulations are a very important step forward in preventing children from viewing pornography online. In particular, we are closing the loophole whereby children can stumble across such material inadvertently. However, my right hon. Friend is right that the regulations do not extend to social media platforms that contain pornographic content that is a relatively small minority of the content that they provide. This is not a foolproof guarantee that young people and children will not be exposed to pornography online. It is a significant step forward, but there is, as my right hon. Friend points out, the potential for people to access material on social media platforms, which do not fall within the scope of the regulations unless more than a third of their average content is pornographic.
Will the Minister give way?
Will my hon. Friend give way?
I am sorry that I am taking a long time to answer my right hon. Friend’s point, but it is an important one. I will finish responding to it before I take any more interventions.
The Government are keeping a weather eye on the availability of pornography on social media platforms. I shall talk more about that, but I reassure my right hon. Friend that we will introduce further measures. My right hon. and learned Friend the Secretary of State for Digital, Culture, Media and Sport, who is in the Chamber, has a duty to report back on the impact of the regulations 12 to 18 months after their commencement and he will look at just the issues my right hon. Friend the Member for Basingstoke (Mrs Miller) has raised. I will make a little progress before taking further interventions.
There is no doubt, going back to the work of the Women and Equalities Committee, that the large amount of pornography available on the internet in the UK, often for free and with no protections to ensure that those accessing it are old enough to do so, is leading to a change in the way that young people understand healthy relationships, sex and consent. I know that that is a major issue of concern to everybody across the House. A 2016 report commissioned by the Children’s Commissioner and the National Society for the Prevention of Cruelty to Children made that absolutely clear. More than half of the children sampled had been exposed to online pornography by the age of 15, nearly half of the boys thought that the pornography they had seen was realistic, and just under half wished to emulate what they had seen.
The introduction of a requirement for age-verification controls is a necessary step in tackling those issues and it contributes towards our commitment to make the UK a safer place to be online, particularly for children and young people.
Does my hon. Friend agree that, when children have such experiences at a very young age, it can affect them for the whole of their life and have a big impact on their relationships as they grow up and get married? Indeed, it can affect not just their relationship with their partner, but their relationships with their children as well.
I very much agree that, if children see hardcore pornography when they are too young to understand it, it can have long-lasting and very negative impacts on their development and future relationships. My hon. Friend is absolutely right.
The draft Online Pornography (Commercial Basis) Regulations set out the basis on which pornographic material is to be regarded as
“made available on a commercial basis”.
The regulations cover material on websites and applications that charge for access. They also cover circumstances where a person makes available pornographic material on the internet for free, but then receives payment or reward for doing so, for example, through advertising revenue.
It was clear from the debates in this House during the passage of the Digital Economy Act that it was not Parliament’s intention that social media sites on which pornography is only a small part of the overall content should be required to have age verification.
My hon. Friend raises some important points. We do have rising expectations of social media platforms; we expect at the very least that they enforce their own terms and conditions. Some enforce to a greater extent than others, especially in terms of this particular issue. Facebook takes down posts that include nudity, which is its way of enforcing its own terms and conditions, but what about the private groups that operate on that platform? There is much more to be done. We expect social media platforms to uphold their terms and conditions across their platforms, not just in the public-facing parts of it.
My hon. Friend is making an excellent speech dealing with this critical issue that any of us who are responsible for young people will feel very strongly about. Does she agree that there is a rapid evolution in the technology sector? She talked specifically about pornographic sites that charge for access. Will she say a bit more about how she would deal with those sites that offer slightly different business models—for example, the premium model, where it is free to go on to the site and it then captures people’s details and makes them pay a subscription fee later? I am sure that she has considered that as part of her response, so I would be grateful if she could update us on it.
These regulations will apply even to pornographic sites that make their initial offer free of charge. The rule is that, if a site on which more than one third of the content is pornographic offers its service on a commercial basis, which can include providing it free of charge where it is backed by advertising revenue, it comes within the scope of these regulations, whether or not it charges for access. These draft regulations will capture such sites as are of concern to my hon. Friend.
I chaired the UK Council for Child Internet Safety for two and a half years. While I applaud the regulations that the Minister is bringing forward, this is scratching the surface. The problem is that these days very few young people pay to access hardcore pornography on the internet. Unless we have some form of verifiable, age-based permission such as the use of a credit card—even if that is not charged for—we are not going to prevent this from happening. Actually, the much bigger problem is on social media, with sexting and everything else that goes on. Social media companies, including ones that we have had in front of the Home Affairs Committee, are turning a blind eye to the hosting of exceedingly dangerous material that young people are accessing and normalising, and then they are transferring that to their relationships during impressionable years. We really have got to do so much better than this.
I congratulate my hon. Friend on the work that he has done while chairing that important body, the UK Council for Child Internet Safety. I have already made clear in my answer to my right hon. Friend the Member for Basingstoke, the Chair of the Select Committee, that we do recognise that these regulations are a first step. Although we have high expectations of what they can achieve, we are fully aware that they do not go far enough to address the vast majority of our concerns about social media platforms, where the majority of content is not pornographic.
I would like to reassure the House, however, that I do believe that these regulations will be more effective than my hon. Friend fears, because they will cover sites that make pornography available free of charge. As he rightly points out, the majority of young people access pornography without paying for the service. However, if they access it from a site that is predominantly pornographic and is offering a pornographic service on a commercial basis, then, whether it is free of charge or paid for, the regulations will capture both. I would like to reassure him that these regulations will bring into scope the sites about which he is concerned that currently provide these services free of charge.
My hon. Friend will also be reassured to know, when I go on to explain a little more about the actual process of age verification, that it is not simply a matter of being able to offer a credit card. The rigour of age verification provision will be stricter than that. That will also help to counter the growing trend of young people accessing pornography before they attain the age of 18.
Further to the comments by my colleague on the Home Affairs Committee, the hon. Member for East Worthing and Shoreham (Tim Loughton), we have raised a series of concerns with social media companies and other technology companies about access to inappropriate, violent or extreme content, as well as the content that we are discussing today. Will the Minister and the Government look much more closely at peer-to-peer sharing sites like Snapchat and closed messaging groups on Instagram, Kik and other messaging sites? It is my understanding, from speaking to a lot of young people in my constituency, that that is where a lot of this content is. No age verification goes on, and it is simply done in encrypted sharing. Some of it is self-generated content where people are doing revenge porn, sexting and sharing types of images that not only constitute committing an offence because they are creating child pornography, but are well outside the scope of what one would find on a commercial site. Will she reassure us that serious work will be undertaken to look at that area?
I can reassure the hon. Gentleman that serious work is being undertaken as we speak, as we prepare the online harms White Paper. We are looking at encryption within the context of that White Paper. He will appreciate the difficulties of privacy versus the public need to reduce the exposure of young people to pornographic material. We are looking at this very seriously. We will be bringing forward the White Paper in the new year and will welcome his input on that.
We have set a threshold of one third to ensure proportionality where material is made available free of charge. Thus there is an exemption for people making available pornographic content on a website where it makes up under one third of that content. This will ensure that websites that do not derive a significant proportion of their overall commercial benefit from pornography are not regarded in these regulations as commercial pornographic websites. Nevertheless, should a website or app be marketed as making available pornographic material, a person making such material available on that site will be considered to be making it available on a commercial basis even if it constitutes less than one third of the total. This is a proportionate way to introduce the new policy.
I am confident that these measures represent the most effective way to commence this important new policy, but our Department will of course keep it under review. Indeed, as I said, my right hon. Friend the Secretary of State will be reporting on the regulatory framework within 12 to 18 months of commencement of the regulations. In addition, as I just mentioned in response to the hon. Gentleman, the forthcoming online harms White Paper will provide us with another opportunity to review the wider context of this policy.
In conjunction, we have laid two pieces of British Board of Film Classification guidance—first, on age verification arrangements and, secondly, on ancillary service providers. The first piece of guidance sets out the criteria by which the BBFC will assess whether a person has met the requirements of section 14 of the Digital Economy Act 2017 to ensure that pornographic material is not normally accessible to those under 18. The criteria mandate four things: an effective control mechanism at the point of access to verify that a user is aged 18 or over; strict requirements on age verification data; a requirement to ensure that revisits to a site do not permit the bypassing of age verification controls; and the prevention of non-human operators—for example, bots—from exercising the age-verification regime.
Does the Minister believe that the BBFC has sufficient resources and skills to do what the regulations require of it?
I would like to reassure my hon. Friend that I certainly think it has the experience, expertise and resources to undertake this role. It has more than a century of experience in the control of film content. It has additional resources and moneys with which it can hold to account age-verification providers and, most importantly, the websites that are providing the pornographic content.
In addition to the criteria that the BBFC will use to verify the effective control of age-verification arrangements, it has provided typical examples of features that it would regard as non-compliant in the arena of age verification.
The second piece of guidance provides a non-exhaustive list of ancillary service providers that the BBFC will consider. That list is not exhaustive, to ensure that the policy remains flexible to future developments. The BBFC has published draft versions of both pieces of guidance and has run a public consultation for four weeks on their content. The draft guidance laid before the House takes account of comments received from affected companies, age-verification providers and other interested parties.
I have been clear that age verification is not a silver bullet, and we know that what we are doing is difficult. Indeed, we are the first country in the world to introduce such a measure. I am aware of the concerns expressed by the Joint Committee on Statutory Instruments about the drafting of the Online Pornography (Commercial Basis) Regulations 2018. I have considered its concerns carefully, and we are grateful for its work, but we do not believe that the variation in the legislation between the terms “met” and “applied” will be difficult for a court to interpret.
The Committee expressed concerns about the content threshold because it anticipates difficulty with the application and interpretation of the regulation. As I have said, the regulation will not apply in a case where it is reasonable for the age-verification regulator to assume that pornographic material makes up less than one third of the content of such a site. As stated in the BBFC guidance, the BBFC will seek to engage and work with a person or company who may be in contravention of the requirement in advance of commencing enforcement action.
I am aware that the Committee has also drawn the special attention of both Houses to these draft pieces of guidance because, in its view, they fail to contain the guidance required by section 25(1) of the 2017 Act and contain material that should not have been included. Section 3, paragraph 5, of the age-verification guidance sets out the criteria that the regulator will treat as complying with age verification. The guidance goes on in paragraph 6 to give examples of features that, in isolation, do not comply with the age-verification requirements. That approach ensures fairness and is product-neutral. Rather than recommending a particular solution, the guidance sets out principles that will encourage further innovation.
I wonder whether I could press the Minister on the robustness of age verification, which is of interest to the wider debate. It seems that certain types of checks, such as those that run off a credit card, are extremely robust, but younger people do not have access to credit cards, so that becomes more difficult, although we can layer up different types of information to give a best guess. Of the long list of checks that she has mentioned, which is favourable in terms of robustness and quality?
Age-verification providers will have to demonstrate that they have a foolproof system of identifying whether somebody is aged 18 or over. The sort of effective control mechanisms they are considering are credit cards, passports and driving licences—items that a lot of 18-year-olds will have at least one of. My hon. Friend rightly points out that a great deal of work is going on to improve age-verification systems. That is precisely because the sorts of items I have mentioned are, in general, only held by people who are aged 18 or over—with the exception of driving licences, which can be obtained at the age of 17.
For those reasons, it is much more difficult to ascertain how we can require age verification in other areas. For example, in the Data Protection Bill, we set the qualifying age at which someone can consent to a contract with a social media platform as 13, but it is very difficult for someone to prove that they are 13, because those items are normally held by people aged 18 or over.
Should I be concerned by reports that a company called AgeID, which operates the ID verification system for Pornhub and YouPorn, is considering the idea of “porn passes”, which could be bought from a newsagent and would allow people to access porn online anonymously, so that they do not have the embarrassment of their credit cards being recorded against such a site?
The Minister is being very generous in taking a great many interventions, and I appreciate that she is giving thorough answers to the questions she is being asked, but we only have 58 minutes left, and many Members want to take part in the debate. She might want to bear that in mind.
Thank you for your guidance, Madam Deputy Speaker, which I will take on board, but I will just deal with the point raised by my hon. Friend. The measures that will be acceptable to the BBFC will be of greater rigour than the examples he gave. I hope that I will be able to satisfy his concerns, but I may write to him, rather than dwell at length on the important issue he raises.
I now turn to the guidance on ancillary service providers. Paragraph 3 of section 3 provides a non-exhaustive list of classes of providers that the age-verification regulator may consider as within scope. However, to ensure that this policy remains flexible for the benefit of future developments, it is necessary that this is a non-exhaustive list. Where new classes of ancillary services appear in the future, the BBFC’s guidance explains the process by which these services will be informed.
The guidance includes additional material, as this is a new policy and the regulator considered that it was important for its stakeholders that the guidance set out the wider context in which the age-verification regulator will carry out regulation. This includes valuable guidance on matters such as the BBFC’s approach, powers and material on data protection.
We are aware of the Secondary Legislation Scrutiny Committee’s special interest report. The Committee raised similar concerns to the JCSI—for example, on the content threshold—and the responses to the SLSC’s concerns on these points are the same as the responses we have given to the JCSI reports. However, the SLSC also suggested that the House may want to ask what action the Government will take to tackle pornographic material available on a non-commercial basis. We have already debated these issues during my remarks.
I appreciate that pornography is of course made available by means not covered by the regulations. We have already covered those issues, but they were also the subject of extensive discussion during the passage of the Digital Economy Bill. In particular, concern has been expressed about social media platforms. As I have said in response to hon. Members’ interventions, we expect those platforms to enforce their own terms and conditions and to protect children from harmful content. Indeed, the Government have been clear that online platforms must do more to protect users from such harmful content.
How do the Government intend to ensure that these regulations can keep up with technological advancements and developments within these markets so that the legislation and regulations this place passes are not obsolete by the time they come into force?
My hon. Friend raises a very important point. The principal way in which we are future-proofing these regulations is by making the specificities that the BBFC operates by and the guidance sufficiently flexible and not too prescriptive. As technology advances, it will be able to adapt such regulations and guidance without the need for this House continually to bring in further legislation.
Before I conclude, I would add in response to my hon. Friend that, as I have said, this is not a silver bullet and it is only one of the measures we are taking. We are working on the online harms Bill to tackle issues and concerns in the area of the provision of pornography that are not captured by these regulations. I trust that my hon. Friend is reassured.
As I have said, I recognise that the age-verification regulations are not a panacea, but I am proud that we are leading the way internationally in the action we are taking to give far more protection to children and young people than is currently available.
Order. Before I call the Opposition spokesman, let me say it will be obvious that many people wish to speak. This debate runs until 8.36 pm and I see people with large wads of notes. It might be helpful for colleagues to know now that they should edit down their notes to some three or four minutes.
On a point of order, Madam Deputy Speaker. You can see by the number of people who want to speak and the amount of notes we have that this is something we are really keen for the Government to get right. May I therefore ask whether there is any opportunity to extend the debate, at least towards its allocated time?
That is a perfectly reasonable point of order, but not now. There was a point when Mr Speaker asked whether the House agreed to take the three matters we are discussing this evening together or separately. At that point, anyone could have objected and each would have been taken separately; thus there would have been a much longer debate, but I am afraid that that moment has passed. However, it is very good, just for once, to have a point of order that is a real point of order, and I thank the hon. Lady for it.
I will be as brief as I can, because I know that the whole House will want to hear from my hon. Friend the Member for Rotherham (Sarah Champion), given the level of expertise she brings to this debate.
The Minister will be pleased that I am able to start on a note of cross-party consensus; we do not have many of those at the moment. I think we can agree across this House that this is an important debate because it gives us the opportunity to say, when it comes to legislation in this territory, that we have rights to honour. We have rights to honour because we have duties to honour—duties to our children. As Baroness Kidron in another place has put it so well, “Children are children until they reach the age of maturity, not until they pick up a smartphone”.
If those duties bite on us, as legislators and indeed as parents, those duties should also bite on companies and indeed on social media companies. These measures go a little distance towards imposing some of those duties on commercial providers. They do not go far enough, and I will explain why there are shortcomings. However, they come so late and are needed so urgently that we will not oppose them or divide the House this evening.
These measures are a stopgap. I hope the Minister will at some point during these proceedings explain just how long this stopgap is expected to last. At the moment, we have the situation, as the Information Commissioner has put it, that the internet has become something of a “wild west”. As the Minister has been candid enough to admit in her really quite helpful explanatory remarks this evening, these regulations may touch on the problem, but they absolutely do not solve it. We need a very different regulatory approach to the online harms we are seeking to police.
In debating the shortcomings of these regulations, I hope we are able to help the Minister and the Secretary of State, who is good enough to be on the Front Bench tonight, to get two crucial reforms right. We asked for these reforms in the Data Protection Bill. They are the age-appropriate design code, which was promised under the Data Protection Bill, and the internet safety strategy, which I know the Secretary of State is hoping to bring forward as soon as he can get his civil servants back from no-deal planning and get them back on to the Department’s important business.
I hope the Minister is able to set out for us how long she expects this stopgap to last, and I want to flag up to her the 10 obvious deficiencies that leap out from the measures and the explanatory notes to them. I will rattle through them fairly quickly, in the interests of time.
The first problem is the very strange inclusion in the regulations of a de minimis level of content at which the regulator will deem it necessary to trigger a safety wall of age-verification software. It is really not clear why a third was chosen. I appreciate that the Minister has to start from somewhere, but there are obvious flaws in this plan, not least providers simply filling their sites with virtuous content in order to get around the regulations. It strikes me we can fully anticipate that even at this stage of the legislation.
As has been highlighted by a number of hon. Members, some of whom are not now in their place, these regulations do not bite on social media firms. This is lunacy. This is surely one of the most dangerous areas in which our children are exposed to these kinds of online harms, so bringing forward a set of measures without explicit reference to their non-applicability to social media firms seems to me to be a shortcoming. As the House will know, the reason why this is such a problem is that when we took the Data Protection Bill through this place, we exercised a derogation under European law that allowed us to deem that children were basically unfettered on social media platforms from the age of 13, not 16 as other European countries insist. Debating the right protections for our children on social media platforms is extremely important, and hon. Members are absolutely right to clock that the orders do not touch on that important arena.
The second problem is the odd definition of “commercial basis” that is used as the trigger for requiring age-verification systems. We have had a useful exchange about business models that entice users by offering free content—the money is made either by advertising or through premium content. The orders and the explanatory notes are not terribly clear about the sins that will be allowed through the net because of that odd definition.
The third problem, which was debated in the other place, is the challenge of what definition of “obscene material” to use. At least a couple of definitions are knocking around different bits of legislation and it is not clear that the orders are all-encompassing in the definitions used.
That brings us to the fourth issue, which was championed by Baroness Howe in the other place. The definitions that have been used create a couple of important new gaps. I am grateful for the briefing circulated by Christian Action Research and Education, which has set out the challenge in important ways. The Government have changed what the BBFC can ask internet service providers to block from so-called “prohibited material” to the much narrower definition of “extreme pornography”. In so doing, they exclude the power to ask ISPs to block non-photographic, animated child abuse images. Those are illegal to possess under section 62 of the Coroners and Justice Act 2009 but, at the moment, they are outwith the protections of the orders. If those images are located outside the UK, they are not within the remit of the Internet Watch Foundation. Given the number of such images that we know are available, that is a serious shortcoming in the orders. It is a great concern to the House that neither the Internet Watch Foundation nor the BBFC has the power to deal with those images.
That brings us to the fifth issue, which is just as significant. Because of the same use of definitions, it is not possible to prohibit violent pornography that is illegal under the Video Recordings Act 1984. I understand that Baroness Howe has a Bill in the other place to address these problems, and perhaps at some point we might learn whether the Minister is minded to support that legislation. I am not sure whether the Minister gets a chance to wind up under the rules of tonight’s debate, but she might want to intervene if a box note is forthcoming.
The sixth problem is that the orders give power to what is essentially a private company. When the orders were passed to give the BBFC the role we are debating this evening, the Opposition raised significant concerns about whether, despite its extensive experience, the measures constitute mission creep for the BBFC. The Opposition and other hon. Members have serious doubts about whether it is resourced enough to do the job. This is a new departure in its business, and it does not have a track record. It does not have a royal charter, and it cannot de facto be assumed to be operating in the common good. The basic challenge hon. Members have is this: who will watch the watchmen? How will we ensure that that private organisation, which is blessed by us with statutory powers and statutory regulatory oversight, executes the task we give it effectively? We cannot rely on its mission. I welcome the fact that the Minister says that the Secretary of State will come back to the House in 12 to 18 months with a progress report, but that is rather a long time in the future if the BBFC is found to be seriously failing in the execution of its duties at a much earlier stage.
The seventh problem is that there is not an exhaustive list of age-verification solutions. The Minister will say that the technology moves on and that we need to preserve a degree of flexibility to allow the legislation to keep up to date but, none the less, the lack of specificity worries me, as does the fact that the BBFC is not yet able to insist on minimum standards and solutions for age-verification systems. The eighth problem is that the guidance on what is appropriate in systems is vague.
When we take those eight objections together, we see that the orders are half-measures. The reality is that, this year, we have learned about and debated a great many different approaches to clamping down on the harms that may hurt our children online. A much better approach to the problem would be to use a tried and tested concept in health and safety legislation: the duty of care principle, which has been around in English law since at least the early 1970s. That approach would require companies and organisations to take specific steps to understand the potential harms they are causing to their consumers, and then to take appropriate steps to ameliorate those harms.
If I went to London tonight and built myself an arena and filled it with people, I would rightly be asked to observe all kinds of health and safety measures to ensure that the people were safe and sound. If I build an online arena, I am under no such obligations and can pretty much do what I want. If I ensure that the arena is a social media platform, I will not be hampered in any way by the orders.
The duty of care principle is a much better approach, but it needs a different kind of regulator. We currently have something like 13 different regulators overseeing different aspects of internet safety, internet regulation, content regulation and financial processing regulation online. That is far too many. That landscape is much too complicated, and those regulators do not have sufficient powers to implement the safeguards against online harm that we as legislators would like. I am not proposing that we reduce those 13 regulators to one this evening, but I am saying that 13 needs to come down to something closer to one. The House needs to ensure that that regulator has the right power to enforce proper duty of care regulation.
The Minister spoke at great length and I am grateful that she took a wide variety of interventions. The orders are important and necessary, and an advance on where we are today, but if we are to get the future right, hon. Members on both sides of the House need to be candid and honest, and work together in identifying the shortcomings of the current approach, which was conceived and constructed in legislation that is a couple of years old. We need to be honest and open about its shortcomings so that we can put in place a better solution when we have the White Paper and, I hope, when the Secretary of State brings the Bill to the House.
Order. As I indicated earlier, we will start with a time limit of four minutes.
I will curtail my comments to the utmost brevity.
I strongly welcome the regulations, but I have a number of reservations and questions for Ministers. I share the concerns that have been expressed that social media needs to be included in the remit. That is not the case currently, despite the fact that 500,000 pornographic images are posted daily on such platforms. I hope the Minister will reassure the House that she will consider the position of social media through the internet safety White Paper.
Secondly, I share the concerns of the JCSI about the Government’s approach to proportionality and the “one third” approach, which might lead to websites deliberately including additional material with the prime intention of falling outside the scope of the regulations. Will Ministers monitor that carefully and consider reviewing the “one third” principle if the concerns expressed by the Joint Committee materialise?
I want to express concerns about the impact of amendments made in another place to part 3 of the Digital Economy Act 2017, some of which were mentioned by the right hon. Member for Birmingham, Hodge Hill (Liam Byrne). When the Bill left the Commons, it gave the regulator power to block non-photographic child sex abuse images. As we have heard, those images can include incredibly lifelike, animated computer-generated images. Disturbingly, the other place voted to accommodate adult access to such material so long as it is placed behind age-verification checks, but the message that sends is alarming.
As we heard, some suggest that this material could be dealt with by the Internet Watch Foundation, but it can take action against such material only if it is hosted on websites based in the UK. As I have said before in this place, the majority of such material viewed in the UK comes from sites based in other jurisdictions. In 2017, 3,471 reports of alleged non-photographic images of child sexual abuse were made to the IWF, but none was hosted in the UK, so it was unable to act. The Digital Economy Bill, as it left this House, empowered the regulator to take the only credible enforcement action that can be taken against such sites when they are based in other jurisdictions: the regulator had the power to block them. That power has now been taken away, unless a site has no age-verification checks. I hope Ministers will look at that again.
I turn now, with no degree of relish, to the other area of concern, violent pornography, which was reconsidered by the other place. When the Bill left this place, it gave the regulator the power to block violent pornography that is illegal to supply to anyone of any age under the Video Recordings Act 1984. However, amendments introduced in the other place accommodate all but a tiny subset of violent pornography, so long as it is behind age-verification checks. The only illegal content that the regulator can take action against when behind age-verification checks is “extreme pornography” which has to be likely to result in severe injury to certain named body parts. That sends out completely the wrong message about the acceptability of sexual violence against women—it is unacceptable, full stop. I welcome the Women and Equalities Committee’s recent report on sexual harassment, which highlighted that point.
In conclusion, I support the suggestion that the Government have a quick way to address the two failings to which I have just referred: looking at Baroness Howe’s Digital Economy Act 2017 (Amendment) (Definition of Extreme Pornography) Bill and giving it time for consideration in this place.
I believe that all of us in the Chamber tonight want to find common ground and a common way forward. I thank the Minister for her very thorough exposition of the issues in her speech and the Opposition Front Bench spokesperson, the right hon. Member for Birmingham, Hodge Hill (Liam Byrne), for covering a lot of the issues in such detail that I will not need to go over them again. I will keep my remarks very short, so that other Members have the opportunity to participate in the debate.
Currently, it is too easy for our children to access explicit material online. Young people today are growing up in an age where information is readily available to them at the touch of a button. That can be a very good thing, of course, as a terrific aid to learning. However, it also means that children can be exposed to explicit materials either in error or because they are simply curious. We have a duty to ensure that all that can be done should be done to protect them.
Studies have shown that when children and young people are exposed to sexually explicit material, they are at a greater risk of developing unrealistic attitudes about sex and consent; more negative attitudes towards roles, identities and relationships; more casual attitudes towards sex and sexual relationships; and an increase in risky sexual behaviour. They also develop unrealistic expectations of body image and performance. Access to genuine educational material is important, but we must ensure that we take these measures to protect children and young people.
The Scottish National party supports measures that will protect children from exposure to pornographic material online. It is only right that there is a requirement that a person making available pornographic material online on a commercial basis to persons in the United Kingdom should ensure that such material is not normally accessible by persons under the age of 18. As I said in my opening remarks, it is currently too easy for children to access explicit adult content on their phones and computers. There is much work to do, especially in the area of social media, and many challenges listed tonight that are still to be addressed, but we support the measures, which as a start, aim to protect our children in a digital age.
I very much welcome this groundbreaking piece of legislation and thank the Minister for going through it so thoroughly today in her opening statement. I think the right hon. Member for Birmingham, Hodge Hill (Liam Byrne) is wrong when he says that this is a set of half-measures, but it is only a start—he is right in that respect. When we look at the scale of the problem we are dealing with, with almost two-thirds of young people seeing pornography online for the first time when they were not expecting it, the Government are right to start the long journey in trying to stop the unexpected exposure to what can be very damaging material.
The Women and Equalities Committee published a number of reports highlighting the damaging impact that exposure to pornography at an early age can have on young children—not only in the report, mentioned by my hon. Friend the Member for Congleton (Fiona Bruce), on sexual harassment in public places, but in the report on sexual harassment in school. The evidence is there and it is clear, but rather than going through those findings again I would like to focus particularly on the amendments made in the other place to what became the Digital Economy Act 2017. They have caused concern not only this evening but outside this place by setting extreme pornography as the threshold for non-compliance and by the images that appear to be allowed as a result of those changes.
There are serious concerns about part 3 of the Digital Economy Act 2017, which has been weakened by Lords amendments. The noble Baroness Howe has been working hard in the other place to try to offer a solution. I hope the Minister can comment on that if time allows this evening. The Lords amendments mean that non-photographic child sexual abuse images, which would be illegal for anybody to possess, could be accommodated behind age-verification checks. Whereas previously the regulator could block that illegal content, the Lords amendments mean that it can now do so only if the material is not behind age verification.
Secondly, the Lords amendments mean that a lot of violent pornography that is illegal to supply to anyone of any age under the Video Recordings Act 1984 will now be accommodated behind age verifications. That sends out all the wrong messages, so will my hon. Friend confirm that the Government will be not only keeping these issues under close review, but examining whether they could take forward the recommendations in Baroness Howe’s Bill, and that these issues can be addressed directly in the online harms White Paper, if not before?
We have the opportunity to return to these issues after 18 months, but I would not want to see what is a good start being hampered by changes in the other place that ComRes polling would suggest almost three-quarters of Members in this place simply would not agree to. Why can we not bring forward measures that would better reflect the will of this House, rather than that of unelected peers? The Front-Bench spokespeople often tell me that something that is illegal offline is illegal online as well. They are really close to breaking their own rule here, because things that are actually illegal offline appear, to the man on the Clapham omnibus, to be treated differently online. That is really regrettable.
The Minister was very generous in responding to my earlier comments on social media. I hope she keeps under review the need to put much pressure on social media companies to ensure that they also are within these sorts of parameters.
I will rattle through some points, because I would like them to be on the record for the Minister and the Secretary of State.
On the guidance on ancillary service providers, under section 15(1)(d) of the Digital Economy Act 2017 and annex one of the guidance, pornographic material is defined as a video work or material that has been issued an 18 certificate and that
“it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal”.
This is a neutral definition that fails to recognise that porn is almost always coercive, usually violent, aggressive and degrading, and is gendered. It is also almost always men doing it to women. Other countries are broad in their definition of pornography, to capture that aspect of it. In Spain, it is defined as “pornography, gender violence, mistreatment”, and in Poland as very strong and explicit violence, racist comments, bad language and erotic scenes. Does the Minister agree that our definition could be amended to acknowledge that pornography represents gendered violence, misogyny and abuse?
Am I right that the point my hon. Friend wants to register this evening is that there is much to learn from other countries?
That is absolutely right, and that becomes more apparent as we go forward. This legislation is very UK-based; pornography, of course, is international.
Minister, I am very concerned about the ability of the BBFC to compel ancillary service providers and payment-service providers to block access to non-compliant pornography services, as described under sections 21 and 23 of the Digital Economy Act. What power does the BBFC have to force companies to comply with its enforcement measures? What happens if credit card companies, banks or advertising agencies refuse to comply? I know of pornographic sites that accept supermarket points instead of cash to get around such legislation from other countries. What assessment has the Minister made of the likelihood of opportunistic websites being established to circumvent UK legislation and the child protection risks that follow? It is unclear how the BBFC will appraise sites and what review mechanisms it will put in place to judge whether the scheme is effective in practice.
Under part 1, paragraph 10 of the guidance:
“The BBFC will report annually to the Secretary of State”.
Will the Minister commit to an interim review after six months from the implementation date, so that we can see whether this is working? Under part 1, paragraph 11 of the guidance,
“the BBFC will…carry out research… into the effectiveness of the regime”
with a view to child protection “from time to time”. As that is the very purpose of the legislation, does the Minister agree that this should occur at least every two years? Under part 2, paragraph 7 of the guidance,
“the BBFC will…specify a prompt timeframe for compliance”.
However, there is no detail on what this timeframe is. It could be a week—it might be a year. Will the Minister please explain the timetable for enforcement?
The guidance also details the enforcement measures available to the BBFC in the case of a non-compliant provider. I broadly welcome those enforcement measures, but I am concerned about the ability of the BBFC to take action. Will the Minister tell us which body will be effectively enforcing these punishments? Will it be the Department for Digital, Culture, Media and Sport or the Home Office? Will the Minister put on the record the additional resources being committed both to the BBFC and whichever Government agent is meant to enforce the legislation?
Turning to the BBFC guidance on age-verification arrangements, I want to register my concerns about the standards laid out on what constitutes sufficient age verification from providers. Section 3, paragraph 5 mentions
“an effective control mechanism at the point of registration or access by the end user which verifies that the user is aged 18 or over at the point of registration or access”.
That is very vague and could in practice mean any number of methods, many of which are yet to be effectively put to the test and some of which may jeopardise the security of personal data. That raises concerns about the robustness of the whole scheme, so will the Minister detail how she plans to ensure that the qualifying criteria are not so lax as to be useless?
Part 4, paragraph 3a states that
“age-verification systems must be designed with data protection in mind—ensuring users’ privacy is protected by default”.
Has the Minister also made an assessment of the safeguarding implications for the personal data of children, some of whom may attempt to falsify their age to access pornographic imagery? Following the Ashley Madison data hack, that has concerning implications for adults and children alike. While age verification is certainly not a silver bullet, as an idea it does have a place in a regulatory child protection framework. However, we need to ensure that that framework is as robust as it can be. Guidelines for websites that host pornographic material must be clear, so that the policy can be rigorously applied and potential loopholes are closed.
I also want to say that this has to work across Government. At the moment, we are still waiting for the Department for Education to bring forward the guidance on relationship and sex education. Unless we prevent, we cannot—
I know that parliamentarians on both sides of the Chamber agree that we have a duty to provide the framework to protect those who are unable to protect themselves. That is why I welcome the legislative steps to implement the age-verification controls that we are talking about tonight. That is especially the case since I have read some of the evidence, although that also made me question whether we are going far enough—a question on all our lips tonight.
The survey carried out on behalf of the National Society for the Prevention of Cruelty to Children’s Childline service showed that one in five children aged between 11 and 17 said that they had seen pornographic images that had shocked or upset them. That is why this legislation is so important.
We have talked about technology and how fast-moving it is, and that can work both ways. It could perhaps help us to provide stronger controls if we grasp what may happen in the coming months and make sure that we use the technology to the fullest. However, we must also be aware that technology can advantage the online providers of the pornography that we are trying to prevent our young people from seeing. It is important that we keep up to date with what is happening in the technology world. As others have said, the measures should be the starting point, not the end point. I would really appreciate it if the Minister clarified what further steps were being taken to make sure that we moved forward with this over the coming months and years.
The key focus of what we are discussing is that children should be protected online in the same way as they are offline. We have other prohibitions for goods that are inappropriate for different ages, such as tobacco and alcohol—to me, this is an extension of those principles. It is right that we look at how to protect children and young people from inappropriate online images.
There has been a lot of talk tonight about social media and how the legislation does not cover that. Hopefully, some of the responsible providers of social media are watching and listening and, through the nudge effect, will be able to implement good practices based on the new regulations that we are introducing for online providers. We know that the nudge principle works in other areas, so perhaps we can keep an eye on that as well to make sure that we take every possible advantage from what we are discussing across all the different platforms.
It is also important that we do not forget about parental responsibility, because that is still a big way of stopping children seeing inappropriate images. When I was talking to a colleague earlier, she said that as parents she and her partner thought that they had done the right thing by putting the computer in the hallway so that they were walking past all the time, but such actions do not stop parents going out and leaving children at home as young adults. Whatever parents do, sometimes it is not enough, which is why I welcome these measures.
My right hon. Friend the Member for Basingstoke (Mrs Miller) mentioned that what we are doing might be misunderstood by parents, who may feel that they do not need to provide any parental guidance. We need to make sure that parents still understand that they have that responsibility and that the legislation and framework being put in place are not a panacea, but the start of a long pathway to making sure that we protect our children from pornography and that they develop healthy, strong relationships and are not affected by what they see as children.
I will keep my remarks brief not only because of the time available, but because many of the concerns that the Liberal Democrats share have been covered, particularly by the right hon. Member for Birmingham, Hodge Hill (Liam Byrne), who spoke from the Opposition Front Bench.
On behalf of the Liberal Democrats, I would like to oppose these flawed—as we see it—Government plans for age verification for access to online pornography. Ostensibly, as we have heard tonight, the main aim of the proposals is to stop children stumbling on to online pornography, but there is little supporting evidence that young people do stumble upon these sites. If they do, it is more likely that they are not the dedicated sites that would be covered by the legislation.
That is one flaw, but my main objection is that the Government’s proposals would mean tracking information from people using these sites by suggesting that a credit card or an address is given to check against the electoral register. An act that is private and, in most cases where the information is asked for, legal, would be recorded and could be tracked on the person’s computer. On top of that, there are concerns about the lack of privacy protections, that the information could be open to hacking and that, like any other bulk data, it could be sold on.
The legislation is also easy to circumvent. Indeed, US websites have already said that they will simply ignore it because, to their mind, it interferes with legal independence and the rights of the individual. However, my main concern is that this flawed legislation could lead to the targeting of sexual minorities who are over 18 and can visit the sites. Some members of the LGBT community may wish to keep their identity or their sexuality secret for several reasons, but the legislation would risk the possibility of their being hacked and that information being leaked. In a nutshell, those are the reasons why Liberal Democrat Members oppose the Government’s proposals.
It is a pleasure to be called to speak in this important debate, and to be part of a debate in which several thoughtful points have been made by hon. Members of all parties.
We should be proud of the legislation that we are introducing. As the Minister said, it is the first time that something like this has been attempted anywhere in the world. Although we could undoubtedly go further, this is a noble first step. I spent the first part of my career as a teacher. I worked for Barnardo’s and for the Office of the Children’s Commissioner and I followed this issue closely through those jobs.
I have been struck by the sheer pace of change and Governments’ and regulators’ failure to keep up with it. The speed with which smartphone technology has changed the environment of childhood is frightening. I have always been wary of the aggressive online libertarian wing, which claims that this new world cannot be regulated and that to introduce any form of restriction is to inhibit the beauty of the online liberal space. That is unacceptable. We have a duty, as adults, to step in and protect children from things that they might not choose to see and that we know they are not ready to see.
As a starting point, we must accept that self-declaration is no form of age verification. Creating a mechanism, however basic, which introduces a decent, verifiable scheme to prevent young people from seeing certain things, is worthwhile. There are many opportunities to extend the principle elsewhere. I know from friends who are teachers and from the parents of teenage children their concerns about social media and overexposure to it. Many of the social media giants tell us that their sites are suitable for those aged 13 and above, but do nothing to police that. As I said, self-declaration is no form of age verification. The adults have to step in and take control.
I was interested in the shadow Minister’s comments about duty of care. As we look to the future, beyond the legislation, when social media giants, internet companies and providers of any content say that only people of a certain age should use their site, yet do nothing meaningful to enforce that, there must ultimately be some form of sanction against them. Switching the onus on to those companies would be no bad thing. My hon. Friend the Member for Walsall North (Eddie Hughes)—who has morphed into a different hon. Member since I stood up; a no less honourable Member—mentioned mobile phones. Several colleagues and I have raised the important issue of whether schools should have further powers to police mobile phones in schools.
I am pleased to speak in support of the regulations and guidance, although I want to register some concerns. I thank Dan Boucher from Christian Action Research and Education—CARE—who gave us some information about the subject.
It is two years since the Digital Economy Bill left our House, yet the age-verification provisions have still not been implemented. Will the Minister assure us that there will be no further slippage in the timetable? I want to put on record my thanks to the Minister and her Parliamentary Private Secretary, who usually sits behind her, but not tonight—it is a different person—for their courtesy and good manners in helping us to look at the issues, and giving us an assurance, which I am holding on to, that the White Paper will make the necessary changes.
As things stand, neither part 3 of the Digital Economy Act 2017 nor the regulations engage with social media. That point was made in another place last month with real concern. It was pointed out that a staggering 500,000 pornographic images are posted on Twitter every day. I gently remind the Minister that the recent Women and Equalities Committee report on sexual harassment recommended:
“The definition of ‘commercial pornography services’ for the Government’s policy on age verification of pornography websites should be amended to include social media”.
We should be mindful of that recommendation. I hope that the Minister will reassure the House that she is considering the Act’s position on social media as part of her reflections on the White Paper. We need action. Parents are no less troubled by the prospect of their children seeing online pornography because it is on Twitter rather than a website, and neither should we be.
I also gently remind the Minister that the Conservative party manifesto said that
“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material.”
My only worry is about the caveat that states that, if pornography makes up only a third of the content on the site, it does not count. Again, I seek reassurance about that. I also ask the Minister how the Government can justify protecting children from pornographic material online only in certain circumstances, when their manifesto commitment contained no hint of any qualification or limitation on their age-verification checks strategy.
When the Digital Economy Bill left the Commons, the regulator was empowered to block all non-photographic child sex images, regardless of age verification. That made complete sense because section 62 of the Coroners and Justice Act 2009 makes it absolutely clear that it is a criminal offence for anyone of any age, including anyone over 18, to possess such material. However, in the other place, amendments were introduced to accommodate the wishes of adults. I am ever mindful that some 71% of MPs—parliamentarians in this House—want stronger protection, and I know that the Minister wants that, too. I suggest that this must urgently be rectified.
Similarly, when the Bill left the Commons, the regulator had the power to block, regardless of age verification, all violent pornography that it would be a criminal offence to supply to anyone of any age, including those aged 18 and above, under the Video Recordings Act 1984. That is no longer the case. The Digital Economy Act cannot send out the message that the normalisation of sexual violence against women is worth accommodating. Of course I understand that, under section 29, there will be scope for these issues to be reviewed between 12 and 18 months after implementation.
The Digital Economy Act 2017 (Amendment) (Definition of Extreme Pornography) Bill was tabled in the other place—by Baroness Howe—and, during last week’s debate on these regulations, a number of peers pressed the Government to give the Bill time. It is notable that, rather than saying no, the Minister, Lord Ashton, undertook to speak to the Chief Whip, and I very much hope that the Minister in this House will do the same. I congratulate her on tabling the motions, and I look forward to our working together to make this stronger, because that is what we all want.
With the leave of the House, Mr Deputy Speaker, I thank the right hon. Member for Birmingham, Hodge Hill (Liam Byrne) for his support for the regulations. I agree with him that—as I said in my opening remarks—they are a contribution to the greater security of children and young people online, but, as I think the whole House agrees, they are not a total solution.
The right hon. Gentleman asked what further steps we were taking, and asked about their timings. I reassure him that the Secretary of State will review the performance of the regulations within 12 to 18 months of their taking effect. As part of that review, and in response to the deep concern that has been expressed by many Members in all parts of the House this evening about the extreme nature of pornography, we will consider the fact that placing such material behind age verification should not be a licence for its production. The Secretary of State will also be empowered to reconsider the definitions of extreme pornography. I thank him for remaining in the Chamber throughout the debate. I am sure that he has noted the will of the House that we revisit those definitions, which do not appear to me to be fit for purpose.
A White Paper on online harms will be published early in the new year. The right hon. Gentleman raised the issue of the desirability of placing a duty of care on social media platforms, which are relevant to the debate and which have a far wider impact than the issue that we are debating tonight. I reassure him that we are considering a duty of care as part of the development of that White Paper. I look forward to his further contributions on how to make such a duty effective in this context.
My hon. Friend the Member for Congleton (Fiona Bruce) mentioned the risk that pornographic sites would flood themselves with non-pornographic material in order to evade the scope of the regulations. We have considered that. My Department and the British Board of Film Classification have held discussions with commercial providers of pornography sites, and we have encountered a great willingness on the part of those operators to fall in line with age-verification measures. Indeed, they are setting up arrangements to do so. We consider it unlikely that sites will go to the trouble of flooding themselves with non-pornographic content but, if we turn out to be over-optimistic on that front, my hon. Friend can be assured that that would weigh heavily with the Secretary of State when he reviews the operation of the regulations.
My right hon. Friend the Member for Basingstoke (Mrs Miller), the Chair of the Women and Equalities Committee, mentioned the Lords amendments that had facilitated the availability of extreme pornography involving violence, and even involving children, if generated via technology as opposed to human actors. That strikes me as a grotesque loophole. I agree with my right hon. Friend and others that Baroness Howe’s Bill, which seeks to render this activity illegal, is worthy of our consideration, and I commit the Government to considering it as a potential means of combating that sickening loophole.
The hon. Member for Rotherham (Sarah Champion), who has huge expertise in this area, mentioned the predominantly coercive, violent and gendered basis of the grotesque abuse of women in much of the content, and the effect that that could have on the minds of young people as they develop into adulthood. Let me reassure her, as I reassured my right hon. Friend the Member for Basingstoke, that we will definitely consider her comments. She made a number of worthwhile suggestions, and I will write to her, as time does not permit me to go through all of them in turn.
I am sure I was not alone in my surprise when the hon. Member for Edinburgh West (Christine Jardine) declared that the Liberal Democrats would oppose the regulations, on the broad basis that they do not go far enough. I think that the rest of the House agrees that they do not yet go far enough in tackling the problem before us. She must agree, however, that certain aspects of this are very difficult. Tackling pornographic content on a site like Twitter is very difficult, because to introduce a blanket ban on anyone under the age of 18 accessing a social media platform of that nature would have serious unintended consequences. We need to get these matters right and, rather than just opposing these measures, the Liberal Democrats would do well to contribute to the debate. I urge the hon. Lady’s party to reconsider its position, which, if unchanged, will bring it into disrepute.
I am grateful for the comments from the hon. Member for Strangford (Jim Shannon). He reminds me of my party’s manifesto commitment to end the access of children and young people to pornography sites. I agree with him that we need to go further, but I commend these regulations to the House as a very good start and I thank hon. Members for their support this evening.
One and a half hours having elapsed since the commencement of proceedings on the motion, the Speaker put the Question (Standing Order No. 16(1)).
Question agreed to.
That the draft Online Pornography (Commercial Basis) Regulations 2018, which were laid before this House on 10 October, be approved.
Draft British Board of Film Classification Guidance on Ancillary Service Providers 2018
That the draft British Board of Film Classification Guidance on Ancillary Service Providers 2018, which was laid before this House on 25 October, be approved.—(Margot James.)
Draft British Board of Film Classification Guidance on Age-Verification Arrangements 2018
That the draft British Board of Film Classification Guidance on Age-Verification Arrangements 2018, which was laid before this House on 25 October, be approved.—(Margot James.)
We come now to a series of potentially deferrable motions: motions 4, 5, 6, 7, 8 and 9. Not moved.