Statement
The following Statement was made in the House of Commons on Tuesday 15 December.
“Mr Speaker, we now conduct a huge proportion of our lives online. People in the UK spend an average of four hours and two minutes on the internet every day, and we know that for children it is even longer. That technology has improved our lives in countless ways but, as honourable Members on both sides of the House know, too many people are still exposed to the worst elements of the web: illegal content, racist and misogynistic abuse, and dangerous disinformation.
Those interactions may be virtual, but they are causing real harm. More than three quarters of UK adults express concerns about logging on, while a declining number of parents believe the benefits for their children of being online outweigh the risks. Trust in tech is falling. That is bad for the public and bad for the tech companies, so today the Government are taking decisive action to protect people online.
Through our full response to the online harms White Paper, we are proposing ground-breaking regulations that will make tech companies legally responsible for the online safety of their users. That world-leading regime will rebuild public trust and restore public confidence in the tech that has not only powered us through the pandemic, but will power us into the recovery.
I know that this legislation is highly anticipated on both sides of the House. I want to reassure honourable Members that, when drafting our proposals, I sought to strike a very important balance between shielding people, particularly children, from harm and ensuring a proportionate regime that preserves one of the cornerstones of our democracy—freedom of expression. I am confident that our response strikes that balance.
Under our proposals, online companies will face a new and binding duty of care to their users, overseen by Ofcom. If those platforms fail in that duty of care, they will face steep fines of up to £18 million or 10% of annual global turnover. A number of people, including Ian Russell, the father of Molly Russell, have expressed concerns about that point; I want to reassure him and Members of this House that the maximum fine will be the higher of those two numbers, and platforms will no longer be able to mark their own homework.
To hold major platforms to their responsibilities, I can also announce to the House that they will be required to publish annual transparency reports to track their progress, which could include the number of reports of harmful content received and the action taken as a result. This will be a robust regime, requiring those at the top to take responsibility. I can therefore confirm that we will legislate to introduce criminal sanctions for senior managers, with Parliament taking the final decision on whether to introduce them. Of course, we hope not to use those powers, and for tech companies to engineer the harm out of their platforms from the outset, but people should have no doubt that they remain an option and we will use them if we need to.
Together, those measures make this the toughest and most comprehensive online safety regime in the world. They will have a clear and immediate effect: a 13-year-old should no longer be able to access pornographic images on Twitter; YouTube will not be allowed to recommend videos promoting terrorist ideologies; and anti-Semitic hate crime will need to be removed without delay. Those are just a few examples, but the House will take a keen interest in the details of the legislation, so I shall lay out a few key areas of action.
Our first focus is on illegal content, including child sexual abuse, terrorism and posts that incite violence and hatred. Sadly, many Members present today have been the target of online abuse, some of which might have been illegal, such as threats of violence. Unfortunately, that is particularly true for female Members of the House. This is not a problem suffered only by people in the public eye; close to half of all adults in the United Kingdom say that they have been exposed to hateful content online in the past year.
Under the new laws, all companies in scope will need to take swift and effective action to remove criminal posts—if it is illegal offline, it is illegal online. Users will be better able to report this abhorrent content and can expect to receive more support from platforms. Crucially, the duty of care will apply even when communications are end-to-end encrypted. Encryption cannot serve as blanket protection for criminals. Given the severity of certain threats, Ofcom will also be given powers to require companies to use technology proactively to identify and remove illegal content involving child sexual abuse or terrorism—that is a power of last resort.
Of course, not all harmful content is illegal. Every day, people are exposed to posts, images and videos that do not break any laws, but still cause a significant amount of harm. We all know that cyberbullying can ruin a child’s life, but I want first to address one particularly horrific form of legal content. Sadly, too many Members present will be aware of cases in which children are drawn into watching videos that can encourage self-harm. Some find themselves bombarded with that content, sometimes ultimately ending in tragedy. It is unforgivable that that sort of content should be circulating unchecked on social media. Given the severity of its consequences, I believe that there is a strong case for making it illegal.
I can therefore announce that the Government have asked the Law Commission to examine how the criminal law will address the encouragement or assistance of self-harm. This is an incredibly sensitive area. We need to take careful steps to ensure that we do not inadvertently punish vulnerable people, but we need to act now to prevent future tragedies.
Many Members are particularly concerned about the effect online harm has on children. We have reserved our strongest and toughest protections for them. All companies will need to consider seriously the risks their platforms may pose to children and to take action. They will no longer be able to abdicate responsibility by claiming that children do not use their services when that is manifestly untrue—we all know examples of that—and we also expect them to prevent children from accessing services that pose the highest risk of harm, including online pornography. Cutting-edge age assurance or verification technologies will be a vital part of keeping children safe online.
At the same time, we are going further than any other country to tackle other categories of legal but harmful content accessed by adults. Major platforms will face additional obligations to enforce their own terms and conditions against things such as dangerous vaccine misinformation and cyberbullying. Where the platforms fall short, they will face the legal consequences.
I know that some honourable Members are worried that the regulations may impose undue burdens on smaller, low-risk companies, so I can reassure them that we have included exemptions for such companies. As a result, less than 3% of UK businesses will fall within the scope of the legislation.
In this House we have always ardently championed freedom of expression. Robust and free debate is what gives our democracy its historic strength. So let me be clear: the purpose of the proposed regime is not to stop adults accessing content with which they disagree. It is not our job to protect people against being offended. I will not allow this legislation to become a weapon against free debate. Therefore, we will not prevent adults from accessing or posting legal content. Companies will not be able arbitrarily to remove controversial viewpoints, and users will be able to seek redress if they feel that content has been removed unfairly.
Nor will I allow this legislation to stifle media freedoms or become a charter to impose our world view and suppress that of others. I can confirm that news publishers’ own content on their sites is not in scope, nor are the comments of users on that content. This legislation is targeted exactly where it needs to be and tightly focused on delivering our core manifesto pledge to empower adult users to stay safe online while ensuring that children are protected.
We have engaged extensively to get to this point and this process is by no means over. We want all parliamentarians to feed into this significant piece of work and will continue to listen to their concerns as we go through pre-legislative scrutiny and beyond. However, I am confident that today’s measures mark a significant step in the continual evolution of our approach to life online, and it is fitting that this should be a step that our country takes. The world wide web was, of course, invented by a Brit, and now the UK is setting a safety standard for the rest of the world to follow. I commend this Statement to the House.”
My Lords, we welcome moves to protect children and the vulnerable online. We have been calling on the Government to introduce legislation in this area for several years. Their recent record, particularly on age verification, has been—let us call it—patchy. The Statement says that the UK will lead the way with online harms legislation, and we agree that this is a once-in-a-generation chance to legislate for the kind of internet that we all want to see—one that allows access to information, entertainment and knowledge on an unparalleled scale but at the same time keeps children and vulnerable adult citizens safe, and allows people to control the kind of content that they and those for whom they are responsible see online. Social media platforms have failed for years to self-regulate and we must not miss the opportunity afforded by the forthcoming legislation.
We welcome the announcement that Ofcom will be the regulator in this area. The duties to be allocated to it play to its founding principles, which require it to have regard to users of the services that it regulates as both consumers and citizens. We endorse the duty of care approach to regulation, which, if properly legislated for, has the potential to transform the way in which companies relate to their users. The excellent work done on that approach by the Carnegie UK Trust—in particular, Professor Lorna Woods and William Perrin—should be recognised. We support the measures announced in the Statement that seek to protect and enhance freedom of expression. In general, in so far as we can judge the Government’s current legislative intentions, there appears to be a workable and effective scheme of regulations here—but they should get on with it.
As to our concerns, does the Minister agree that the essential principle in play is that what is illegal in the real world must be illegal in the virtual world? However, the corollary is that we need to be clear that our existing laws are fit for purpose and up to date. What plans do the Government have in this complex area? The test for regulatory or criminal action is to be “reasonably foreseeable harm” to individuals, and criminal acts. But what happens to concerns about systems? If we lose focus on social networks, harms to society arising from disinformation or other threats to the integrity of the electoral process, for example, may not be in scope. That simply does not make sense. Does she agree that limiting the regulator to cases where individual harm has to be proven seems unduly restrictive?
Only the largest and riskiest companies will fall into category 1. If they do, they will need to reduce the chance of harm to adults from content which, though not illegal, is harmful; that will presumably involve working with the regulator to reduce such harms as hate speech and self-harm. However, many of the most egregious examples of such activity have come from small companies. Why is size selected as a basis for this categorisation?
The financial and other penalties are welcome but there must be concerns about reach and scope, as many of the companies likely to be affected are based outwith the UK. Also, can the noble Baroness explain why the Government are not insisting on primary legislation to ensure that criminal liability will attach to senior executives for serious and repeated breaches of the law? Can she explain precisely what is meant by the move to the novel concept of “age assurance”? Age verification was the preferred option until recently. Has that now been dropped? Can we be assured that some means will be found to include fraud and financial scamming, possibly through joint action between regulators such as the FCA?
Finally, it is proposed that Ofcom will be empowered to accept “super-complaints”. That is welcome, but it recalls the department’s recent failure to review in time the need for a similar power in the Data Protection Act. Can the noble Baroness update me on progress on that situation and confirm that this legislation could be used to redress it?
My Lords, over three years have elapsed and three Secretaries of State have come and gone since the Green Paper, in the face of a rising tide of online harms, not least during the Covid period, as Ofcom has charted. On these Benches, therefore, we welcome the set of concrete proposals we finally have to tackle online harms through a duty of care. We welcome the proposal for pre-legislative scrutiny, but I hope that there is a clear and early timetable for this to take place.
As regards the ambit of the duty of care, children are of course the first priority in prevention of harm, but it is clear that social media companies have failed to tackle the spread of fake news and misinformation on their platforms. I hope that the eventual definition in the secondary legislation includes a wide range of harmful content such as deep fakes, Holocaust denial and anti-Semitism, and misinformation such as anti-vax and QAnon conspiracy theories.
I am heartened too by the Government’s plans to consider criminalising the encouragement of self-harm. I welcome the commitment to keeping a balance with freedom of expression, but surely the below-the-line exemption proposed should depend on the news publisher being Leveson-compliant in how it is regulated. I think I welcome the way that the major impact of the duty of care will fall on big-tech platforms with the greatest reach, but we on these Benches will want to kick the tyres hard on the definition, threshold and duties of category 2 to make sure that this does not become a licence to propagate serious misinformation by some smaller platforms and networks.
I welcome the confirmation that Ofcom will be the regulator, but the key to success in preventing online harms will be whether Ofcom has teeth. Platforms will need to demonstrate how they have reduced the “reasonably foreseeable” risk of harm occurring from the design of their services. In mitigating the risk of “legal but harmful content”, this comes down to the way in which platforms facilitate and even encourage the sharing of extreme or sensationalist content designed to cause harm. As many excellent bodies such as Reset, Avaaz and Carnegie UK have pointed out—as the noble Lord, Lord Stevenson, said, the latter is the begetter of the duty of care proposal—this means having the power of compulsory audit. Inspection of the algorithms that drive traffic on social media is crucial.
Will Ofcom be able to make a direction to amend a recommender algorithm, how a “like” function operates and how content is promoted? Will it be able to inspect the data on which the algorithm trains and operates? Will Ofcom be able to insist that platforms can establish the identity of a user and address the issue of fake accounts, or that paid content is labelled? Will it be able to require platforms to issue fact-checked corrections to scientifically inaccurate posts? Will Ofcom work hand in hand with the Internet Watch Foundation? International co-ordination will be vital.
Ofcom will also need to work closely with the CMA if the Government are to protect vulnerable victims of online scams, fraud, and fake and misleading online reviews, given that these are explicitly excluded from this legislation. Ofcom will need to work with the ASA to regulate harmful online advertising as well. It will also need to work with the Gambling Commission on the harms of online black-market gambling, as was highlighted yesterday by my noble friend Lord Foster.
How will this new duty of care mesh with compliance with the age-appropriate design code, regulated by the ICO? As the noble Lord, Lord Stevenson, has mentioned, the one major fudge in the response is on age verification. The proposals do not meet the objectives of the original Part 3 of the Digital Economy Act. We were promised action when the response arrived, but we have a much watered-down proposal. Pornography is increasingly available and accessible to young people on more sites than just those with user-generated content. How do the Government propose to tackle this ever more pressing problem? There are many other areas that we will want to examine in the pre-legislative process and when the Bill comes to this House.
As my honourable friend Jamie Stone pointed out in the Commons yesterday, a crucial component of minimising risk online is education. Schools need to educate children about how to use social media responsibly. What commitment do the Government have to online media education? When will the strategy appear and what resources will be devoted to it?
These are some of the as yet unanswered questions before the draft legislation arrives, but I hope that the Government will commit to a full debate early in the new year so that some of these issues can be unpacked at the same time as the pre-legislative scrutiny process starts.
I thank both noble Lords for welcoming this full response to the consultation. I am happy to echo them both in their thanks, in particular to Carnegie UK and the important work it has done. We hope very much that the Bill will bring us into an age of accountability for big tech.
In response to the point made by the noble Lord, Lord Stevenson, what is illegal in the real world should indeed be illegal in the digital world. This Bill, when it comes, will help us move towards that. He raised the question about the focus on individuals. Obviously, the level of harm—how many individuals are impacted—will be relevant to the sanctions that Ofcom can enforce. But he also raised a wider and very important point about trust in our institutions; clearly, social media and big tech platforms are institutions where the level of trust has been tremendously eroded in recent years. We want to restore that, so that what the big tech platforms say they will do is actually what happens in practice.
Both noble Lords asked about the category 1 companies, how those are defined and whether we will miss important actors as a result of that definition. Categorisation as a category 1 business will be based on size of audience but also on the functionality offered. For example, the ability to share content widely or to contact users anonymously, which are obviously higher-risk characteristics, could put a platform with a smaller audience into category 1. Ofcom will publish the thresholds for these factors, assess companies against those thresholds and then publish a list of them. To be clear, all companies working in this area with user-generated content have to tackle all illegal content, and they have to protect children in relation to legal but harmful content. We are building safety by design into our approach from the get-go.
The noble Lord, Lord Stevenson, asked about criminal liability; we are not shying away from it. Indeed, the powers to introduce criminal liability for directors are, as he knows, being included in the Bill and can be introduced via secondary legislation. We would just rather give the technology companies a chance to get their house in order. The significant fines that can be levied—up to 10% of the turnover of the parent company or £18 million, whichever is higher—are obviously, for the larger tech companies, very substantial sums of money. We think that those fines will help to focus their minds.
The noble Lord, Lord Clement-Jones, talked about legal but harmful content. This is a very important and delicate area. We need to protect freedom of expression; we cannot dictate that legal content should automatically be taken down. That is why we agree with him that a duty of care is the right way forward. He questioned whether this would be sufficient to protect children. Our aim, and our number one priority, throughout this is clearly the protection of children.
The noble Lord, Lord Clement-Jones, asked a number of questions about Ofcom. I might not have time to answer them all now, but we believe that the Bill will give Ofcom the tools it needs to understand how to address the harms that need addressing through transparency reports, and to take action if needed. Ofcom will have extensive powers in order to achieve this. He also mentioned international co-ordination. We are clearly very open to working with other countries and regulators and are keen to do so.
Both noble Lords questioned whether the shift from age verification to age assurance is in some way a step backwards. We really do not believe that this is the case. We think that when the Bill comes, its scope will be very broad. We expect companies to use age-assurance or age-verification technologies to prevent children accessing services that pose the highest risk of harm to them, such as online pornography. The legislation will not mandate the use of specific technological approaches because we want it to be future-proofed. The emphasis will be on the duty of care and the undiluted responsibility of the tech companies to provide sufficient protection to children. We are therefore tech neutral in our approach, but we expect the regulator to be extremely robust towards those sites that pose the highest risk of harm to children.
The noble Lord, Lord Clement-Jones, also asked about our media literacy strategy, which we are working on at the moment.
My Lords, we now come to the 20 minutes allocated to Back-Bench questions. I urge noble Lords who wish to participate to keep their questions short, so that we can get in as many of the 16 who have asked to participate as possible.
My Lords, in clamping down on anonymous abuse online, what can be done to ensure those who need anonymity, such as victims of domestic or sexual abuse, can still have the protection of seeking help anonymously?
I thank my noble friend for her question. We do not intend to ban anonymity online for the very group she talks about, or for whistleblowers and others, as this would interfere with their safety, privacy and freedom of expression. Our approach is to make sure that platforms tackle abuse online, including anonymous abuse. This is a very challenging area and we are aware that many people in public life, for example, suffer extensive anonymous abuse. It is an area that we will keep under review, but without sacrificing in any way the safety of those who need anonymity to be present online.
My Lords, the arrival of the Government’s response is most welcome, particularly its focus on young people. However, its focus on user-generated content and company size, and its large number of exceptions, move it away from the earlier and more flexible focus on assessing risk and preventing harm wherever it might be found. Concerningly, it leaves the system open to being gamed as companies redesign themselves to be out of scope rather than to prevent harm. How do the Government intend to tackle problems of explicit and violent content, which is widely reported on remote learning platforms, if edtech is out of scope? How do they intend to limit access to commercial porn sites that try to avoid regulation by not having user-generated content? Can she confirm that any company that introduces strangers who are adults to children via automated friend suggestions will be brought into scope, whatever the nature or size of the service?
My Lords, the noble Baroness raises important points. I stress that we really believe that we have broadened the scope of this legislation substantially from what was previously proposed. The new regime will capture the most-visited pornography sites and pornography on social media, so we think that the vast majority of sites on which children might be exposed to pornography will be within the scope of the legislation.
In relation to the noble Baroness’s specific points, I say that the situation with learning platforms has obviously changed dramatically this year, with Covid and the use and extent of remote learning. The principle that we were following was that there were already safeguarding and regulatory regimes in place within education, but we will obviously keep that dialogue open. On commercial pornography sites that do not host user-generated content, in most cases the user-generated and commercial content—if I can call it that—are closely intertwined, as the noble Baroness knows, so measures such as age verification or age assurance would be in place on those sites that would prevent underage access.
In relation to the noble Baroness’s final point, yes, those sites would be in the scope of the Bill, both because of the nature of the user interaction and because those services would need to assess the likelihood of children accessing them and therefore to have appropriate safeguards in place.
My Lords, I declare my interests, particularly as a board member of the Centre for Data Ethics and Innovation. I sincerely congratulate the Government and other agencies such as the Carnegie UK Trust for these proposals, the way in which they have been developed and their substance. They have a very simple ethical code at their heart: if something is illegal or harmful offline, it should be illegal and considered harmful online. The protection of children is paramount; refinements will be needed, but the main direction is right. The proposals break new ground. I only hope that there will be a due sense of urgency as they are taken forward. I understand the need to focus the legislation, but given the decision to rule fraud and certain other areas out of scope—which will no doubt continue to be debated—when will we see an overall digital strategy so that we can see this Bill as part of a whole?
Part of the reason for defining the scope in a way that excludes, for example, fraud is that it is not typically user-generated content; it also reflects the point that the right reverend Prelate makes about speed of implementation, which is obviously paramount. The Government have recently announced a new national data strategy, which I am happy to share with him if he has not already seen it.
I also congratulate the Government on bringing forward this White Paper. It is time that those who generate such depravity and abuse of children are challenged. It is an issue in which I have a particular interest because during the 2017 general election campaign, when I fought to retain my parliamentary seat, together with my family I was subjected to a torrent of abuse online from anonymous contributors. Try as I might, I was unable to obtain the assistance of the leading social media companies to take action, so I have a simple question. In the response to the White Paper, the Government talk of
“setting codes of practice, establishing a transparency, trust and accountability framework and requiring … companies to have effective and accessible mechanisms for users to report concerns”.
If this process is to be effectively policed, what additional resources will be provided to the regulator to enable an effective investigative and prosecuting regime to enforce against not just the social media companies but also the perpetrators? What oversight will there be to ensure that companies are not marking their own homework?
We are absolutely committed to the era of “marking their own homework” being over. We will obviously make sure that Ofcom, in particular, is sufficiently resourced in terms of capacity for the incredibly important task that it faces. Where Ofcom needs specific expertise—for example, a skilled person’s report—we are committed to that being made available.
My Lords, today I was contacted by a very concerned mother, who asked me two questions which I would like to put to the Minister. First, why have the Government decided to seek to protect children only from user-generated pornography when, back in 2015, they committed to stop children’s exposure to harmful sexual content online by requiring age verification to access all sites containing pornographic material? Secondly, how will the Government protect children from user-generated pornography through fines on sites based abroad, when they are not subject to our law enforcement? I plead with the Government, in the interim, to implement Part 3 of the Digital Economy Act. This would protect children from pornographic sites based outside the UK, through its blocking provision, until the proposed watered-down version of age assurance becomes law, which could be in two to five years.
I understand the concerns raised by the noble Baroness and by the mother to whom she has spoken. There are not many parents in the land who have not had some of her concerns. We are focusing on user-generated content because we believe that will capture the vast majority of pornographic and inappropriate behaviour that children witness. However, as I said in response to an earlier question, we will keep it under review. Our ambition is to keep children safe. Ofcom has business disruption and ISP blocking within its powers, which would prevent children in this country seeing international content.
My Lords, I too welcome the Government’s White Paper. However, I have two concerns. The first is that the proposed regime of duty of care puts the onus of responsibility for dealing with legal but harmful content on the platforms alone. There seem to be few sanctions on the individual user who creates the harmful content. Surely the new legislation should contain a requirement for the platform’s terms and conditions to contain a regime for it to suspend serial creators of harmful content. My other concern is that, once the platforms have deleted material, it disappears for ever. As a result, information from posts which are found to be criminal, once deleted, is subsequently unavailable for investigators and police who need to access crucial evidence to prosecute crimes. Will the Minister ensure that the legislation includes a requirement for a safe, secure, digital area to be created by platforms where illegal, deleted material can be stored for future legal use?
In response to the noble Viscount’s second point, I will definitely take back to the department his suggestion about the retention of illegal content. He made a valid point about the duty of care, but companies will need to set out in their terms and conditions what the categories of content are and what acceptable behaviour is on their site. The regulator will expect them to take action against just the sort of people to whom the noble Viscount refers.
My Lords, it was very welcome to see the Government’s response published yesterday, and I offer my congratulations to my noble friend and her fellow Ministers for doing so when so much else is going on. The misinformation about the Covid vaccine demonstrates just why these proposals need to be put into law as soon as possible. How soon will the Bill be ready to be published? Will we see it early in the new year? Will the draft secondary legislation be published alongside the draft Bill, and how long will both Houses and the public have for pre-legislative scrutiny?
The legislation will be ready next year. We will make final decisions on legislative timings nearer the time, but I think that my noble friend will have heard that the Secretary of State is minded to carry out pre-legislative scrutiny. I appreciate that some time has been taken on this. As my noble friend knows, we have taken a deliberately consultative approach on the Bill but are now working at pace to implement it.
[Inaudible.]
My Lords, the noble Lord needs to unmute himself. I am afraid that we still cannot hear him, so perhaps we should move on to my noble friend Lord Vaizey and see whether we can return to the noble Lord, Lord McNally, later.
My Lords, those are big shoes to fill. I begin by congratulating not only the Minister but her incredibly hard-working officials who have produced this exemplary template for online regulation. I make these points only for emphasis, as so many brilliant questions have already been asked. As we seize long-overdue control of our fish, can we at least reach out to our former European partners, who have just published the Digital Services Act, to ensure that we do some joined-up thinking on online regulation in the UK, Brussels, Ireland and, I gather, Canada? Can we also, as the noble Lord, Lord Clement-Jones, pointed out, do joined-up thinking domestically between Ofcom, the ICO, the CMA, the age-appropriate design code and any other acronym that I can quickly think of?
My noble friend makes important points. Of course, we are co-operating with all the different three-letter acronyms that he mentioned and maybe many more—who knows? In all seriousness, there is also a balance to be struck in the delivery of this important legislation.
My Lords, this is a welcome move, if achingly slow. I have just a couple of questions. First, in Annexe A, companies are expected to assess themselves on whether their service is likely to be accessed by children. What level of confidence does the Minister have that companies will acknowledge that their services are accessed by children? For instance, WhatsApp has changed its age limit twice since 2018. Is she confident that such companies will be honest about the number of children under the ages of 16 or 13 using their services? Does she accept that the decision to exempt online news organisations leaves open a back door to online harm? Under these proposals, the Daily Mail is still able to share the video of the Christchurch mosque attack, which Google and Facebook are not. Will she take a look at that issue?
I am aware that if my noble friend Lord McNally were asking a question right now, he would suggest that the pre-legislative scrutiny should be done by a Joint Committee. My plea on that—I declare an interest as a member of one of the relevant committees that will scrutinise this—is that speed is of the essence. Unless we are able to scrutinise swiftly, we leave many vulnerable to the internet. This has been too long in the making.
On the noble Baroness’s first point, I understand why she asks about it and we have given the matter careful consideration. Platforms will need to prove that children are not accessing their content, by sharing any existing age-verification or age-assurance information and by reviewing the data on their users. They will need to evidence that in a robust way to satisfy Ofcom. I shall take back the point regarding the Christchurch video. I know that my right honourable friend the Secretary of State talked about how he valued the expertise of both Houses, so I hope that is a warm note regarding scrutiny.
Because of its focus on user-generated content, it is quite clear that the online harms Bill greatly weakens the protection afforded to children in relation to accessing pornographic websites. This House determined that this should be provided through Part 3 of the Digital Economy Act, as the noble Baroness, Lady Benjamin, mentioned. Who has pressed the Government not to implement Part 3? What should I tell a concerned father who contacted me this morning, saying, “The Government promised to protect children from pornographic websites, not just user-generated content on pornographic websites”?
I understand my noble friend’s concern but, as I said to the noble Baroness, Lady Benjamin, the vast majority of pornographic content that children come across is on social media rather than online pornography sites, and those online sites are often intertwined with user-generated content. So we are confident that the vast majority of content will not be accessible to children.
I call the noble Lord, Lord McNally, again.
My Lords, not guilty, but happy to get in. Earlier this year, the noble Lord, Lord Puttnam, chaired a committee of this House which produced the report Digital Technology and the Resurrection of Trust, about the damage caused to our political and democratic system by online harm. The Government are choosing to ignore this. Does that not leave a massive stable door open in the legislation? Will she assure me that the noble Lord, Lord Puttnam, will be able to give evidence to pre-legislative scrutiny to make the case for action in this area?
The issue of political and democratic trust is obviously incredibly important. As I mentioned, trust has been severely eroded by social media companies and other platforms. By restoring that trust and managing the content that could be physically or psychologically harmful, we will help to narrow that gap.
Do the Government really believe that this House, which passed Part 3 of the Digital Economy Act 2017 to give effect to the Conservative manifesto commitment of 2015, would accept the much weaker proposal set out by the Government yesterday for protecting children from accessing pornographic websites? The Government seem to think that, because they now propose to do things to address other online harms, including access to pornography on Twitter, we would somehow be prepared to overlook the fact that they propose putting children in a more vulnerable position with respect to their protection from pornographic websites. I urge the Government to adjust their course and ensure that the protections in their online harms Bill are just as robust as those in Part 3 of the Digital Economy Act, and to implement Part 3 in the interim so that children can be protected while we wait for the online harms Act.
I shall avoid repeating what I have said already on this issue. The focus in the Bill will put the responsibility on the platforms to have strong safety measures to protect children from accessing pornographic and other inappropriate content. If they do not do that, parents and children can report them and Ofcom will take enforcement action.